00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v23.11" build number 1063
00:00:00.001 originally caused by:
00:00:00.001  Started by upstream project "nightly-trigger" build number 3730
00:00:00.001  originally caused by:
00:00:00.001   Started by timer
00:00:00.022 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.023 The recommended git tool is: git
00:00:00.024 using credential 00000000-0000-0000-0000-000000000002
00:00:00.026 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.038 Fetching changes from the remote Git repository
00:00:00.040 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.054 Using shallow fetch with depth 1
00:00:00.054 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.054 > git --version # timeout=10
00:00:00.071 > git --version # 'git version 2.39.2'
00:00:00.071 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.102 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.102 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.184 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.194 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.203 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:02.203 > git config core.sparsecheckout # timeout=10
00:00:02.214 > git read-tree -mu HEAD # timeout=10
00:00:02.229 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:02.244 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:02.244 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:02.436 [Pipeline] Start of Pipeline
00:00:02.449 [Pipeline] library
00:00:02.451 Loading library shm_lib@master
00:00:02.452 Library shm_lib@master is cached. Copying from home.
00:00:02.471 [Pipeline] node
00:00:02.489 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:02.491 [Pipeline] {
00:00:02.502 [Pipeline] catchError
00:00:02.504 [Pipeline] {
00:00:02.518 [Pipeline] wrap
00:00:02.527 [Pipeline] {
00:00:02.535 [Pipeline] stage
00:00:02.536 [Pipeline] { (Prologue)
00:00:02.744 [Pipeline] sh
00:00:03.029 + logger -p user.info -t JENKINS-CI
00:00:03.046 [Pipeline] echo
00:00:03.048 Node: WFP20
00:00:03.054 [Pipeline] sh
00:00:03.398 [Pipeline] setCustomBuildProperty
00:00:03.408 [Pipeline] echo
00:00:03.409 Cleanup processes
00:00:03.412 [Pipeline] sh
00:00:03.751 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.751 520561 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.762 [Pipeline] sh
00:00:04.043 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.043 ++ grep -v 'sudo pgrep'
00:00:04.043 ++ awk '{print $1}'
00:00:04.043 + sudo kill -9
00:00:04.043 + true
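The "Cleanup processes" step traced above chains pgrep, grep and awk to collect PIDs of stale test processes and force-kills them; here no match remained, so kill received no arguments and "+ true" kept the step green. A minimal standalone sketch of that pattern follows; the helper name cleanup_stale is hypothetical, the target path is the one from the log.

  #!/usr/bin/env bash
  # Hypothetical sketch of the cleanup pattern traced above.
  # pgrep -af prints "PID full-command-line" for every process whose
  # command line matches the target path; the pgrep invocation itself
  # is filtered out, awk extracts the PIDs, kill -9 removes them.
  cleanup_stale() {
    local target="$1" pids
    pids=$(sudo pgrep -af "$target" | grep -v 'sudo pgrep' | awk '{print $1}')
    # "|| true" mirrors the "+ true" in the log: an empty PID list
    # must not fail the pipeline step.
    [ -n "$pids" ] && sudo kill -9 $pids || true
  }
  cleanup_stale /var/jenkins/workspace/short-fuzz-phy-autotest/spdk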
00:00:04.055 [Pipeline] cleanWs
00:00:04.063 [WS-CLEANUP] Deleting project workspace...
00:00:04.063 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.069 [WS-CLEANUP] done
00:00:04.073 [Pipeline] setCustomBuildProperty
00:00:04.082 [Pipeline] sh
00:00:04.362 + sudo git config --global --replace-all safe.directory '*'
00:00:04.456 [Pipeline] httpRequest
00:00:05.033 [Pipeline] echo
00:00:05.034 Sorcerer 10.211.164.20 is alive
00:00:05.042 [Pipeline] retry
00:00:05.044 [Pipeline] {
00:00:05.053 [Pipeline] httpRequest
00:00:05.057 HttpMethod: GET
00:00:05.057 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.058 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.061 Response Code: HTTP/1.1 200 OK
00:00:05.061 Success: Status code 200 is in the accepted range: 200,404
00:00:05.061 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.599 [Pipeline] }
00:00:05.617 [Pipeline] // retry
00:00:05.624 [Pipeline] sh
00:00:05.909 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.922 [Pipeline] httpRequest
00:00:06.698 [Pipeline] echo
00:00:06.699 Sorcerer 10.211.164.20 is alive
00:00:06.703 [Pipeline] retry
00:00:06.704 [Pipeline] {
00:00:06.711 [Pipeline] httpRequest
00:00:06.714 HttpMethod: GET
00:00:06.714 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:06.715 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:06.724 Response Code: HTTP/1.1 200 OK
00:00:06.725 Success: Status code 200 is in the accepted range: 200,404
00:00:06.725 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:01:33.800 [Pipeline] }
00:01:33.817 [Pipeline] // retry
00:01:33.824 [Pipeline] sh
00:01:34.114 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:01:36.679 [Pipeline] sh
00:01:36.966 + git -C spdk log --oneline -n5
00:01:36.966 c13c99a5e test: Various fixes for Fedora40
00:01:36.966 726a04d70 test/nvmf: adjust timeout for bigger nvmes
00:01:36.966 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11
00:01:36.966 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched
00:01:36.966 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges
00:01:36.983 [Pipeline] withCredentials
00:01:36.994 > git --version # timeout=10
00:01:37.006 > git --version # 'git version 2.39.2'
00:01:37.024 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:01:37.026 [Pipeline] {
00:01:37.035 [Pipeline] retry
00:01:37.037 [Pipeline] {
00:01:37.051 [Pipeline] sh
00:01:37.335 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11
00:01:37.918 [Pipeline] }
00:01:37.936 [Pipeline] // retry
00:01:37.940 [Pipeline] }
00:01:37.957 [Pipeline] // withCredentials
00:01:37.966 [Pipeline] httpRequest
00:01:38.283 [Pipeline] echo
00:01:38.284 Sorcerer 10.211.164.20 is alive
00:01:38.294 [Pipeline] retry
00:01:38.296 [Pipeline] {
00:01:38.309 [Pipeline] httpRequest
00:01:38.314 HttpMethod: GET
00:01:38.314 URL: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:38.315 Sending request to url: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:38.318 Response Code: HTTP/1.1 200 OK
00:01:38.318 Success: Status code 200 is in the accepted range: 200,404
00:01:38.319 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:40.455 [Pipeline] }
00:01:40.472 [Pipeline] // retry
00:01:40.480 [Pipeline] sh
00:01:40.767 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:42.161 [Pipeline] sh
00:01:42.449 + git -C dpdk log --oneline -n5
00:01:42.449 eeb0605f11 version: 23.11.0
00:01:42.449 238778122a doc: update release notes for 23.11
00:01:42.449 46aa6b3cfc doc: fix description of RSS features
00:01:42.449 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:42.449 7e421ae345 devtools: support skipping forbid rule check
00:01:42.460 [Pipeline] }
00:01:42.474 [Pipeline] // stage
00:01:42.483 [Pipeline] stage
00:01:42.485 [Pipeline] { (Prepare)
00:01:42.505 [Pipeline] writeFile
00:01:42.520 [Pipeline] sh
00:01:42.807 + logger -p user.info -t JENKINS-CI
00:01:42.820 [Pipeline] sh
00:01:43.106 + logger -p user.info -t JENKINS-CI
00:01:43.119 [Pipeline] sh
00:01:43.406 + cat autorun-spdk.conf
00:01:43.406 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:43.406 SPDK_RUN_UBSAN=1
00:01:43.406 SPDK_TEST_FUZZER=1
00:01:43.406 SPDK_TEST_FUZZER_SHORT=1
00:01:43.406 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:43.406 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:43.414 RUN_NIGHTLY=1
00:01:43.419 [Pipeline] readFile
00:01:43.442 [Pipeline] withEnv
00:01:43.444 [Pipeline] {
00:01:43.456 [Pipeline] sh
00:01:43.744 + set -ex
00:01:43.744 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:43.744 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:43.744 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:43.744 ++ SPDK_RUN_UBSAN=1
00:01:43.744 ++ SPDK_TEST_FUZZER=1
00:01:43.744 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:43.744 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:43.744 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:43.744 ++ RUN_NIGHTLY=1
00:01:43.744 + case $SPDK_TEST_NVMF_NICS in
00:01:43.744 + DRIVERS=
00:01:43.744 + [[ -n '' ]]
00:01:43.744 + exit 0
00:01:43.754 [Pipeline] }
00:01:43.767 [Pipeline] // withEnv
00:01:43.772 [Pipeline] }
00:01:43.785 [Pipeline] // stage
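The autorun-spdk.conf printed and then sourced in the Prepare stage above is a flat KEY=value shell fragment: `set -ex` makes every sourced assignment visible as the `++` lines in the trace. A minimal sketch of consuming such a config the same way, assuming the file layout from the log (the require_conf helper name is hypothetical):

  #!/usr/bin/env bash
  # Hypothetical sketch: load a flat KEY=value config like autorun-spdk.conf.
  set -ex
  require_conf() {
    local conf="$1"
    [[ -f "$conf" ]]   # same existence test as the "+ [[ -f ... ]]" line above
    source "$conf"     # each KEY=value line becomes a shell variable
  }
  require_conf /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
  # Downstream steps can branch on the loaded flags, e.g.:
  [[ "${SPDK_TEST_FUZZER_SHORT:-0}" -eq 1 ]] && echo "short fuzz pass requested" || true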
00:01:43.793 [Pipeline] catchError
00:01:43.794 [Pipeline] {
00:01:43.808 [Pipeline] timeout
00:01:43.808 Timeout set to expire in 30 min
00:01:43.810 [Pipeline] {
00:01:43.824 [Pipeline] stage
00:01:43.826 [Pipeline] { (Tests)
00:01:43.839 [Pipeline] sh
00:01:44.130 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:44.130 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:44.130 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:44.130 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:44.130 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:44.130 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:44.130 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:44.130 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:44.130 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:44.130 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:44.130 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:44.130 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:44.130 + source /etc/os-release
00:01:44.130 ++ NAME='Fedora Linux'
00:01:44.130 ++ VERSION='39 (Cloud Edition)'
00:01:44.130 ++ ID=fedora
00:01:44.130 ++ VERSION_ID=39
00:01:44.130 ++ VERSION_CODENAME=
00:01:44.130 ++ PLATFORM_ID=platform:f39
00:01:44.130 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:44.130 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:44.130 ++ LOGO=fedora-logo-icon
00:01:44.130 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:44.130 ++ HOME_URL=https://fedoraproject.org/
00:01:44.130 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:44.130 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:44.130 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:44.130 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:44.130 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:44.130 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:44.130 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:44.130 ++ SUPPORT_END=2024-11-12
00:01:44.130 ++ VARIANT='Cloud Edition'
00:01:44.130 ++ VARIANT_ID=cloud
00:01:44.130 + uname -a
00:01:44.130 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:44.130 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:47.428 Hugepages
00:01:47.428 node hugesize free / total
00:01:47.428 node0 1048576kB 0 / 0
00:01:47.428 node0 2048kB 0 / 0
00:01:47.428 node1 1048576kB 0 / 0
00:01:47.428 node1 2048kB 0 / 0
00:01:47.429
00:01:47.429 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:47.429 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:47.429 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:47.429 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:47.429 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:47.429 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:47.429 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:47.429 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:47.429 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:47.429 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:47.429 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:47.429 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:47.429 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:47.429 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:47.429 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:47.429 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:47.429 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:47.429 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
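The `setup.sh status` table above summarizes the per-NUMA-node hugepage pools (all empty at this point) and the devices visible to SPDK. A minimal sketch of how such a hugepage table can be derived from sysfs, assuming the standard /sys/devices/system/node layout; print_hugepages is a hypothetical name and setup.sh's actual implementation may differ:

  #!/usr/bin/env bash
  # Hypothetical sketch: per-node hugepage summary from sysfs, similar in
  # spirit to the "setup.sh status" table above (not SPDK's actual code).
  print_hugepages() {
    echo "node hugesize free / total"
    for node in /sys/devices/system/node/node[0-9]*; do
      for hp in "$node"/hugepages/hugepages-*; do
        size=${hp##*hugepages-}                 # e.g. 2048kB or 1048576kB
        free=$(cat "$hp/free_hugepages")
        total=$(cat "$hp/nr_hugepages")
        echo "$(basename "$node") $size $free / $total"
      done
    done
  }
  print_hugepages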
00:01:47.429 + rm -f /tmp/spdk-ld-path
00:01:47.429 + source autorun-spdk.conf
00:01:47.429 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:47.429 ++ SPDK_RUN_UBSAN=1
00:01:47.429 ++ SPDK_TEST_FUZZER=1
00:01:47.429 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:47.429 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:47.429 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:47.429 ++ RUN_NIGHTLY=1
00:01:47.429 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:47.429 + [[ -n '' ]]
00:01:47.429 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:47.429 + for M in /var/spdk/build-*-manifest.txt
00:01:47.429 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:01:47.429 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:47.429 + for M in /var/spdk/build-*-manifest.txt
00:01:47.429 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:47.429 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:47.429 + for M in /var/spdk/build-*-manifest.txt
00:01:47.429 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:47.429 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:47.429 ++ uname
00:01:47.429 + [[ Linux == \L\i\n\u\x ]]
00:01:47.429 + sudo dmesg -T
00:01:47.429 + sudo dmesg --clear
00:01:47.429 + dmesg_pid=522042
00:01:47.429 + [[ Fedora Linux == FreeBSD ]]
00:01:47.429 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:47.429 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:47.429 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:47.429 + [[ -x /usr/src/fio-static/fio ]]
00:01:47.429 + export FIO_BIN=/usr/src/fio-static/fio
00:01:47.429 + FIO_BIN=/usr/src/fio-static/fio
00:01:47.429 + sudo dmesg -Tw
00:01:47.429 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:47.429 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:47.429 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:47.429 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:47.429 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:47.429 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:47.429 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:47.429 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:47.429 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:47.429 Test configuration:
00:01:47.429 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:47.429 SPDK_RUN_UBSAN=1
00:01:47.429 SPDK_TEST_FUZZER=1
00:01:47.429 SPDK_TEST_FUZZER_SHORT=1
00:01:47.429 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:47.429 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:47.429 RUN_NIGHTLY=1
10:48:45 -- common/autotest_common.sh@1689 -- $ [[ n == y ]]
10:48:45 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
10:48:45 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
10:48:45 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
10:48:45 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
10:48:45 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
10:48:45 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
10:48:45 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
10:48:45 -- paths/export.sh@5 -- $ export PATH
10:48:45 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
10:48:45 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
10:48:45 -- common/autobuild_common.sh@440 -- $ date +%s
10:48:45 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1734342525.XXXXXX
10:48:45 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1734342525.efGf77
10:48:45 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
10:48:45 -- common/autobuild_common.sh@446 -- $ '[' -n v23.11 ']'
10:48:45 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
10:48:45 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
10:48:45 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
10:48:45 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
10:48:45 -- common/autobuild_common.sh@456 -- $ get_config_params
10:48:45 -- common/autotest_common.sh@397 -- $ xtrace_disable
10:48:45 -- common/autotest_common.sh@10 -- $ set +x
10:48:45 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
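The config_params assembled above are the flags autobuild later passes to SPDK's ./configure, including --with-dpdk pointing at the external DPDK build directory prepared by this job. A minimal sketch of driving the same configuration by hand, assuming the workspace layout from the log (this is an illustration of the flags, not the autobuild code path itself; the trailing make invocation is one possible way to build):

  #!/usr/bin/env bash
  # Illustrative sketch (not autobuild itself): configure SPDK against the
  # prebuilt external DPDK shown in the log, with UBSan and coverage enabled.
  set -e
  ws=/var/jenkins/workspace/short-fuzz-phy-autotest
  cd "$ws/spdk"
  ./configure \
    --enable-debug --enable-werror \
    --enable-ubsan --enable-coverage \
    --disable-unit-tests \
    --with-dpdk="$ws/dpdk/build" \
    --with-vfio-user
  make -j"$(nproc)"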
10:48:45 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
10:48:45 -- spdk/autobuild.sh@12 -- $ umask 022
10:48:45 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
10:48:45 -- spdk/autobuild.sh@16 -- $ date -u
00:01:47.429 Mon Dec 16 09:48:45 AM UTC 2024
10:48:45 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:47.429 LTS-67-gc13c99a5e
10:48:45 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
10:48:45 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
10:48:45 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
10:48:45 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
10:48:45 -- common/autotest_common.sh@1093 -- $ xtrace_disable
10:48:45 -- common/autotest_common.sh@10 -- $ set +x
00:01:47.429 ************************************
00:01:47.429 START TEST ubsan
00:01:47.429 ************************************
10:48:45 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan'
00:01:47.429 using ubsan
00:01:47.429
00:01:47.429 real 0m0.000s
00:01:47.429 user 0m0.000s
00:01:47.429 sys 0m0.000s
10:48:45 -- common/autotest_common.sh@1115 -- $ xtrace_disable
10:48:45 -- common/autotest_common.sh@10 -- $ set +x
00:01:47.429 ************************************
00:01:47.429 END TEST ubsan
00:01:47.429 ************************************
10:48:45 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']'
10:48:45 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
10:48:45 -- common/autobuild_common.sh@432 -- $ run_test build_native_dpdk _build_native_dpdk
10:48:45 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
10:48:45 -- common/autotest_common.sh@1093 -- $ xtrace_disable
10:48:45 -- common/autotest_common.sh@10 -- $ set +x
00:01:47.429 ************************************
00:01:47.429 START TEST build_native_dpdk
00:01:47.429 ************************************
10:48:45 -- common/autotest_common.sh@1114 -- $ _build_native_dpdk
10:48:45 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
10:48:45 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
10:48:45 -- common/autobuild_common.sh@50 -- $ local compiler_version
10:48:45 -- common/autobuild_common.sh@51 -- $ local compiler
10:48:45 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
10:48:45 -- common/autobuild_common.sh@53 -- $ local repo=dpdk
10:48:45 -- common/autobuild_common.sh@55 -- $ compiler=gcc
10:48:45 -- common/autobuild_common.sh@61 -- $ export CC=gcc
10:48:45 -- common/autobuild_common.sh@61 -- $ CC=gcc
10:48:45 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
10:48:45 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
10:48:45 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
10:48:45 -- common/autobuild_common.sh@68 -- $ compiler_version=13
10:48:45 -- common/autobuild_common.sh@69 -- $ compiler_version=13
10:48:45 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
10:48:45 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
10:48:45 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
10:48:45 -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
10:48:45 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
10:48:45 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:01:47.430 eeb0605f11 version: 23.11.0
00:01:47.430 238778122a doc: update release notes for 23.11
00:01:47.430 46aa6b3cfc doc: fix description of RSS features
00:01:47.430 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:47.430 7e421ae345 devtools: support skipping forbid rule check
10:48:45 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
10:48:45 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
10:48:45 -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0
10:48:45 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
10:48:45 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
10:48:45 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
10:48:45 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
10:48:45 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
10:48:45 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
10:48:45 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
10:48:45 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n
10:48:45 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
10:48:45 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
10:48:45 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]]
10:48:45 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
10:48:45 -- common/autobuild_common.sh@168 -- $ uname -s
10:48:45 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']'
10:48:45 -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0
10:48:45 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 21.11.0
10:48:45 -- scripts/common.sh@332 -- $ local ver1 ver1_l
10:48:45 -- scripts/common.sh@333 -- $ local ver2 ver2_l
10:48:45 -- scripts/common.sh@335 -- $ IFS=.-:
10:48:45 -- scripts/common.sh@335 -- $ read -ra ver1
10:48:45 -- scripts/common.sh@336 -- $ IFS=.-:
10:48:45 -- scripts/common.sh@336 -- $ read -ra ver2
10:48:45 -- scripts/common.sh@337 -- $ local 'op=<'
10:48:45 -- scripts/common.sh@339 -- $ ver1_l=3
10:48:45 -- scripts/common.sh@340 -- $ ver2_l=3
10:48:45 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
10:48:45 -- scripts/common.sh@343 -- $ case "$op" in
10:48:45 -- scripts/common.sh@344 -- $ : 1
10:48:45 -- scripts/common.sh@363 -- $ (( v = 0 ))
10:48:45 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
10:48:45 -- scripts/common.sh@364 -- $ decimal 23
10:48:45 -- scripts/common.sh@352 -- $ local d=23
10:48:45 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]]
10:48:45 -- scripts/common.sh@354 -- $ echo 23
10:48:45 -- scripts/common.sh@364 -- $ ver1[v]=23
10:48:45 -- scripts/common.sh@365 -- $ decimal 21
10:48:45 -- scripts/common.sh@352 -- $ local d=21
10:48:45 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]]
10:48:45 -- scripts/common.sh@354 -- $ echo 21
10:48:45 -- scripts/common.sh@365 -- $ ver2[v]=21
10:48:45 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
10:48:45 -- scripts/common.sh@366 -- $ return 1
10:48:45 -- common/autobuild_common.sh@173 -- $ patch -p1
00:01:47.430 patching file config/rte_config.h
00:01:47.430 Hunk #1 succeeded at 60 (offset 1 line).
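The cmp_versions trace above splits each version on ".", then compares component by component: 23 > 21 in the first field, so `lt 23.11.0 21.11.0` returns 1 (not less) and the rte_config.h patch path is taken. A compact sketch of the same dotted-decimal comparison follows; version_lt is a hypothetical name for what scripts/common.sh exposes as `lt`, and it assumes purely numeric components as in the trace.

  #!/usr/bin/env bash
  # Hypothetical compact rewrite of the comparison traced above:
  # returns 0 when $1 < $2, 1 otherwise (so "lt 23.11.0 21.11.0" -> 1).
  version_lt() {
    local -a a b
    IFS=. read -ra a <<< "$1"
    IFS=. read -ra b <<< "$2"
    local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
    for (( i = 0; i < n; i++ )); do
      (( ${a[i]:-0} > ${b[i]:-0} )) && return 1   # strictly greater: not less
      (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # strictly smaller: less
    done
    return 1                                      # equal: not less
  }
  version_lt 23.11.0 21.11.0 && echo older || echo "not older"   # not older
  version_lt 23.11.0 24.07.0 && echo older || echo "not older"   # older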
10:48:45 -- common/autobuild_common.sh@176 -- $ lt 23.11.0 24.07.0
10:48:45 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 24.07.0
10:48:45 -- scripts/common.sh@332 -- $ local ver1 ver1_l
10:48:45 -- scripts/common.sh@333 -- $ local ver2 ver2_l
10:48:45 -- scripts/common.sh@335 -- $ IFS=.-:
10:48:45 -- scripts/common.sh@335 -- $ read -ra ver1
10:48:45 -- scripts/common.sh@336 -- $ IFS=.-:
10:48:45 -- scripts/common.sh@336 -- $ read -ra ver2
10:48:45 -- scripts/common.sh@337 -- $ local 'op=<'
10:48:45 -- scripts/common.sh@339 -- $ ver1_l=3
10:48:45 -- scripts/common.sh@340 -- $ ver2_l=3
10:48:45 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
10:48:45 -- scripts/common.sh@343 -- $ case "$op" in
10:48:45 -- scripts/common.sh@344 -- $ : 1
10:48:45 -- scripts/common.sh@363 -- $ (( v = 0 ))
10:48:45 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
10:48:45 -- scripts/common.sh@364 -- $ decimal 23
10:48:45 -- scripts/common.sh@352 -- $ local d=23
10:48:45 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]]
10:48:45 -- scripts/common.sh@354 -- $ echo 23
10:48:45 -- scripts/common.sh@364 -- $ ver1[v]=23
10:48:45 -- scripts/common.sh@365 -- $ decimal 24
10:48:45 -- scripts/common.sh@352 -- $ local d=24
10:48:45 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]]
10:48:45 -- scripts/common.sh@354 -- $ echo 24
10:48:45 -- scripts/common.sh@365 -- $ ver2[v]=24
10:48:45 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
10:48:45 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] ))
10:48:45 -- scripts/common.sh@367 -- $ return 0
10:48:45 -- common/autobuild_common.sh@177 -- $ patch -p1
00:01:47.430 patching file lib/pcapng/rte_pcapng.c
10:48:45 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false
10:48:45 -- common/autobuild_common.sh@181 -- $ uname -s
10:48:45 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']'
10:48:45 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base
10:48:45 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
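The meson invocation above configures DPDK out of tree with docs, kmods and tests disabled and only the driver set SPDK needs. The same command is restated below with one option per line for readability; $DPDK is shorthand for the workspace dpdk directory, the option values are copied verbatim from the log, and `meson setup` is the modern spelling that avoids the deprecation warning printed at the end of the configure output below.

  #!/usr/bin/env bash
  # Same configuration as the log's invocation, reflowed for readability.
  DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
  cd "$DPDK"
  meson setup build-tmp \
    --prefix="$DPDK/build" \
    --libdir lib \
    -Denable_docs=false \
    -Denable_kmods=false \
    -Dtests=false \
    -Dc_link_args= \
    '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
    -Dmachine=native \
    -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,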
00:01:52.713 The Meson build system
00:01:52.713 Version: 1.5.0
00:01:52.713 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:52.713 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:01:52.713 Build type: native build
00:01:52.713 Program cat found: YES (/usr/bin/cat)
00:01:52.713 Project name: DPDK
00:01:52.713 Project version: 23.11.0
00:01:52.713 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)")
00:01:52.713 C linker for the host machine: gcc ld.bfd 2.40-14
00:01:52.713 Host machine cpu family: x86_64
00:01:52.713 Host machine cpu: x86_64
00:01:52.713 Message: ## Building in Developer Mode ##
00:01:52.713 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:52.713 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:01:52.713 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:01:52.713 Program python3 found: YES (/usr/bin/python3)
00:01:52.713 Program cat found: YES (/usr/bin/cat)
00:01:52.713 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
00:01:52.713 Compiler for C supports arguments -march=native: YES
00:01:52.713 Checking for size of "void *" : 8
00:01:52.713 Checking for size of "void *" : 8 (cached)
00:01:52.713 Library m found: YES
00:01:52.713 Library numa found: YES
00:01:52.713 Has header "numaif.h" : YES
00:01:52.713 Library fdt found: NO
00:01:52.713 Library execinfo found: NO
00:01:52.713 Has header "execinfo.h" : YES
00:01:52.713 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:01:52.713 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:52.713 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:52.713 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:52.713 Run-time dependency openssl found: YES 3.1.1
00:01:52.713 Run-time dependency libpcap found: YES 1.10.4
00:01:52.713 Has header "pcap.h" with dependency libpcap: YES
00:01:52.713 Compiler for C supports arguments -Wcast-qual: YES
00:01:52.713 Compiler for C supports arguments -Wdeprecated: YES
00:01:52.713 Compiler for C supports arguments -Wformat: YES
00:01:52.713 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:52.713 Compiler for C supports arguments -Wformat-security: NO
00:01:52.713 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:52.713 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:52.713 Compiler for C supports arguments -Wnested-externs: YES
00:01:52.713 Compiler for C supports arguments -Wold-style-definition: YES
00:01:52.713 Compiler for C supports arguments -Wpointer-arith: YES
00:01:52.713 Compiler for C supports arguments -Wsign-compare: YES
00:01:52.713 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:52.713 Compiler for C supports arguments -Wundef: YES
00:01:52.713 Compiler for C supports arguments -Wwrite-strings: YES
00:01:52.713 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:52.713 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:52.713 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:52.713 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:52.713 Program objdump found: YES (/usr/bin/objdump)
00:01:52.713 Compiler for C supports arguments -mavx512f: YES
00:01:52.713 Checking if "AVX512 checking" compiles: YES
00:01:52.713 Fetching value of define "__SSE4_2__" : 1
00:01:52.713 Fetching value of define "__AES__" : 1
00:01:52.713 Fetching value of define "__AVX__" : 1
00:01:52.713 Fetching value of define "__AVX2__" : 1
00:01:52.713 Fetching value of define "__AVX512BW__" : 1
00:01:52.713 Fetching value of define "__AVX512CD__" : 1
00:01:52.713 Fetching value of define "__AVX512DQ__" : 1
00:01:52.713 Fetching value of define "__AVX512F__" : 1
00:01:52.713 Fetching value of define "__AVX512VL__" : 1
00:01:52.713 Fetching value of define "__PCLMUL__" : 1
00:01:52.713 Fetching value of define "__RDRND__" : 1
00:01:52.713 Fetching value of define "__RDSEED__" : 1
00:01:52.713 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:52.713 Fetching value of define "__znver1__" : (undefined)
00:01:52.713 Fetching value of define "__znver2__" : (undefined)
00:01:52.713 Fetching value of define "__znver3__" : (undefined)
00:01:52.713 Fetching value of define "__znver4__" : (undefined)
00:01:52.713 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:52.713 Message: lib/log: Defining dependency "log"
00:01:52.713 Message: lib/kvargs: Defining dependency "kvargs"
00:01:52.713 Message: lib/telemetry: Defining dependency "telemetry"
00:01:52.713 Checking for function "getentropy" : NO
00:01:52.713 Message: lib/eal: Defining dependency "eal"
00:01:52.713 Message: lib/ring: Defining dependency "ring"
00:01:52.713 Message: lib/rcu: Defining dependency "rcu"
00:01:52.713 Message: lib/mempool: Defining dependency "mempool"
00:01:52.713 Message: lib/mbuf: Defining dependency "mbuf"
00:01:52.713 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:52.713 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:52.713 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:52.713 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:52.713 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:52.713 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:52.713 Compiler for C supports arguments -mpclmul: YES
00:01:52.713 Compiler for C supports arguments -maes: YES
00:01:52.713 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:52.713 Compiler for C supports arguments -mavx512bw: YES
00:01:52.713 Compiler for C supports arguments -mavx512dq: YES
00:01:52.713 Compiler for C supports arguments -mavx512vl: YES
00:01:52.713 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:52.713 Compiler for C supports arguments -mavx2: YES
00:01:52.713 Compiler for C supports arguments -mavx: YES
00:01:52.713 Message: lib/net: Defining dependency "net"
00:01:52.713 Message: lib/meter: Defining dependency "meter"
00:01:52.713 Message: lib/ethdev: Defining dependency "ethdev"
00:01:52.713 Message: lib/pci: Defining dependency "pci"
00:01:52.713 Message: lib/cmdline: Defining dependency "cmdline"
00:01:52.713 Message: lib/metrics: Defining dependency "metrics"
00:01:52.713 Message: lib/hash: Defining dependency "hash"
00:01:52.713 Message: lib/timer: Defining dependency "timer"
00:01:52.713 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:52.713 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:52.713 Fetching value of define "__AVX512CD__" : 1 (cached)
00:01:52.713 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:52.713 Message: lib/acl: Defining dependency "acl"
00:01:52.713 Message: lib/bbdev: Defining dependency "bbdev"
00:01:52.713 Message: lib/bitratestats: Defining dependency "bitratestats"
00:01:52.713 Run-time dependency libelf found: YES 0.191
00:01:52.713 Message: lib/bpf: Defining dependency "bpf"
00:01:52.713 Message: lib/cfgfile: Defining dependency "cfgfile"
00:01:52.713 Message: lib/compressdev: Defining dependency "compressdev"
00:01:52.713 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:52.713 Message: lib/distributor: Defining dependency "distributor"
00:01:52.713 Message: lib/dmadev: Defining dependency "dmadev"
00:01:52.713 Message: lib/efd: Defining dependency "efd"
00:01:52.713 Message: lib/eventdev: Defining dependency "eventdev"
00:01:52.713 Message: lib/dispatcher: Defining dependency "dispatcher"
00:01:52.713 Message: lib/gpudev: Defining dependency "gpudev"
00:01:52.713 Message: lib/gro: Defining dependency "gro"
00:01:52.713 Message: lib/gso: Defining dependency "gso"
00:01:52.713 Message: lib/ip_frag: Defining dependency "ip_frag"
00:01:52.713 Message: lib/jobstats: Defining dependency "jobstats"
00:01:52.713 Message: lib/latencystats: Defining dependency "latencystats"
00:01:52.713 Message: lib/lpm: Defining dependency "lpm"
00:01:52.714 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:52.714 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:52.714 Fetching value of define "__AVX512IFMA__" : (undefined)
00:01:52.714 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES
00:01:52.714 Message: lib/member: Defining dependency "member"
00:01:52.714 Message: lib/pcapng: Defining dependency "pcapng"
00:01:52.714 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:52.714 Message: lib/power: Defining dependency "power"
00:01:52.714 Message: lib/rawdev: Defining dependency "rawdev"
00:01:52.714 Message: lib/regexdev: Defining dependency "regexdev"
00:01:52.714 Message: lib/mldev: Defining dependency "mldev"
00:01:52.714 Message: lib/rib: Defining dependency "rib"
00:01:52.714 Message: lib/reorder: Defining dependency "reorder"
00:01:52.714 Message: lib/sched: Defining dependency "sched"
00:01:52.714 Message: lib/security: Defining dependency "security"
00:01:52.714 Message: lib/stack: Defining dependency "stack"
00:01:52.714 Has header "linux/userfaultfd.h" : YES
00:01:52.714 Has header "linux/vduse.h" : YES
00:01:52.714 Message: lib/vhost: Defining dependency "vhost"
00:01:52.714 Message: lib/ipsec: Defining dependency "ipsec"
00:01:52.714 Message: lib/pdcp: Defining dependency "pdcp"
00:01:52.714 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:52.714 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:52.714 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:52.714 Message: lib/fib: Defining dependency "fib"
00:01:52.714 Message: lib/port: Defining dependency "port"
00:01:52.714 Message: lib/pdump: Defining dependency "pdump"
00:01:52.714 Message: lib/table: Defining dependency "table"
00:01:52.714 Message: lib/pipeline: Defining dependency "pipeline"
00:01:52.714 Message: lib/graph: Defining dependency "graph"
00:01:52.714 Message: lib/node: Defining dependency "node"
00:01:53.285 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:53.285 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:53.285 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:53.285 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:53.285 Compiler for C supports arguments -Wno-sign-compare: YES
00:01:53.285 Compiler for C supports arguments -Wno-unused-value: YES
00:01:53.285 Compiler for C supports arguments -Wno-format: YES
00:01:53.285 Compiler for C supports arguments -Wno-format-security: YES
00:01:53.285 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:01:53.285 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:01:53.285 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:01:53.285 Compiler for C supports arguments -Wno-unused-parameter: YES
00:01:53.285 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:53.285 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:53.285 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:53.285 Compiler for C supports arguments -mavx512bw: YES (cached)
00:01:53.285 Compiler for C supports arguments -march=skylake-avx512: YES
00:01:53.285 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:01:53.285 Has header "sys/epoll.h" : YES
00:01:53.285 Program doxygen found: YES (/usr/local/bin/doxygen)
00:01:53.285 Configuring doxy-api-html.conf using configuration
00:01:53.285 Configuring doxy-api-man.conf using configuration
00:01:53.285 Program mandb found: YES (/usr/bin/mandb)
00:01:53.285 Program sphinx-build found: NO
00:01:53.285 Configuring rte_build_config.h using configuration
00:01:53.285 Message:
00:01:53.285 =================
00:01:53.285 Applications Enabled
00:01:53.285 =================
00:01:53.285
00:01:53.285 apps:
00:01:53.285 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf,
00:01:53.285 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline,
00:01:53.285 test-pmd, test-regex, test-sad, test-security-perf,
00:01:53.285
00:01:53.285 Message:
00:01:53.285 =================
00:01:53.285 Libraries Enabled
00:01:53.285 =================
00:01:53.285
00:01:53.285 libs:
00:01:53.285 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:01:53.285 net, meter, ethdev, pci, cmdline, metrics, hash, timer,
00:01:53.285 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor,
00:01:53.285 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag,
00:01:53.285 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev,
00:01:53.285 mldev, rib, reorder, sched, security, stack, vhost, ipsec,
00:01:53.285 pdcp, fib, port, pdump, table, pipeline, graph, node,
00:01:53.285
00:01:53.285
00:01:53.285 Message:
00:01:53.285 ===============
00:01:53.285 Drivers Enabled
00:01:53.285 ===============
00:01:53.285
00:01:53.285 common:
00:01:53.285
00:01:53.285 bus:
00:01:53.285 pci, vdev,
00:01:53.285 mempool:
00:01:53.285 ring,
00:01:53.285 dma:
00:01:53.285
00:01:53.285 net:
00:01:53.285 i40e,
00:01:53.285 raw:
00:01:53.285
00:01:53.285 crypto:
00:01:53.285
00:01:53.285 compress:
00:01:53.285
00:01:53.285 regex:
00:01:53.285
00:01:53.285 ml:
00:01:53.285
00:01:53.285 vdpa:
00:01:53.285
00:01:53.285 event:
00:01:53.285
00:01:53.285 baseband:
00:01:53.285
00:01:53.285 gpu:
00:01:53.285
00:01:53.285
00:01:53.285 Message:
00:01:53.285 =================
00:01:53.285 Content Skipped
00:01:53.285 =================
00:01:53.285
00:01:53.285 apps:
00:01:53.285
00:01:53.285 libs:
00:01:53.285
00:01:53.285 drivers:
00:01:53.285 common/cpt: not in enabled drivers build config
00:01:53.285 common/dpaax: not in enabled drivers build config
00:01:53.285 common/iavf: not in enabled drivers build config
00:01:53.285 common/idpf: not in enabled drivers build config
00:01:53.285 common/mvep: not in enabled drivers build config
00:01:53.285 common/octeontx: not in enabled drivers build config
00:01:53.285 bus/auxiliary: not in enabled drivers build config
00:01:53.285 bus/cdx: not in enabled drivers build config
00:01:53.285 bus/dpaa: not in enabled drivers build config
00:01:53.285 bus/fslmc: not in enabled drivers build config
00:01:53.285 bus/ifpga: not in enabled drivers build config
00:01:53.285 bus/platform: not in enabled drivers build config
00:01:53.285 bus/vmbus: not in enabled drivers build config
00:01:53.285 common/cnxk: not in enabled drivers build config
00:01:53.285 common/mlx5: not in enabled drivers build config
00:01:53.285 common/nfp: not in enabled drivers build config
00:01:53.285 common/qat: not in enabled drivers build config
00:01:53.285 common/sfc_efx: not in enabled drivers build config
00:01:53.285 mempool/bucket: not in enabled drivers build config
00:01:53.285 mempool/cnxk: not in enabled drivers build config
00:01:53.285 mempool/dpaa: not in enabled drivers build config
00:01:53.285 mempool/dpaa2: not in enabled drivers build config
00:01:53.285 mempool/octeontx: not in enabled drivers build config
00:01:53.285 mempool/stack: not in enabled drivers build config
00:01:53.285 dma/cnxk: not in enabled drivers build config
00:01:53.285 dma/dpaa: not in enabled drivers build config
00:01:53.285 dma/dpaa2: not in enabled drivers build config
00:01:53.285 dma/hisilicon: not in enabled drivers build config
00:01:53.285 dma/idxd: not in enabled drivers build config
00:01:53.285 dma/ioat: not in enabled drivers build config
00:01:53.285 dma/skeleton: not in enabled drivers build config
00:01:53.285 net/af_packet: not in enabled drivers build config
00:01:53.285 net/af_xdp: not in enabled drivers build config
00:01:53.285 net/ark: not in enabled drivers build config
00:01:53.285 net/atlantic: not in enabled drivers build config
00:01:53.285 net/avp: not in enabled drivers build config
00:01:53.285 net/axgbe: not in enabled drivers build config
00:01:53.285 net/bnx2x: not in enabled drivers build config
00:01:53.285 net/bnxt: not in enabled drivers build config
00:01:53.285 net/bonding: not in enabled drivers build config
00:01:53.285 net/cnxk: not in enabled drivers build config
00:01:53.285 net/cpfl: not in enabled drivers build config
00:01:53.285 net/cxgbe: not in enabled drivers build config
00:01:53.286 net/dpaa: not in enabled drivers build config
00:01:53.286 net/dpaa2: not in enabled drivers build config
00:01:53.286 net/e1000: not in enabled drivers build config
00:01:53.286 net/ena: not in enabled drivers build config
00:01:53.286 net/enetc: not in enabled drivers build config
00:01:53.286 net/enetfec: not in enabled drivers build config
00:01:53.286 net/enic: not in enabled drivers build config
00:01:53.286 net/failsafe: not in enabled drivers build config
00:01:53.286 net/fm10k: not in enabled drivers build config
00:01:53.286 net/gve: not in enabled drivers build config
00:01:53.286 net/hinic: not in enabled drivers build config
00:01:53.286 net/hns3: not in enabled drivers build config
00:01:53.286 net/iavf: not in enabled drivers build config
00:01:53.286 net/ice: not in enabled drivers build config
00:01:53.286 net/idpf: not in enabled drivers build config
00:01:53.286 net/igc: not in enabled drivers build config
00:01:53.286 net/ionic: not in enabled drivers build config
00:01:53.286 net/ipn3ke: not in enabled drivers build config
00:01:53.286 net/ixgbe: not in enabled drivers build config
00:01:53.286 net/mana: not in enabled drivers build config
00:01:53.286 net/memif: not in enabled drivers build config
00:01:53.286 net/mlx4: not in enabled drivers build config
00:01:53.286 net/mlx5: not in enabled drivers build config
00:01:53.286 net/mvneta: not in enabled drivers build config
00:01:53.286 net/mvpp2: not in enabled drivers build config
00:01:53.286 net/netvsc: not in enabled drivers build config
00:01:53.286 net/nfb: not in enabled drivers build config
00:01:53.286 net/nfp: not in enabled drivers build config
00:01:53.286 net/ngbe: not in enabled drivers build config
00:01:53.286 net/null: not in enabled drivers build config
00:01:53.286 net/octeontx: not in enabled drivers build config
00:01:53.286 net/octeon_ep: not in enabled drivers build config
00:01:53.286 net/pcap: not in enabled drivers build config
00:01:53.286 net/pfe: not in enabled drivers build config
00:01:53.286 net/qede: not in enabled drivers build config
00:01:53.286 net/ring: not in enabled drivers build config
00:01:53.286 net/sfc: not in enabled drivers build config
00:01:53.286 net/softnic: not in enabled drivers build config
00:01:53.286 net/tap: not in enabled drivers build config
00:01:53.286 net/thunderx: not in enabled drivers build config
00:01:53.286 net/txgbe: not in enabled drivers build config
00:01:53.286 net/vdev_netvsc: not in enabled drivers build config
00:01:53.286 net/vhost: not in enabled drivers build config
00:01:53.286 net/virtio: not in enabled drivers build config
00:01:53.286 net/vmxnet3: not in enabled drivers build config
00:01:53.286 raw/cnxk_bphy: not in enabled drivers build config
00:01:53.286 raw/cnxk_gpio: not in enabled drivers build config
00:01:53.286 raw/dpaa2_cmdif: not in enabled drivers build config
00:01:53.286 raw/ifpga: not in enabled drivers build config
00:01:53.286 raw/ntb: not in enabled drivers build config
00:01:53.286 raw/skeleton: not in enabled drivers build config
00:01:53.286 crypto/armv8: not in enabled drivers build config
00:01:53.286 crypto/bcmfs: not in enabled drivers build config
00:01:53.286 crypto/caam_jr: not in enabled drivers build config
00:01:53.286 crypto/ccp: not in enabled drivers build config
00:01:53.286 crypto/cnxk: not in enabled drivers build config
00:01:53.286 crypto/dpaa_sec: not in enabled drivers build config
00:01:53.286 crypto/dpaa2_sec: not in enabled drivers build config
00:01:53.286 crypto/ipsec_mb: not in enabled drivers build config
00:01:53.286 crypto/mlx5: not in enabled drivers build config
00:01:53.286 crypto/mvsam: not in enabled drivers build config
00:01:53.286 crypto/nitrox: not in enabled drivers build config
00:01:53.286 crypto/null: not in enabled drivers build config
00:01:53.286 crypto/octeontx: not in enabled drivers build config
00:01:53.286 crypto/openssl: not in enabled drivers build config
00:01:53.286 crypto/scheduler: not in enabled drivers build config
00:01:53.286 crypto/uadk: not in enabled drivers build config
00:01:53.286 crypto/virtio: not in enabled drivers build config
00:01:53.286 compress/isal: not in enabled drivers build config
00:01:53.286 compress/mlx5: not in enabled drivers build config
00:01:53.286 compress/octeontx: not in enabled drivers build config
00:01:53.286 compress/zlib: not in enabled drivers build config
00:01:53.286 regex/mlx5: not in enabled drivers build config
00:01:53.286 regex/cn9k: not in enabled drivers build config
00:01:53.286 ml/cnxk: not in enabled drivers build config
00:01:53.286 vdpa/ifc: not in enabled drivers build config
00:01:53.286 vdpa/mlx5: not in enabled drivers build config
00:01:53.286 vdpa/nfp: not in enabled drivers build config
00:01:53.286 vdpa/sfc: not in enabled drivers build config
00:01:53.286 event/cnxk: not in enabled drivers build config
00:01:53.286 event/dlb2: not in enabled drivers build config
00:01:53.286 event/dpaa: not in enabled drivers build config
00:01:53.286 event/dpaa2: not in enabled drivers build config
00:01:53.286 event/dsw: not in enabled drivers build config
00:01:53.286 event/opdl: not in enabled drivers build config
00:01:53.286 event/skeleton: not in enabled drivers build config
00:01:53.286 event/sw: not in enabled drivers build config
00:01:53.286 event/octeontx: not in enabled drivers build config
00:01:53.286 baseband/acc: not in enabled drivers build config
00:01:53.286 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:01:53.286 baseband/fpga_lte_fec: not in enabled drivers build config
00:01:53.286 baseband/la12xx: not in enabled drivers build config
00:01:53.286 baseband/null: not in enabled drivers build config
00:01:53.286 baseband/turbo_sw: not in enabled drivers build config
00:01:53.286 gpu/cuda: not in enabled drivers build config
00:01:53.286
00:01:53.286
00:01:53.286 Build targets in project: 217
00:01:53.286
00:01:53.286 DPDK 23.11.0
00:01:53.286
00:01:53.286 User defined options
00:01:53.286 libdir : lib
00:01:53.286 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:53.286 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:01:53.286 c_link_args :
00:01:53.286 enable_docs : false
00:01:53.286 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:53.286 enable_kmods : false
00:01:53.286 machine : native
00:01:53.286 tests : false
00:01:53.286
00:01:53.286 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:53.286 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
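With configuration done, autobuild drives the compile below with `ninja -C <builddir> -j112`, i.e. 112 parallel jobs, a count presumably sized to this build host. A minimal sketch of the same build step with the job count derived from the running machine instead (build_dpdk is a hypothetical wrapper name):

  #!/usr/bin/env bash
  # Minimal sketch of the build step that follows: run ninja in the
  # configured build directory; -j is derived from nproc here rather
  # than the fixed -j112 used in the log.
  build_dpdk() {
    local builddir="$1"
    ninja -C "$builddir" -j "$(nproc)"
  }
  build_dpdk /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp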
00:01:53.286 10:48:51 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
00:01:53.553 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:01:53.553 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:01:53.553 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:01:53.553 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:01:53.553 [4/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:01:53.553 [5/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:01:53.553 [6/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:01:53.553 [7/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:01:53.553 [8/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:01:53.553 [9/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:01:53.553 [10/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:01:53.553 [11/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:01:53.553 [12/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:01:53.816 [13/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:01:53.816 [14/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:01:53.816 [15/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:01:53.816 [16/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:01:53.816 [17/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:01:53.816 [18/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:01:53.816 [19/707] Linking static target lib/librte_kvargs.a
00:01:53.816 [20/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:01:53.816 [21/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:01:53.816 [22/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:01:53.816 [23/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:01:53.816 [24/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:01:53.816 [25/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:01:53.816 [26/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:01:53.816 [27/707] Linking static target lib/librte_pci.a
00:01:53.816 [28/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:01:53.816 [29/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:01:53.816 [30/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:01:53.816 [31/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:01:53.816 [32/707] Compiling C object lib/librte_log.a.p/log_log.c.o
00:01:53.816 [33/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:01:53.816 [34/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:01:54.077 [35/707] Linking static target lib/librte_log.a
00:01:54.077 [36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:01:54.077 [37/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:01:54.077 [38/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:01:54.077 [39/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:01:54.077 [40/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:01:54.077 [41/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:01:54.077 [42/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:01:54.077 [43/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:01:54.077 [44/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:01:54.077 [45/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:01:54.340 [46/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:01:54.340 [47/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:01:54.340 [48/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:01:54.340 [49/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:01:54.340 [50/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:01:54.340 [51/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:01:54.340 [52/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:01:54.340 [53/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:01:54.340 [54/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:01:54.340 [55/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:01:54.340 [56/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:01:54.340 [57/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:01:54.340 [58/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:01:54.340 [59/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:01:54.340 [60/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:01:54.340 [61/707] Linking static target lib/librte_meter.a
00:01:54.340 [62/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:01:54.340 [63/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:01:54.340 [64/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:01:54.340 [65/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:01:54.340 [66/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:01:54.340 [67/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:01:54.340 [68/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:01:54.340 [69/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:01:54.340 [70/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:01:54.340 [71/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:01:54.340 [72/707] Linking static target lib/librte_ring.a
00:01:54.340 [73/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o
00:01:54.340 [74/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:01:54.340 [75/707] Linking static target lib/librte_cmdline.a
00:01:54.340 [76/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:01:54.340 [77/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:01:54.340 [78/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:01:54.340 [79/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:01:54.340 [80/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:01:54.340 [81/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:01:54.340 [82/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:01:54.340 [83/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:01:54.340 [84/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:01:54.340 [85/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:01:54.340 [86/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:01:54.340 [87/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:01:54.340 [88/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:01:54.340 [89/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:01:54.340 [90/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o
00:01:54.340 [91/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o
00:01:54.340 [92/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:01:54.340 [93/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o
00:01:54.340 [94/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:01:54.340 [95/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:01:54.340 [96/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:01:54.340 [97/707] Linking static target lib/librte_metrics.a
00:01:54.340 [98/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o
00:01:54.340 [99/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:01:54.340 [100/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:01:54.340 [101/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:01:54.340 [102/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o
00:01:54.340 [103/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:01:54.340 [104/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o
00:01:54.605 [105/707] Linking static target lib/net/libnet_crc_avx512_lib.a
00:01:54.605 [106/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o
00:01:54.605 [107/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:01:54.605 [108/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:01:54.605 [109/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o
00:01:54.605 [110/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:01:54.605 [111/707] Linking static target lib/librte_bitratestats.a
00:01:54.605 [112/707] Linking static target lib/librte_net.a
00:01:54.605 [113/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o
00:01:54.605 [114/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:01:54.605 [115/707] Linking static target lib/librte_cfgfile.a
00:01:54.605 [116/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:01:54.605 [117/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:01:54.605 [118/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:01:54.605 [119/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:01:54.605 [120/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:01:54.605 [121/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:01:54.605 [122/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:01:54.605 [123/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:01:54.605 [124/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:01:54.605 [125/707] Linking target lib/librte_log.so.24.0
00:01:54.605 [126/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:01:54.605 [127/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:01:54.605 [128/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:01:54.605 [129/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:01:54.605 [130/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o
00:01:54.605 [131/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:01:54.605 [132/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:01:54.605 [133/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:01:54.605 [134/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:01:54.605 [135/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
00:01:54.605 [136/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o
00:01:54.605 [137/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:01:54.605 [138/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o
00:01:54.605 [139/707] Linking static target lib/librte_timer.a
00:01:54.605 [140/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:01:54.605 [141/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:01:54.605 [142/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o
00:01:54.605 [143/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:01:54.605 [144/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o
00:01:54.605 [145/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output)
00:01:54.605 [146/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols
00:01:54.605 [147/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:01:54.605 [148/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:01:54.605 [149/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:01:54.605 [150/707] Linking static target lib/librte_mempool.a
00:01:54.605 [151/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output)
00:01:54.605 [152/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o
00:01:54.605 [153/707] Linking target lib/librte_kvargs.so.24.0
00:01:54.605 [154/707] Compiling C
object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:54.868 [155/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:54.868 [156/707] Linking static target lib/librte_bbdev.a 00:01:54.868 [157/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:54.868 [158/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:54.868 [159/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:54.868 [160/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:54.868 [161/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:54.868 [162/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:55.140 [163/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:55.140 [164/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.140 [165/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:55.140 [166/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:55.140 [167/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:55.140 [168/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:55.140 [169/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:55.140 [170/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:55.140 [171/707] Linking static target lib/librte_jobstats.a 00:01:55.140 [172/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:55.140 [173/707] Linking static target lib/librte_compressdev.a 00:01:55.140 [174/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:55.140 [175/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:55.140 [176/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:55.140 [177/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.140 [178/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:55.140 [179/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:55.140 [180/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:55.140 [181/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:55.140 [182/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:55.140 [183/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:55.140 [184/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:55.140 [185/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:55.140 [186/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:55.140 [187/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:55.140 [188/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:55.140 [189/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:55.140 [190/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:55.140 [191/707] Linking static target lib/librte_dispatcher.a 00:01:55.140 [192/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:55.140 [193/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:55.140 [194/707] Linking static target 
lib/member/libsketch_avx512_tmp.a 00:01:55.140 [195/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:55.140 [196/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:55.402 [197/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:55.402 [198/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:55.402 [199/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:55.402 [200/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:55.402 [201/707] Linking static target lib/librte_latencystats.a 00:01:55.402 [202/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:55.402 [203/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:55.402 [204/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:55.402 [205/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:55.402 [206/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:55.402 [207/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:55.402 [208/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:55.402 [209/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:55.402 [210/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:55.402 [211/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:55.402 [212/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.402 [213/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:55.402 [214/707] Linking static target lib/librte_rcu.a 00:01:55.402 [215/707] Linking static target lib/librte_telemetry.a 00:01:55.402 [216/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:55.402 [217/707] Linking static target lib/librte_gro.a 00:01:55.402 [218/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:55.402 [219/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:55.402 [220/707] Linking static target lib/librte_eal.a 00:01:55.402 [221/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:55.402 [222/707] Linking static target lib/librte_stack.a 00:01:55.402 [223/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:55.402 [224/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:55.402 [225/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:55.402 [226/707] Linking static target lib/librte_dmadev.a 00:01:55.402 [227/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:55.402 [228/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:55.402 [229/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:55.402 [230/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:55.402 [231/707] Linking static target lib/librte_regexdev.a 00:01:55.402 [232/707] Linking static target lib/librte_gpudev.a 00:01:55.402 [233/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:55.402 [234/707] Linking static target lib/librte_distributor.a 00:01:55.402 [235/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:55.402 [236/707] Linking static target lib/librte_gso.a 00:01:55.402 
[237/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:55.402 [238/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:55.402 [239/707] Linking static target lib/librte_mldev.a 00:01:55.402 [240/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:55.402 [241/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.402 [242/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:55.402 [243/707] Linking static target lib/librte_rawdev.a 00:01:55.402 [244/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:55.667 [245/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:55.667 [246/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:55.667 [247/707] Linking static target lib/librte_power.a 00:01:55.667 [248/707] Linking static target lib/librte_mbuf.a 00:01:55.667 [249/707] Linking static target lib/librte_ip_frag.a 00:01:55.667 [250/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:55.667 [251/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:55.667 [252/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:55.667 [253/707] Linking static target lib/librte_pcapng.a 00:01:55.667 [254/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:55.667 [255/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.667 [256/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:55.667 [257/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.667 [258/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:55.667 [259/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:55.667 [260/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:55.668 [261/707] Linking static target lib/librte_reorder.a 00:01:55.668 [262/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:55.668 [263/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:55.668 [264/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:55.668 [265/707] Linking static target lib/librte_bpf.a 00:01:55.668 [266/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:55.668 [267/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.668 [268/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:55.668 [269/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:55.668 [270/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:55.668 [271/707] Linking static target lib/librte_security.a 00:01:55.668 [272/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.668 [273/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.668 [274/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:55.668 [275/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.933 [276/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:55.933 [277/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:55.933 [278/707] Compiling C object 
lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:55.933 [279/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:55.933 [280/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:55.933 [281/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:55.933 [282/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.933 [283/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.933 [284/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.933 [285/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.933 [286/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:55.933 [287/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:55.933 [288/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:55.933 [289/707] Linking static target lib/librte_lpm.a 00:01:55.933 [290/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:55.933 [291/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:55.933 [292/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.933 [293/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:55.933 [294/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:55.933 [295/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:55.933 [296/707] Linking static target lib/librte_rib.a 00:01:55.933 [297/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.934 [298/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:55.934 [299/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:55.934 [300/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:55.934 [301/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.934 [302/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.934 [303/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.195 [304/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:56.195 [305/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:56.195 [306/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:56.195 [307/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:56.195 [308/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:56.195 [309/707] Linking target lib/librte_telemetry.so.24.0 00:01:56.195 [310/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:56.195 [311/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:56.195 [312/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:56.195 [313/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:56.195 [314/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.195 [315/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:56.195 [316/707] Compiling C object 
lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:56.195 [317/707] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.195 [318/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:56.195 [319/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:56.195 [320/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:56.195 [321/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:56.195 [322/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:56.195 [323/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:56.195 [324/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:56.195 [325/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.195 [326/707] Linking static target lib/librte_efd.a 00:01:56.195 [327/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:56.195 [328/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:56.461 [329/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:56.461 [330/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:56.461 [331/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:56.461 [332/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:56.461 [333/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:56.461 [334/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:56.461 [335/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:56.461 [336/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:56.461 [337/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:56.461 [338/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.461 [339/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:56.461 [340/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:56.461 [341/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:56.461 [342/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.461 [343/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:56.461 [344/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.461 [345/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:56.461 [346/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:56.461 [347/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:56.461 [348/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:56.461 [349/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:56.461 [350/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:56.461 [351/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:56.461 [352/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:56.461 [353/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.727 [354/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:56.727 [355/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by 
meson to capture output) 00:01:56.727 [356/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:56.727 [357/707] Linking static target lib/librte_fib.a 00:01:56.727 [358/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:56.727 [359/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:56.727 [360/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:56.727 [361/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:56.727 [362/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.727 [363/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:56.727 [364/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:56.727 [365/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:56.727 [366/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:56.727 [367/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:56.727 [368/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:56.727 [369/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:56.727 [370/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:56.727 [371/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.727 [372/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:56.727 [373/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:56.727 [374/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:56.727 [375/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:56.727 [376/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:56.727 [377/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:56.727 [378/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.986 [379/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:56.986 [380/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:56.986 [381/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:56.986 [382/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:56.986 [383/707] Linking static target lib/librte_graph.a 00:01:56.986 [384/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:56.986 [385/707] Linking static target lib/librte_pdump.a 00:01:56.986 [386/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:56.986 [387/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:56.986 [388/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:56.986 [389/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:56.986 [390/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:56.986 [391/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:56.986 [392/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:01:56.986 [393/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:56.986 [394/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:56.986 [395/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:56.986 [396/707] Compiling C object 
app/dpdk-graph.p/graph_utils.c.o 00:01:56.986 [397/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:56.986 [398/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:01:56.986 [399/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:56.986 [400/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:56.986 [401/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:56.986 [402/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:56.986 [403/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:57.249 [404/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:57.249 [405/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:57.249 [406/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:57.249 [407/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:57.249 [408/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:57.249 [409/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:57.249 [410/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:57.249 [411/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:57.249 [412/707] Linking static target drivers/librte_bus_vdev.a 00:01:57.249 [413/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.249 [414/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:57.249 [415/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:01:57.249 [416/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:57.249 [417/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:57.249 [418/707] Linking static target lib/librte_table.a 00:01:57.249 [419/707] Linking static target lib/librte_sched.a 00:01:57.249 [420/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:57.249 [421/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:57.249 [422/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:57.249 [423/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:57.249 [424/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:57.249 [425/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:57.249 [426/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:57.249 [427/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:57.510 [428/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.510 [429/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:57.510 [430/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:57.510 [431/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:57.510 [432/707] Linking static target lib/librte_cryptodev.a 00:01:57.510 [433/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:57.510 [434/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:57.510 [435/707] Compiling C object 
lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:57.510 [436/707] Linking static target drivers/librte_bus_pci.a 00:01:57.510 [437/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:57.510 [438/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:57.510 [439/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:57.510 [440/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:57.510 [441/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:57.510 [442/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:57.510 [443/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:57.510 [444/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:57.510 [445/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:57.510 [446/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:57.510 [447/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:57.510 [448/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.773 [449/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:57.773 [450/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.773 [451/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:57.773 [452/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:57.773 [453/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:57.773 [454/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:57.773 [455/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:57.773 [456/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:01:57.773 [457/707] Linking static target lib/librte_ipsec.a 00:01:57.773 [458/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:01:57.773 [459/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:01:57.773 [460/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:57.773 [461/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:57.773 [462/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:57.773 [463/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:01:57.773 [464/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:57.773 [465/707] Linking static target lib/librte_member.a 00:01:57.773 [466/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:57.773 [467/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:57.773 [468/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:57.773 [469/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:01:57.773 [470/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:57.773 [471/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:57.773 [472/707] Compiling C object 
app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:01:57.773 [473/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:01:57.773 [474/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:01:57.773 [475/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:57.773 [476/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:57.773 [477/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:57.773 [478/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:57.773 [479/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:57.773 [480/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.773 [481/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:58.033 [482/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:58.033 [483/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.033 [484/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:58.033 [485/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:58.033 [486/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:01:58.033 [487/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:01:58.033 [488/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:58.033 [489/707] Linking static target lib/librte_pdcp.a 00:01:58.033 [490/707] Linking static target lib/librte_node.a 00:01:58.033 [491/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:58.033 [492/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:58.033 [493/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:58.033 [494/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:01:58.033 [495/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:58.033 [496/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:58.033 [497/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:58.033 [498/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:58.033 [499/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:58.033 [500/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:58.033 [501/707] Linking static target lib/librte_hash.a 00:01:58.033 [502/707] Linking static target drivers/librte_mempool_ring.a 00:01:58.033 [503/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:58.033 [504/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:58.033 [505/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:58.033 [506/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:01:58.033 [507/707] Linking static target lib/librte_port.a 00:01:58.033 [508/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:58.033 [509/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.033 [510/707] Compiling C object 
app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:58.033 [511/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:58.033 [512/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.033 [513/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:58.033 [514/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:58.033 [515/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.034 [516/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:58.034 [517/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:58.294 [518/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:58.294 [519/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:01:58.294 [520/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:58.294 [521/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:58.294 [522/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:58.294 [523/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.294 [524/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:58.294 [525/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:58.294 [526/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:58.294 [527/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:01:58.294 [528/707] Linking static target lib/acl/libavx2_tmp.a 00:01:58.294 [529/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:58.294 [530/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:58.294 [531/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:58.294 [532/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:58.294 [533/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.294 [534/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:58.294 [535/707] Linking static target lib/librte_eventdev.a 00:01:58.294 [536/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:58.294 [537/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:58.294 [538/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.294 [539/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:58.294 [540/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:58.294 [541/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:58.294 [542/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:58.294 [543/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:58.294 [544/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:58.554 [545/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:58.554 [546/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:58.554 [547/707] Linking static target lib/librte_acl.a 00:01:58.554 [548/707] Compiling C object 
app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:58.554 [549/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:58.554 [550/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:58.554 [551/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:58.554 [552/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:58.554 [553/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:58.554 [554/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:58.555 [555/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:58.555 [556/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:58.555 [557/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:01:58.555 [558/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:58.555 [559/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:01:58.815 [560/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:58.815 [561/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:58.815 [562/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:58.815 [563/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:58.815 [564/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:58.815 [565/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:58.815 [566/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.815 [567/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.815 [568/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.815 [569/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:59.076 [570/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:59.076 [571/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:59.336 [572/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.336 [573/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:01:59.336 [574/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:59.336 [575/707] Linking static target lib/librte_ethdev.a 00:01:59.597 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:59.597 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:59.858 [578/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:59.858 [579/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:00.118 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:00.688 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:00.688 [582/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:00.688 [583/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:00.948 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:00.948 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:00.948 [586/707] Compiling C object 
drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:00.948 [587/707] Linking static target drivers/librte_net_i40e.a 00:02:01.207 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:01.777 [589/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.037 [590/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:02.037 [591/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.606 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:07.881 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.881 [594/707] Linking target lib/librte_eal.so.24.0 00:02:07.881 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:07.881 [596/707] Linking target lib/librte_timer.so.24.0 00:02:07.881 [597/707] Linking target lib/librte_jobstats.so.24.0 00:02:07.881 [598/707] Linking target lib/librte_rawdev.so.24.0 00:02:07.881 [599/707] Linking target lib/librte_cfgfile.so.24.0 00:02:07.881 [600/707] Linking target drivers/librte_bus_vdev.so.24.0 00:02:07.881 [601/707] Linking target lib/librte_ring.so.24.0 00:02:07.881 [602/707] Linking target lib/librte_meter.so.24.0 00:02:07.881 [603/707] Linking target lib/librte_pci.so.24.0 00:02:07.881 [604/707] Linking target lib/librte_dmadev.so.24.0 00:02:07.881 [605/707] Linking target lib/librte_stack.so.24.0 00:02:07.881 [606/707] Linking target lib/librte_acl.so.24.0 00:02:07.881 [607/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:07.881 [608/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:07.881 [609/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:07.881 [610/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:07.881 [611/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:07.881 [612/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:07.881 [613/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:07.881 [614/707] Linking target lib/librte_mempool.so.24.0 00:02:07.881 [615/707] Linking target lib/librte_rcu.so.24.0 00:02:07.881 [616/707] Linking target drivers/librte_bus_pci.so.24.0 00:02:07.881 [617/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:07.881 [618/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:07.881 [619/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.881 [620/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:07.881 [621/707] Linking target lib/librte_rib.so.24.0 00:02:07.881 [622/707] Linking target lib/librte_mbuf.so.24.0 00:02:07.881 [623/707] Linking target drivers/librte_mempool_ring.so.24.0 00:02:08.141 [624/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:08.141 [625/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:08.141 [626/707] Linking target lib/librte_gpudev.so.24.0 00:02:08.141 [627/707] Linking target lib/librte_distributor.so.24.0 00:02:08.141 [628/707] Linking target 
lib/librte_compressdev.so.24.0 00:02:08.141 [629/707] Linking target lib/librte_regexdev.so.24.0 00:02:08.141 [630/707] Linking target lib/librte_reorder.so.24.0 00:02:08.141 [631/707] Linking target lib/librte_bbdev.so.24.0 00:02:08.141 [632/707] Linking target lib/librte_mldev.so.24.0 00:02:08.141 [633/707] Linking target lib/librte_net.so.24.0 00:02:08.141 [634/707] Linking target lib/librte_sched.so.24.0 00:02:08.141 [635/707] Linking target lib/librte_cryptodev.so.24.0 00:02:08.141 [636/707] Linking target lib/librte_fib.so.24.0 00:02:08.141 [637/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:08.141 [638/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:08.141 [639/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:08.141 [640/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:08.141 [641/707] Linking target lib/librte_security.so.24.0 00:02:08.141 [642/707] Linking target lib/librte_hash.so.24.0 00:02:08.141 [643/707] Linking target lib/librte_cmdline.so.24.0 00:02:08.141 [644/707] Linking target lib/librte_ethdev.so.24.0 00:02:08.400 [645/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:08.400 [646/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:08.400 [647/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:08.400 [648/707] Linking target lib/librte_pdcp.so.24.0 00:02:08.400 [649/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:08.400 [650/707] Linking target lib/librte_ipsec.so.24.0 00:02:08.400 [651/707] Linking target lib/librte_efd.so.24.0 00:02:08.400 [652/707] Linking target lib/librte_member.so.24.0 00:02:08.400 [653/707] Linking target lib/librte_lpm.so.24.0 00:02:08.400 [654/707] Linking target lib/librte_metrics.so.24.0 00:02:08.400 [655/707] Linking static target lib/librte_pipeline.a 00:02:08.400 [656/707] Linking target lib/librte_pcapng.so.24.0 00:02:08.400 [657/707] Linking target lib/librte_gso.so.24.0 00:02:08.400 [658/707] Linking target lib/librte_bpf.so.24.0 00:02:08.400 [659/707] Linking target lib/librte_ip_frag.so.24.0 00:02:08.400 [660/707] Linking target lib/librte_gro.so.24.0 00:02:08.400 [661/707] Linking target lib/librte_power.so.24.0 00:02:08.400 [662/707] Linking target lib/librte_eventdev.so.24.0 00:02:08.400 [663/707] Linking target drivers/librte_net_i40e.so.24.0 00:02:08.659 [664/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:08.659 [665/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:08.659 [666/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:08.659 [667/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:08.659 [668/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:08.659 [669/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:08.659 [670/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:08.659 [671/707] Linking target lib/librte_bitratestats.so.24.0 00:02:08.659 [672/707] Linking target lib/librte_latencystats.so.24.0 00:02:08.659 [673/707] Linking target lib/librte_dispatcher.so.24.0 00:02:08.659 [674/707] Linking target 
lib/librte_pdump.so.24.0 00:02:08.659 [675/707] Linking target lib/librte_graph.so.24.0 00:02:08.659 [676/707] Linking target lib/librte_port.so.24.0 00:02:08.659 [677/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:02:08.919 [678/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:02:08.919 [679/707] Linking target lib/librte_node.so.24.0 00:02:08.919 [680/707] Linking target lib/librte_table.so.24.0 00:02:08.919 [681/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:08.919 [682/707] Linking static target lib/librte_vhost.a 00:02:08.919 [683/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:02:09.490 [684/707] Linking target app/dpdk-test-acl 00:02:09.490 [685/707] Linking target app/dpdk-test-dma-perf 00:02:09.490 [686/707] Linking target app/dpdk-test-pipeline 00:02:09.490 [687/707] Linking target app/dpdk-test-gpudev 00:02:09.490 [688/707] Linking target app/dpdk-test-cmdline 00:02:09.490 [689/707] Linking target app/dpdk-pdump 00:02:09.490 [690/707] Linking target app/dpdk-dumpcap 00:02:09.490 [691/707] Linking target app/dpdk-test-sad 00:02:09.490 [692/707] Linking target app/dpdk-test-flow-perf 00:02:09.490 [693/707] Linking target app/dpdk-test-mldev 00:02:09.490 [694/707] Linking target app/dpdk-graph 00:02:09.490 [695/707] Linking target app/dpdk-test-security-perf 00:02:09.490 [696/707] Linking target app/dpdk-proc-info 00:02:09.490 [697/707] Linking target app/dpdk-test-compress-perf 00:02:09.490 [698/707] Linking target app/dpdk-test-crypto-perf 00:02:09.490 [699/707] Linking target app/dpdk-test-fib 00:02:09.490 [700/707] Linking target app/dpdk-test-regex 00:02:09.490 [701/707] Linking target app/dpdk-test-eventdev 00:02:09.490 [702/707] Linking target app/dpdk-test-bbdev 00:02:09.490 [703/707] Linking target app/dpdk-testpmd 00:02:10.873 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.133 [705/707] Linking target lib/librte_vhost.so.24.0 00:02:14.421 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.421 [707/707] Linking target lib/librte_pipeline.so.24.0 00:02:14.421 10:49:12 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:14.421 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:14.421 [0/1] Installing files. 
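[Editor's note: the `ninja ... install` step just issued can equivalently be driven through meson itself, and the files listed below land under the configured prefix (example sources under share/dpdk/examples). A hedged sketch, assuming the same build and prefix directories as above; the PKG_CONFIG_PATH location is an assumption, since the pkgconfig subdirectory varies by platform:]

  # Equivalent install invocation (illustrative):
  meson install -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
  # Building one of the installed examples against the installed libraries;
  # adjust PKG_CONFIG_PATH to wherever libdpdk.pc was actually placed.
  export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
  make -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer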
00:02:14.684 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.684 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:14.685 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.685 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:14.686 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.686 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:14.687 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:14.688 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:14.688 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.688 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.689 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:02:14.690 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:02:14.690 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.690 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.691 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.953 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.953 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.953 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.953 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.953 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.953 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.954 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.955 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.956 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
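The stream of header installs above (and continuing below) stages DPDK's public API under dpdk/build/include; together with the libdpdk.pc pkg-config files installed a little further on, a consumer resolves include and link flags without hard-coding these paths. A minimal sketch, assuming the workspace layout shown in this log (app.c is a hypothetical consumer):

    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    # Flags exported by the staged build
    pkg-config --cflags libdpdk   # -I.../dpdk/build/include plus feature macros
    pkg-config --libs libdpdk     # -L.../dpdk/build/lib -lrte_ethdev -lrte_eal ...
    # Compile a consumer against the staged install
    cc app.c $(pkg-config --cflags --libs libdpdk) -o app

This is the same mechanism the SPDK configure step later in this log relies on when it reports picking up dpdk/build/lib/pkgconfig for additional libs.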
00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.956 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:14.957 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:14.957 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:14.957 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:14.957 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:14.957 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:14.957 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:14.957 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:14.957 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:14.957 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:14.957 Installing symlink pointing to librte_ring.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:14.957 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:14.957 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:14.957 Installing symlink pointing to librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:14.957 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:14.957 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:14.957 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:14.957 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:14.957 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:14.957 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:14.957 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:14.957 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:14.957 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:14.957 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:14.957 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:14.957 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:14.957 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:14.957 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:14.957 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:14.957 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:14.957 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:14.957 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:14.957 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:14.958 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:14.958 Installing symlink pointing to librte_acl.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:14.958 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:14.958 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:14.958 Installing symlink pointing to librte_bbdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:14.958 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:14.958 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:14.958 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:14.958 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:14.958 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:14.958 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:14.958 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:14.958 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:14.958 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:14.958 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:14.958 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:14.958 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:14.958 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:14.958 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:14.958 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:14.958 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:14.958 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:14.958 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:14.958 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:14.958 Installing symlink pointing to librte_dispatcher.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:14.958 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:14.958 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:14.958 Installing symlink pointing to librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:14.958 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:14.958 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:14.958 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:14.958 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:14.958 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:14.958 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:14.958 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:14.958 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:14.958 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:14.958 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:14.958 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:14.958 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:14.958 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:14.958 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:14.958 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:14.958 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:14.958 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:14.958 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:14.958 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:14.958 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:14.958 Installing symlink pointing to 
librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:14.958 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:14.958 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:14.958 Installing symlink pointing to librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:14.958 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:14.958 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:14.958 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:14.958 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:14.958 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:14.958 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:14.958 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:14.958 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:14.958 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:14.958 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:14.958 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:14.958 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:14.958 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:14.958 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:14.958 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:14.958 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:14.958 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:14.958 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:14.958 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:14.958 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:14.958 Installing symlink pointing to librte_pdump.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:14.958 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:02:14.958 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:14.958 Installing symlink pointing to librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:02:14.958 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:14.958 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:02:14.958 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:14.958 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:02:14.958 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:02:14.958 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:02:14.958 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:02:14.958 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:02:14.958 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:02:14.958 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:02:14.958 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:02:14.958 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:02:14.958 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:02:14.958 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:02:14.958 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:02:14.958 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:02:14.958 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:14.958 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:02:14.958 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:02:14.958 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:02:14.958 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:02:14.958 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:02:14.958 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:02:14.959 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:02:14.959 Installing symlink pointing to librte_net_i40e.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:02:14.959 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:02:14.959 10:49:13 -- common/autobuild_common.sh@192 -- $ uname -s 00:02:14.959 10:49:13 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:14.959 10:49:13 -- common/autobuild_common.sh@203 -- $ cat 00:02:14.959 10:49:13 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:14.959 00:02:14.959 real 0m27.749s 00:02:14.959 user 8m2.573s 00:02:14.959 sys 2m26.955s 00:02:14.959 10:49:13 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:14.959 10:49:13 -- common/autotest_common.sh@10 -- $ set +x 00:02:14.959 ************************************ 00:02:14.959 END TEST build_native_dpdk 00:02:14.959 ************************************ 00:02:15.218 10:49:13 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:15.218 10:49:13 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:15.218 10:49:13 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:15.218 10:49:13 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:15.218 10:49:13 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:15.218 10:49:13 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:02:15.218 10:49:13 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:02:15.218 10:49:13 -- common/autotest_common.sh@10 -- $ set +x 00:02:15.218 ************************************ 00:02:15.218 START TEST autobuild_llvm_precompile 00:02:15.218 ************************************ 00:02:15.218 10:49:13 -- common/autotest_common.sh@1114 -- $ _llvm_precompile 00:02:15.218 10:49:13 -- common/autobuild_common.sh@32 -- $ clang --version 00:02:15.218 10:49:13 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:15.218 Target: x86_64-redhat-linux-gnu 00:02:15.218 Thread model: posix 00:02:15.218 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:15.218 10:49:13 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:15.218 10:49:13 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:15.218 10:49:13 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:15.218 10:49:13 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:15.218 10:49:13 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:15.218 10:49:13 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:15.218 10:49:13 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:15.218 10:49:13 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:15.218 10:49:13 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:15.218 10:49:13 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure 
--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:15.477 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:15.477 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.477 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.736 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:15.995 Using 'verbs' RDMA provider 00:02:31.441 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:43.650 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:43.650 Creating mk/config.mk...done. 00:02:43.650 Creating mk/cc.flags.mk...done. 00:02:43.650 Type 'make' to build. 00:02:43.650 00:02:43.650 real 0m28.056s 00:02:43.650 user 0m12.313s 00:02:43.650 sys 0m15.028s 00:02:43.650 10:49:41 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:43.650 10:49:41 -- common/autotest_common.sh@10 -- $ set +x 00:02:43.650 ************************************ 00:02:43.650 END TEST autobuild_llvm_precompile 00:02:43.650 ************************************ 00:02:43.650 10:49:41 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:43.650 10:49:41 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:43.650 10:49:41 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:43.650 10:49:41 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:43.650 10:49:41 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:43.650 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:43.650 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:43.650 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:43.650 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:44.218 Using 'verbs' RDMA provider 00:02:56.995 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:03:09.200 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:03:09.200 Creating mk/config.mk...done. 00:03:09.200 Creating mk/cc.flags.mk...done. 00:03:09.200 Type 'make' to build. 
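Both configure invocations above pass an identical flag set; a sketch of reproducing that step outside Jenkins, with the workspace-specific DPDK path replaced by a placeholder (clang-17 and the libclang_rt.fuzzer_no_main.a path are taken verbatim from this log, where the script derives the "17" from its clang --version regex match shown earlier):

    DPDK_BUILD=/path/to/dpdk/build   # placeholder for the staged DPDK install
    FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
    CC=clang-17 CXX=clang++-17 ./configure \
        --enable-debug --enable-werror --with-rdma --with-idxd \
        --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests \
        --enable-ubsan --enable-coverage --with-ublk \
        --with-dpdk="$DPDK_BUILD" --with-vfio-user \
        --with-fuzzer="$FUZZER_LIB"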
00:03:09.200 10:50:06 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:03:09.200 10:50:06 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:03:09.200 10:50:06 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:03:09.200 10:50:06 -- common/autotest_common.sh@10 -- $ set +x 00:03:09.200 ************************************ 00:03:09.200 START TEST make 00:03:09.200 ************************************ 00:03:09.200 10:50:06 -- common/autotest_common.sh@1114 -- $ make -j112 00:03:09.200 make[1]: Nothing to be done for 'all'. 00:03:09.459 The Meson build system 00:03:09.459 Version: 1.5.0 00:03:09.459 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:09.459 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:09.459 Build type: native build 00:03:09.459 Project name: libvfio-user 00:03:09.459 Project version: 0.0.1 00:03:09.459 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:09.459 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:09.459 Host machine cpu family: x86_64 00:03:09.459 Host machine cpu: x86_64 00:03:09.459 Run-time dependency threads found: YES 00:03:09.459 Library dl found: YES 00:03:09.459 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:09.459 Run-time dependency json-c found: YES 0.17 00:03:09.459 Run-time dependency cmocka found: YES 1.1.7 00:03:09.459 Program pytest-3 found: NO 00:03:09.459 Program flake8 found: NO 00:03:09.459 Program misspell-fixer found: NO 00:03:09.459 Program restructuredtext-lint found: NO 00:03:09.459 Program valgrind found: YES (/usr/bin/valgrind) 00:03:09.459 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:09.459 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:09.459 Compiler for C supports arguments -Wwrite-strings: YES 00:03:09.459 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:09.459 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:09.459 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:09.459 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
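The two WARNING lines above are meson's version-compatibility check: libvfio-user declares a minimum of meson 0.53.0, but its test setup passes exclude_suites to add_test_setup, an argument added in 0.57.0; harmless here since the host runs meson 1.5.0. A sketch of the build sequence the job is driving, using the options visible in the summary that follows (build-debug is the directory name from this log, install-root a placeholder):

    meson setup build-debug --buildtype=debug -Ddefault_library=static
    ninja -C build-debug
    DESTDIR="$PWD/install-root" meson install --quiet -C build-debug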
00:03:09.459 Build targets in project: 8 00:03:09.459 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:09.459 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:09.459 00:03:09.459 libvfio-user 0.0.1 00:03:09.459 00:03:09.459 User defined options 00:03:09.459 buildtype : debug 00:03:09.459 default_library: static 00:03:09.459 libdir : /usr/local/lib 00:03:09.459 00:03:09.459 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:10.026 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:10.026 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:10.026 [2/36] Compiling C object samples/null.p/null.c.o 00:03:10.026 [3/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:10.026 [4/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:10.026 [5/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:10.026 [6/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:10.026 [7/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:10.026 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:10.026 [9/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:10.026 [10/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:10.026 [11/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:10.026 [12/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:10.026 [13/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:10.026 [14/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:10.026 [15/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:10.026 [16/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:10.026 [17/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:10.026 [18/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:10.026 [19/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:10.026 [20/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:10.026 [21/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:10.026 [22/36] Compiling C object samples/server.p/server.c.o 00:03:10.026 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:10.026 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:10.026 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:10.026 [26/36] Compiling C object samples/client.p/client.c.o 00:03:10.026 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:10.026 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:10.026 [29/36] Linking static target lib/libvfio-user.a 00:03:10.026 [30/36] Linking target samples/client 00:03:10.026 [31/36] Linking target samples/shadow_ioeventfd_server 00:03:10.026 [32/36] Linking target samples/lspci 00:03:10.026 [33/36] Linking target samples/server 00:03:10.026 [34/36] Linking target samples/gpio-pci-idio-16 00:03:10.026 [35/36] Linking target samples/null 00:03:10.026 [36/36] Linking target test/unit_tests 00:03:10.026 INFO: autodetecting backend as ninja 00:03:10.026 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:10.285 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:10.543 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:10.543 ninja: no work to do. 00:03:13.828 CC lib/ut/ut.o 00:03:13.828 CC lib/ut_mock/mock.o 00:03:13.828 CC lib/log/log_deprecated.o 00:03:13.828 CC lib/log/log.o 00:03:13.828 CC lib/log/log_flags.o 00:03:13.828 LIB libspdk_ut.a 00:03:13.828 LIB libspdk_ut_mock.a 00:03:13.828 LIB libspdk_log.a 00:03:14.395 CXX lib/trace_parser/trace.o 00:03:14.395 CC lib/util/base64.o 00:03:14.395 CC lib/util/cpuset.o 00:03:14.395 CC lib/util/bit_array.o 00:03:14.395 CC lib/util/crc32.o 00:03:14.395 CC lib/util/crc16.o 00:03:14.395 CC lib/util/crc64.o 00:03:14.395 CC lib/util/crc32c.o 00:03:14.395 CC lib/util/dif.o 00:03:14.395 CC lib/util/crc32_ieee.o 00:03:14.395 CC lib/ioat/ioat.o 00:03:14.395 CC lib/util/fd.o 00:03:14.395 CC lib/dma/dma.o 00:03:14.395 CC lib/util/file.o 00:03:14.395 CC lib/util/hexlify.o 00:03:14.395 CC lib/util/iov.o 00:03:14.395 CC lib/util/math.o 00:03:14.395 CC lib/util/pipe.o 00:03:14.395 CC lib/util/strerror_tls.o 00:03:14.395 CC lib/util/string.o 00:03:14.395 CC lib/util/uuid.o 00:03:14.395 CC lib/util/fd_group.o 00:03:14.395 CC lib/util/zipf.o 00:03:14.395 CC lib/util/xor.o 00:03:14.395 CC lib/vfio_user/host/vfio_user_pci.o 00:03:14.395 CC lib/vfio_user/host/vfio_user.o 00:03:14.395 LIB libspdk_dma.a 00:03:14.395 LIB libspdk_ioat.a 00:03:14.395 LIB libspdk_vfio_user.a 00:03:14.654 LIB libspdk_util.a 00:03:14.654 LIB libspdk_trace_parser.a 00:03:14.914 CC lib/idxd/idxd.o 00:03:14.914 CC lib/idxd/idxd_user.o 00:03:14.914 CC lib/rdma/common.o 00:03:14.914 CC lib/idxd/idxd_kernel.o 00:03:14.914 CC lib/rdma/rdma_verbs.o 00:03:14.914 CC lib/vmd/vmd.o 00:03:14.914 CC lib/vmd/led.o 00:03:14.914 CC lib/json/json_parse.o 00:03:14.914 CC lib/json/json_util.o 00:03:14.914 CC lib/json/json_write.o 00:03:14.914 CC lib/env_dpdk/memory.o 00:03:14.914 CC lib/conf/conf.o 00:03:14.914 CC lib/env_dpdk/env.o 00:03:14.914 CC lib/env_dpdk/init.o 00:03:14.914 CC lib/env_dpdk/pci.o 00:03:14.914 CC lib/env_dpdk/threads.o 00:03:14.914 CC lib/env_dpdk/pci_vmd.o 00:03:14.914 CC lib/env_dpdk/pci_ioat.o 00:03:14.914 CC lib/env_dpdk/pci_virtio.o 00:03:14.914 CC lib/env_dpdk/pci_event.o 00:03:14.914 CC lib/env_dpdk/pci_idxd.o 00:03:14.914 CC lib/env_dpdk/sigbus_handler.o 00:03:14.914 CC lib/env_dpdk/pci_dpdk.o 00:03:14.914 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:14.914 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:14.914 LIB libspdk_conf.a 00:03:14.914 LIB libspdk_rdma.a 00:03:14.914 LIB libspdk_json.a 00:03:15.173 LIB libspdk_idxd.a 00:03:15.173 LIB libspdk_vmd.a 00:03:15.434 CC lib/jsonrpc/jsonrpc_server.o 00:03:15.434 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:15.434 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:15.434 CC lib/jsonrpc/jsonrpc_client.o 00:03:15.434 LIB libspdk_jsonrpc.a 00:03:15.694 LIB libspdk_env_dpdk.a 00:03:15.694 CC lib/rpc/rpc.o 00:03:15.954 LIB libspdk_rpc.a 00:03:16.214 CC lib/sock/sock.o 00:03:16.214 CC lib/sock/sock_rpc.o 00:03:16.214 CC lib/notify/notify.o 00:03:16.214 CC lib/notify/notify_rpc.o 00:03:16.214 CC lib/trace/trace.o 00:03:16.214 CC lib/trace/trace_flags.o 00:03:16.214 CC lib/trace/trace_rpc.o 00:03:16.475 LIB libspdk_notify.a 00:03:16.475 LIB libspdk_trace.a 00:03:16.475 LIB libspdk_sock.a 00:03:16.735 CC lib/thread/thread.o 00:03:16.735 CC lib/thread/iobuf.o 00:03:16.735 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:16.735 CC lib/nvme/nvme_ctrlr.o 00:03:16.735 CC lib/nvme/nvme_ns.o 
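The libvfio-user configure/build/install sequence earlier in this stretch is Meson's standard out-of-tree flow, staged into the SPDK tree via DESTDIR rather than installed system-wide. A condensed sketch of the same steps, with the options copied from the "User defined options" block in the log (the exact invocation SPDK's build scripts use may differ):

    src=libvfio-user
    build=$src/build-debug
    # Configure: debug build, static library, custom libdir (as in the log).
    meson setup "$build" "$src" --buildtype=debug --default-library=static --libdir=/usr/local/lib
    ninja -C "$build"
    # Stage into the SPDK build tree instead of the real root filesystem.
    DESTDIR=$PWD/build/libvfio-user meson install --quiet -C "$build"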
00:03:16.735 CC lib/nvme/nvme_fabric.o 00:03:16.735 CC lib/nvme/nvme_ns_cmd.o 00:03:16.735 CC lib/nvme/nvme_pcie.o 00:03:16.735 CC lib/nvme/nvme_pcie_common.o 00:03:16.735 CC lib/nvme/nvme.o 00:03:16.735 CC lib/nvme/nvme_qpair.o 00:03:16.735 CC lib/nvme/nvme_quirks.o 00:03:16.735 CC lib/nvme/nvme_transport.o 00:03:16.735 CC lib/nvme/nvme_discovery.o 00:03:16.735 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:16.735 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:16.735 CC lib/nvme/nvme_tcp.o 00:03:16.735 CC lib/nvme/nvme_opal.o 00:03:16.735 CC lib/nvme/nvme_io_msg.o 00:03:16.735 CC lib/nvme/nvme_poll_group.o 00:03:16.735 CC lib/nvme/nvme_zns.o 00:03:16.735 CC lib/nvme/nvme_cuse.o 00:03:16.735 CC lib/nvme/nvme_vfio_user.o 00:03:16.735 CC lib/nvme/nvme_rdma.o 00:03:17.675 LIB libspdk_thread.a 00:03:17.675 CC lib/virtio/virtio_vhost_user.o 00:03:17.675 CC lib/virtio/virtio.o 00:03:17.675 CC lib/virtio/virtio_vfio_user.o 00:03:17.675 CC lib/virtio/virtio_pci.o 00:03:17.675 CC lib/blob/blob_bs_dev.o 00:03:17.675 CC lib/blob/zeroes.o 00:03:17.675 CC lib/blob/blobstore.o 00:03:17.675 CC lib/accel/accel.o 00:03:17.675 CC lib/blob/request.o 00:03:17.675 CC lib/accel/accel_rpc.o 00:03:17.675 CC lib/accel/accel_sw.o 00:03:17.675 CC lib/vfu_tgt/tgt_endpoint.o 00:03:17.675 CC lib/vfu_tgt/tgt_rpc.o 00:03:17.675 CC lib/init/json_config.o 00:03:17.675 CC lib/init/subsystem.o 00:03:17.675 CC lib/init/subsystem_rpc.o 00:03:17.675 CC lib/init/rpc.o 00:03:17.934 LIB libspdk_init.a 00:03:17.934 LIB libspdk_virtio.a 00:03:17.934 LIB libspdk_vfu_tgt.a 00:03:17.934 LIB libspdk_nvme.a 00:03:18.193 CC lib/event/app.o 00:03:18.193 CC lib/event/reactor.o 00:03:18.193 CC lib/event/scheduler_static.o 00:03:18.193 CC lib/event/log_rpc.o 00:03:18.193 CC lib/event/app_rpc.o 00:03:18.452 LIB libspdk_accel.a 00:03:18.452 LIB libspdk_event.a 00:03:18.711 CC lib/bdev/bdev.o 00:03:18.711 CC lib/bdev/part.o 00:03:18.711 CC lib/bdev/bdev_rpc.o 00:03:18.711 CC lib/bdev/bdev_zone.o 00:03:18.711 CC lib/bdev/scsi_nvme.o 00:03:19.280 LIB libspdk_blob.a 00:03:19.540 CC lib/blobfs/blobfs.o 00:03:19.540 CC lib/lvol/lvol.o 00:03:19.540 CC lib/blobfs/tree.o 00:03:20.108 LIB libspdk_lvol.a 00:03:20.108 LIB libspdk_blobfs.a 00:03:20.367 LIB libspdk_bdev.a 00:03:20.627 CC lib/ftl/ftl_core.o 00:03:20.627 CC lib/ftl/ftl_init.o 00:03:20.627 CC lib/ftl/ftl_layout.o 00:03:20.627 CC lib/ftl/ftl_debug.o 00:03:20.627 CC lib/ftl/ftl_io.o 00:03:20.627 CC lib/ftl/ftl_sb.o 00:03:20.627 CC lib/ftl/ftl_l2p.o 00:03:20.627 CC lib/ftl/ftl_l2p_flat.o 00:03:20.627 CC lib/ftl/ftl_nv_cache.o 00:03:20.627 CC lib/ftl/ftl_band.o 00:03:20.627 CC lib/ftl/ftl_band_ops.o 00:03:20.627 CC lib/ftl/ftl_writer.o 00:03:20.627 CC lib/ftl/ftl_rq.o 00:03:20.627 CC lib/ftl/ftl_reloc.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt.o 00:03:20.627 CC lib/ftl/ftl_l2p_cache.o 00:03:20.627 CC lib/ftl/ftl_p2l.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:20.627 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:20.627 CC lib/ftl/utils/ftl_conf.o 00:03:20.627 CC lib/ftl/utils/ftl_md.o 00:03:20.627 CC lib/ftl/utils/ftl_mempool.o 00:03:20.627 CC 
lib/ftl/utils/ftl_bitmap.o 00:03:20.627 CC lib/ftl/utils/ftl_property.o 00:03:20.627 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:20.627 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:20.627 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:20.627 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:20.627 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:20.627 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:20.627 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:20.627 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:20.627 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:20.627 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:20.627 CC lib/ftl/base/ftl_base_dev.o 00:03:20.627 CC lib/ftl/base/ftl_base_bdev.o 00:03:20.627 CC lib/ftl/ftl_trace.o 00:03:20.627 CC lib/scsi/port.o 00:03:20.627 CC lib/scsi/dev.o 00:03:20.627 CC lib/scsi/lun.o 00:03:20.627 CC lib/scsi/scsi_rpc.o 00:03:20.627 CC lib/scsi/scsi.o 00:03:20.627 CC lib/scsi/scsi_pr.o 00:03:20.627 CC lib/scsi/scsi_bdev.o 00:03:20.627 CC lib/scsi/task.o 00:03:20.627 CC lib/nvmf/ctrlr_discovery.o 00:03:20.627 CC lib/nvmf/ctrlr.o 00:03:20.627 CC lib/ublk/ublk.o 00:03:20.627 CC lib/ublk/ublk_rpc.o 00:03:20.627 CC lib/nvmf/subsystem.o 00:03:20.627 CC lib/nvmf/ctrlr_bdev.o 00:03:20.627 CC lib/nvmf/nvmf.o 00:03:20.627 CC lib/nvmf/nvmf_rpc.o 00:03:20.627 CC lib/nvmf/transport.o 00:03:20.627 CC lib/nbd/nbd.o 00:03:20.627 CC lib/nvmf/vfio_user.o 00:03:20.627 CC lib/nvmf/rdma.o 00:03:20.627 CC lib/nvmf/tcp.o 00:03:20.627 CC lib/nbd/nbd_rpc.o 00:03:20.885 LIB libspdk_nbd.a 00:03:21.145 LIB libspdk_ublk.a 00:03:21.145 LIB libspdk_scsi.a 00:03:21.145 LIB libspdk_ftl.a 00:03:21.405 CC lib/iscsi/conn.o 00:03:21.405 CC lib/iscsi/init_grp.o 00:03:21.405 CC lib/iscsi/iscsi.o 00:03:21.405 CC lib/iscsi/md5.o 00:03:21.405 CC lib/iscsi/param.o 00:03:21.405 CC lib/iscsi/portal_grp.o 00:03:21.405 CC lib/iscsi/tgt_node.o 00:03:21.405 CC lib/iscsi/iscsi_subsystem.o 00:03:21.405 CC lib/iscsi/iscsi_rpc.o 00:03:21.405 CC lib/iscsi/task.o 00:03:21.405 CC lib/vhost/vhost_scsi.o 00:03:21.405 CC lib/vhost/vhost.o 00:03:21.405 CC lib/vhost/vhost_rpc.o 00:03:21.405 CC lib/vhost/rte_vhost_user.o 00:03:21.405 CC lib/vhost/vhost_blk.o 00:03:21.666 LIB libspdk_nvmf.a 00:03:21.926 LIB libspdk_vhost.a 00:03:22.187 LIB libspdk_iscsi.a 00:03:22.446 CC module/env_dpdk/env_dpdk_rpc.o 00:03:22.446 CC module/vfu_device/vfu_virtio.o 00:03:22.447 CC module/vfu_device/vfu_virtio_scsi.o 00:03:22.447 CC module/vfu_device/vfu_virtio_blk.o 00:03:22.447 CC module/vfu_device/vfu_virtio_rpc.o 00:03:22.705 LIB libspdk_env_dpdk_rpc.a 00:03:22.705 CC module/sock/posix/posix.o 00:03:22.705 CC module/accel/iaa/accel_iaa_rpc.o 00:03:22.705 CC module/accel/iaa/accel_iaa.o 00:03:22.705 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:22.705 CC module/scheduler/gscheduler/gscheduler.o 00:03:22.705 CC module/accel/dsa/accel_dsa.o 00:03:22.705 CC module/accel/dsa/accel_dsa_rpc.o 00:03:22.705 CC module/accel/ioat/accel_ioat_rpc.o 00:03:22.705 CC module/accel/ioat/accel_ioat.o 00:03:22.705 CC module/accel/error/accel_error.o 00:03:22.705 CC module/accel/error/accel_error_rpc.o 00:03:22.705 CC module/blob/bdev/blob_bdev.o 00:03:22.705 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:22.705 LIB libspdk_scheduler_dpdk_governor.a 00:03:22.705 LIB libspdk_scheduler_gscheduler.a 00:03:22.705 LIB libspdk_scheduler_dynamic.a 00:03:22.705 LIB libspdk_accel_ioat.a 00:03:22.705 LIB libspdk_accel_error.a 00:03:22.962 LIB libspdk_accel_iaa.a 00:03:22.962 LIB libspdk_accel_dsa.a 00:03:22.962 LIB libspdk_blob_bdev.a 00:03:22.962 LIB libspdk_vfu_device.a 00:03:22.962 LIB 
libspdk_sock_posix.a 00:03:23.221 CC module/bdev/raid/bdev_raid.o 00:03:23.221 CC module/bdev/raid/bdev_raid_rpc.o 00:03:23.221 CC module/bdev/raid/bdev_raid_sb.o 00:03:23.221 CC module/bdev/raid/raid0.o 00:03:23.221 CC module/bdev/raid/raid1.o 00:03:23.221 CC module/bdev/raid/concat.o 00:03:23.221 CC module/bdev/iscsi/bdev_iscsi.o 00:03:23.221 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:23.221 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:23.221 CC module/bdev/split/vbdev_split.o 00:03:23.221 CC module/bdev/lvol/vbdev_lvol.o 00:03:23.221 CC module/bdev/split/vbdev_split_rpc.o 00:03:23.221 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:23.221 CC module/bdev/nvme/bdev_nvme.o 00:03:23.221 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:23.221 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:23.221 CC module/bdev/nvme/vbdev_opal.o 00:03:23.221 CC module/bdev/null/bdev_null.o 00:03:23.221 CC module/bdev/nvme/nvme_rpc.o 00:03:23.221 CC module/bdev/nvme/bdev_mdns_client.o 00:03:23.221 CC module/bdev/ftl/bdev_ftl.o 00:03:23.221 CC module/bdev/null/bdev_null_rpc.o 00:03:23.221 CC module/bdev/gpt/gpt.o 00:03:23.221 CC module/bdev/error/vbdev_error.o 00:03:23.221 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:23.221 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:23.221 CC module/blobfs/bdev/blobfs_bdev.o 00:03:23.221 CC module/bdev/error/vbdev_error_rpc.o 00:03:23.221 CC module/bdev/gpt/vbdev_gpt.o 00:03:23.221 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:23.221 CC module/bdev/passthru/vbdev_passthru.o 00:03:23.221 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:23.221 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:23.221 CC module/bdev/malloc/bdev_malloc.o 00:03:23.221 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:23.221 CC module/bdev/delay/vbdev_delay.o 00:03:23.221 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:23.221 CC module/bdev/aio/bdev_aio.o 00:03:23.221 CC module/bdev/aio/bdev_aio_rpc.o 00:03:23.221 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:23.221 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:23.221 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:23.481 LIB libspdk_blobfs_bdev.a 00:03:23.481 LIB libspdk_bdev_split.a 00:03:23.481 LIB libspdk_bdev_null.a 00:03:23.481 LIB libspdk_bdev_error.a 00:03:23.481 LIB libspdk_bdev_gpt.a 00:03:23.481 LIB libspdk_bdev_ftl.a 00:03:23.481 LIB libspdk_bdev_passthru.a 00:03:23.481 LIB libspdk_bdev_aio.a 00:03:23.481 LIB libspdk_bdev_zone_block.a 00:03:23.481 LIB libspdk_bdev_iscsi.a 00:03:23.481 LIB libspdk_bdev_malloc.a 00:03:23.481 LIB libspdk_bdev_delay.a 00:03:23.741 LIB libspdk_bdev_lvol.a 00:03:23.741 LIB libspdk_bdev_virtio.a 00:03:23.741 LIB libspdk_bdev_raid.a 00:03:24.682 LIB libspdk_bdev_nvme.a 00:03:24.941 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:24.941 CC module/event/subsystems/vmd/vmd.o 00:03:24.941 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:24.941 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:24.941 CC module/event/subsystems/iobuf/iobuf.o 00:03:25.201 CC module/event/subsystems/sock/sock.o 00:03:25.201 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:25.201 CC module/event/subsystems/scheduler/scheduler.o 00:03:25.201 LIB libspdk_event_vfu_tgt.a 00:03:25.201 LIB libspdk_event_vmd.a 00:03:25.201 LIB libspdk_event_sock.a 00:03:25.201 LIB libspdk_event_iobuf.a 00:03:25.201 LIB libspdk_event_vhost_blk.a 00:03:25.201 LIB libspdk_event_scheduler.a 00:03:25.460 CC module/event/subsystems/accel/accel.o 00:03:25.720 LIB libspdk_event_accel.a 00:03:25.979 CC module/event/subsystems/bdev/bdev.o 00:03:25.979 LIB 
libspdk_event_bdev.a 00:03:26.239 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:26.239 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:26.239 CC module/event/subsystems/ublk/ublk.o 00:03:26.239 CC module/event/subsystems/scsi/scsi.o 00:03:26.239 CC module/event/subsystems/nbd/nbd.o 00:03:26.498 LIB libspdk_event_ublk.a 00:03:26.498 LIB libspdk_event_nbd.a 00:03:26.498 LIB libspdk_event_scsi.a 00:03:26.498 LIB libspdk_event_nvmf.a 00:03:26.756 CC module/event/subsystems/iscsi/iscsi.o 00:03:26.756 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:27.015 LIB libspdk_event_vhost_scsi.a 00:03:27.015 LIB libspdk_event_iscsi.a 00:03:27.279 TEST_HEADER include/spdk/accel.h 00:03:27.279 TEST_HEADER include/spdk/assert.h 00:03:27.279 TEST_HEADER include/spdk/accel_module.h 00:03:27.279 TEST_HEADER include/spdk/base64.h 00:03:27.279 CC test/rpc_client/rpc_client_test.o 00:03:27.279 TEST_HEADER include/spdk/barrier.h 00:03:27.279 TEST_HEADER include/spdk/bdev_module.h 00:03:27.279 TEST_HEADER include/spdk/bit_array.h 00:03:27.279 TEST_HEADER include/spdk/bdev_zone.h 00:03:27.279 TEST_HEADER include/spdk/bdev.h 00:03:27.279 TEST_HEADER include/spdk/bit_pool.h 00:03:27.279 TEST_HEADER include/spdk/blob_bdev.h 00:03:27.279 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:27.279 TEST_HEADER include/spdk/blob.h 00:03:27.279 TEST_HEADER include/spdk/blobfs.h 00:03:27.279 TEST_HEADER include/spdk/conf.h 00:03:27.279 TEST_HEADER include/spdk/config.h 00:03:27.279 TEST_HEADER include/spdk/cpuset.h 00:03:27.279 TEST_HEADER include/spdk/crc16.h 00:03:27.279 TEST_HEADER include/spdk/crc32.h 00:03:27.279 TEST_HEADER include/spdk/crc64.h 00:03:27.279 TEST_HEADER include/spdk/dif.h 00:03:27.279 TEST_HEADER include/spdk/endian.h 00:03:27.279 TEST_HEADER include/spdk/dma.h 00:03:27.279 TEST_HEADER include/spdk/env_dpdk.h 00:03:27.279 TEST_HEADER include/spdk/env.h 00:03:27.279 TEST_HEADER include/spdk/event.h 00:03:27.279 TEST_HEADER include/spdk/fd_group.h 00:03:27.279 TEST_HEADER include/spdk/fd.h 00:03:27.279 TEST_HEADER include/spdk/file.h 00:03:27.279 CC app/trace_record/trace_record.o 00:03:27.279 TEST_HEADER include/spdk/ftl.h 00:03:27.279 TEST_HEADER include/spdk/gpt_spec.h 00:03:27.279 TEST_HEADER include/spdk/hexlify.h 00:03:27.279 TEST_HEADER include/spdk/histogram_data.h 00:03:27.279 CC app/spdk_nvme_discover/discovery_aer.o 00:03:27.279 CXX app/trace/trace.o 00:03:27.279 TEST_HEADER include/spdk/idxd.h 00:03:27.279 TEST_HEADER include/spdk/idxd_spec.h 00:03:27.279 TEST_HEADER include/spdk/init.h 00:03:27.279 TEST_HEADER include/spdk/ioat.h 00:03:27.279 CC app/spdk_nvme_identify/identify.o 00:03:27.279 TEST_HEADER include/spdk/ioat_spec.h 00:03:27.279 TEST_HEADER include/spdk/iscsi_spec.h 00:03:27.279 TEST_HEADER include/spdk/json.h 00:03:27.279 TEST_HEADER include/spdk/likely.h 00:03:27.279 TEST_HEADER include/spdk/jsonrpc.h 00:03:27.279 TEST_HEADER include/spdk/log.h 00:03:27.279 TEST_HEADER include/spdk/lvol.h 00:03:27.279 TEST_HEADER include/spdk/memory.h 00:03:27.279 TEST_HEADER include/spdk/mmio.h 00:03:27.279 CC app/spdk_top/spdk_top.o 00:03:27.279 TEST_HEADER include/spdk/nbd.h 00:03:27.279 TEST_HEADER include/spdk/notify.h 00:03:27.279 TEST_HEADER include/spdk/nvme_intel.h 00:03:27.279 TEST_HEADER include/spdk/nvme.h 00:03:27.279 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:27.279 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:27.279 TEST_HEADER include/spdk/nvme_zns.h 00:03:27.279 TEST_HEADER include/spdk/nvme_spec.h 00:03:27.279 CC app/spdk_lspci/spdk_lspci.o 00:03:27.279 TEST_HEADER 
include/spdk/nvmf_cmd.h 00:03:27.279 TEST_HEADER include/spdk/nvmf.h 00:03:27.279 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:27.279 TEST_HEADER include/spdk/nvmf_transport.h 00:03:27.279 TEST_HEADER include/spdk/nvmf_spec.h 00:03:27.279 TEST_HEADER include/spdk/opal.h 00:03:27.279 TEST_HEADER include/spdk/pci_ids.h 00:03:27.279 TEST_HEADER include/spdk/opal_spec.h 00:03:27.279 TEST_HEADER include/spdk/pipe.h 00:03:27.279 CC app/spdk_nvme_perf/perf.o 00:03:27.279 TEST_HEADER include/spdk/queue.h 00:03:27.279 TEST_HEADER include/spdk/reduce.h 00:03:27.279 TEST_HEADER include/spdk/rpc.h 00:03:27.279 TEST_HEADER include/spdk/scheduler.h 00:03:27.279 TEST_HEADER include/spdk/scsi.h 00:03:27.279 TEST_HEADER include/spdk/scsi_spec.h 00:03:27.279 TEST_HEADER include/spdk/sock.h 00:03:27.279 TEST_HEADER include/spdk/stdinc.h 00:03:27.279 TEST_HEADER include/spdk/string.h 00:03:27.279 TEST_HEADER include/spdk/thread.h 00:03:27.279 TEST_HEADER include/spdk/trace.h 00:03:27.279 TEST_HEADER include/spdk/tree.h 00:03:27.279 TEST_HEADER include/spdk/trace_parser.h 00:03:27.279 TEST_HEADER include/spdk/ublk.h 00:03:27.279 TEST_HEADER include/spdk/util.h 00:03:27.279 TEST_HEADER include/spdk/uuid.h 00:03:27.279 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:27.279 TEST_HEADER include/spdk/version.h 00:03:27.279 TEST_HEADER include/spdk/vhost.h 00:03:27.279 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:27.279 TEST_HEADER include/spdk/vmd.h 00:03:27.279 TEST_HEADER include/spdk/zipf.h 00:03:27.279 TEST_HEADER include/spdk/xor.h 00:03:27.279 CXX test/cpp_headers/accel.o 00:03:27.279 CXX test/cpp_headers/barrier.o 00:03:27.279 CXX test/cpp_headers/accel_module.o 00:03:27.279 CXX test/cpp_headers/assert.o 00:03:27.279 CXX test/cpp_headers/base64.o 00:03:27.279 CXX test/cpp_headers/bdev.o 00:03:27.279 CXX test/cpp_headers/bdev_module.o 00:03:27.279 CXX test/cpp_headers/bdev_zone.o 00:03:27.279 CXX test/cpp_headers/bit_array.o 00:03:27.279 CXX test/cpp_headers/blob_bdev.o 00:03:27.279 CXX test/cpp_headers/bit_pool.o 00:03:27.279 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:27.279 CXX test/cpp_headers/blobfs.o 00:03:27.279 CXX test/cpp_headers/blobfs_bdev.o 00:03:27.279 CXX test/cpp_headers/conf.o 00:03:27.279 CXX test/cpp_headers/blob.o 00:03:27.279 CXX test/cpp_headers/cpuset.o 00:03:27.279 CXX test/cpp_headers/config.o 00:03:27.279 CXX test/cpp_headers/crc16.o 00:03:27.279 CXX test/cpp_headers/crc32.o 00:03:27.279 CXX test/cpp_headers/crc64.o 00:03:27.279 CXX test/cpp_headers/dif.o 00:03:27.279 CXX test/cpp_headers/dma.o 00:03:27.279 CXX test/cpp_headers/endian.o 00:03:27.279 CXX test/cpp_headers/env_dpdk.o 00:03:27.279 CXX test/cpp_headers/env.o 00:03:27.279 CC app/spdk_dd/spdk_dd.o 00:03:27.279 CXX test/cpp_headers/event.o 00:03:27.279 CC app/iscsi_tgt/iscsi_tgt.o 00:03:27.279 CXX test/cpp_headers/fd_group.o 00:03:27.279 CXX test/cpp_headers/fd.o 00:03:27.279 CXX test/cpp_headers/file.o 00:03:27.279 CXX test/cpp_headers/ftl.o 00:03:27.279 CXX test/cpp_headers/gpt_spec.o 00:03:27.279 CC app/nvmf_tgt/nvmf_main.o 00:03:27.279 CXX test/cpp_headers/hexlify.o 00:03:27.279 CXX test/cpp_headers/histogram_data.o 00:03:27.279 CC app/vhost/vhost.o 00:03:27.279 CXX test/cpp_headers/idxd_spec.o 00:03:27.279 CXX test/cpp_headers/idxd.o 00:03:27.279 CXX test/cpp_headers/init.o 00:03:27.279 CC test/nvme/aer/aer.o 00:03:27.279 CC test/nvme/startup/startup.o 00:03:27.279 CC test/nvme/sgl/sgl.o 00:03:27.279 CC test/app/stub/stub.o 00:03:27.279 CC test/nvme/reset/reset.o 00:03:27.279 CC 
test/nvme/doorbell_aers/doorbell_aers.o 00:03:27.279 CC test/nvme/e2edp/nvme_dp.o 00:03:27.279 CC test/app/jsoncat/jsoncat.o 00:03:27.279 CC test/nvme/boot_partition/boot_partition.o 00:03:27.279 CC test/nvme/cuse/cuse.o 00:03:27.279 CC test/nvme/simple_copy/simple_copy.o 00:03:27.279 CC test/nvme/fdp/fdp.o 00:03:27.279 CC test/nvme/fused_ordering/fused_ordering.o 00:03:27.279 CC test/thread/poller_perf/poller_perf.o 00:03:27.279 CC test/app/histogram_perf/histogram_perf.o 00:03:27.279 CC test/thread/lock/spdk_lock.o 00:03:27.279 CC test/event/event_perf/event_perf.o 00:03:27.279 CXX test/cpp_headers/ioat.o 00:03:27.279 CC test/nvme/overhead/overhead.o 00:03:27.279 CC test/nvme/compliance/nvme_compliance.o 00:03:27.279 CC test/nvme/err_injection/err_injection.o 00:03:27.279 CC app/spdk_tgt/spdk_tgt.o 00:03:27.279 CC test/nvme/connect_stress/connect_stress.o 00:03:27.279 CC test/env/pci/pci_ut.o 00:03:27.279 CC test/nvme/reserve/reserve.o 00:03:27.279 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:27.279 CC examples/ioat/verify/verify.o 00:03:27.279 CC test/env/vtophys/vtophys.o 00:03:27.279 CC test/event/app_repeat/app_repeat.o 00:03:27.279 CC examples/vmd/led/led.o 00:03:27.279 CC test/env/memory/memory_ut.o 00:03:27.279 CC test/event/reactor/reactor.o 00:03:27.279 CC examples/util/zipf/zipf.o 00:03:27.279 CC test/event/reactor_perf/reactor_perf.o 00:03:27.279 CC examples/vmd/lsvmd/lsvmd.o 00:03:27.279 CC examples/accel/perf/accel_perf.o 00:03:27.279 CC examples/ioat/perf/perf.o 00:03:27.279 CC examples/idxd/perf/perf.o 00:03:27.545 CC app/fio/nvme/fio_plugin.o 00:03:27.545 CC examples/nvme/hello_world/hello_world.o 00:03:27.545 CC test/dma/test_dma/test_dma.o 00:03:27.545 CC test/accel/dif/dif.o 00:03:27.545 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:27.545 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:27.545 CC examples/nvme/abort/abort.o 00:03:27.545 CC examples/nvme/reconnect/reconnect.o 00:03:27.545 CC examples/nvme/arbitration/arbitration.o 00:03:27.545 CC examples/sock/hello_world/hello_sock.o 00:03:27.545 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:27.545 CC examples/nvme/hotplug/hotplug.o 00:03:27.545 CC test/app/bdev_svc/bdev_svc.o 00:03:27.545 CC test/blobfs/mkfs/mkfs.o 00:03:27.545 CC test/bdev/bdevio/bdevio.o 00:03:27.545 CC test/event/scheduler/scheduler.o 00:03:27.545 CC examples/nvmf/nvmf/nvmf.o 00:03:27.545 CC app/fio/bdev/fio_plugin.o 00:03:27.545 CC examples/thread/thread/thread_ex.o 00:03:27.545 CC examples/blob/cli/blobcli.o 00:03:27.545 LINK spdk_lspci 00:03:27.545 CC examples/bdev/hello_world/hello_bdev.o 00:03:27.545 CC examples/blob/hello_world/hello_blob.o 00:03:27.545 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:27.545 CC test/env/mem_callbacks/mem_callbacks.o 00:03:27.545 LINK rpc_client_test 00:03:27.545 CC examples/bdev/bdevperf/bdevperf.o 00:03:27.545 CC test/lvol/esnap/esnap.o 00:03:27.545 LINK spdk_nvme_discover 00:03:27.545 CXX test/cpp_headers/ioat_spec.o 00:03:27.545 CXX test/cpp_headers/iscsi_spec.o 00:03:27.545 CXX test/cpp_headers/json.o 00:03:27.545 CXX test/cpp_headers/jsonrpc.o 00:03:27.545 CXX test/cpp_headers/likely.o 00:03:27.545 CXX test/cpp_headers/log.o 00:03:27.545 CXX test/cpp_headers/lvol.o 00:03:27.545 LINK jsoncat 00:03:27.545 CXX test/cpp_headers/memory.o 00:03:27.545 CXX test/cpp_headers/mmio.o 00:03:27.545 CXX test/cpp_headers/nbd.o 00:03:27.545 CXX test/cpp_headers/notify.o 00:03:27.545 CXX test/cpp_headers/nvme.o 00:03:27.545 CXX test/cpp_headers/nvme_intel.o 00:03:27.545 CXX test/cpp_headers/nvme_ocssd.o 
00:03:27.545 LINK spdk_trace_record 00:03:27.545 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:27.545 CXX test/cpp_headers/nvme_spec.o 00:03:27.545 CXX test/cpp_headers/nvme_zns.o 00:03:27.545 CXX test/cpp_headers/nvmf_cmd.o 00:03:27.545 LINK interrupt_tgt 00:03:27.545 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:27.545 CXX test/cpp_headers/nvmf.o 00:03:27.545 CXX test/cpp_headers/nvmf_spec.o 00:03:27.545 CXX test/cpp_headers/nvmf_transport.o 00:03:27.545 CXX test/cpp_headers/opal.o 00:03:27.545 LINK histogram_perf 00:03:27.545 LINK event_perf 00:03:27.545 CXX test/cpp_headers/opal_spec.o 00:03:27.545 CXX test/cpp_headers/pci_ids.o 00:03:27.545 LINK lsvmd 00:03:27.545 CXX test/cpp_headers/pipe.o 00:03:27.545 CXX test/cpp_headers/queue.o 00:03:27.545 CXX test/cpp_headers/reduce.o 00:03:27.545 LINK poller_perf 00:03:27.545 LINK reactor 00:03:27.545 LINK vtophys 00:03:27.545 CXX test/cpp_headers/rpc.o 00:03:27.545 LINK led 00:03:27.545 LINK reactor_perf 00:03:27.545 LINK nvmf_tgt 00:03:27.545 CXX test/cpp_headers/scheduler.o 00:03:27.545 LINK startup 00:03:27.545 CXX test/cpp_headers/scsi.o 00:03:27.545 CXX test/cpp_headers/scsi_spec.o 00:03:27.545 LINK env_dpdk_post_init 00:03:27.545 LINK app_repeat 00:03:27.545 LINK zipf 00:03:27.545 LINK vhost 00:03:27.545 CXX test/cpp_headers/sock.o 00:03:27.545 LINK doorbell_aers 00:03:27.545 LINK stub 00:03:27.545 LINK err_injection 00:03:27.545 LINK connect_stress 00:03:27.545 LINK iscsi_tgt 00:03:27.545 LINK boot_partition 00:03:27.545 LINK fused_ordering 00:03:27.545 CXX test/cpp_headers/stdinc.o 00:03:27.545 LINK reserve 00:03:27.545 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:27.545 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:27.545 LINK verify 00:03:27.545 LINK pmr_persistence 00:03:27.545 LINK spdk_tgt 00:03:27.545 LINK cmb_copy 00:03:27.545 LINK bdev_svc 00:03:27.545 LINK ioat_perf 00:03:27.809 LINK simple_copy 00:03:27.809 LINK reset 00:03:27.809 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:27.809 LINK hello_world 00:03:27.809 LINK mkfs 00:03:27.809 LINK aer 00:03:27.809 LINK sgl 00:03:27.809 LINK nvme_dp 00:03:27.809 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:27.809 LINK fdp 00:03:27.809 LINK hotplug 00:03:27.809 LINK hello_sock 00:03:27.809 LINK overhead 00:03:27.809 CXX test/cpp_headers/string.o 00:03:27.809 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:27.809 LINK scheduler 00:03:27.809 CXX test/cpp_headers/thread.o 00:03:27.809 CXX test/cpp_headers/trace.o 00:03:27.809 CXX test/cpp_headers/trace_parser.o 00:03:27.809 CXX test/cpp_headers/tree.o 00:03:27.809 CXX test/cpp_headers/ublk.o 00:03:27.809 CXX test/cpp_headers/util.o 00:03:27.809 CXX test/cpp_headers/uuid.o 00:03:27.809 LINK hello_bdev 00:03:27.809 CXX test/cpp_headers/version.o 00:03:27.809 CXX test/cpp_headers/vfio_user_pci.o 00:03:27.809 LINK thread 00:03:27.809 LINK spdk_trace 00:03:27.809 CXX test/cpp_headers/vfio_user_spec.o 00:03:27.809 LINK hello_blob 00:03:27.809 CXX test/cpp_headers/vhost.o 00:03:27.809 CXX test/cpp_headers/vmd.o 00:03:27.809 CXX test/cpp_headers/xor.o 00:03:27.809 CXX test/cpp_headers/zipf.o 00:03:27.809 LINK idxd_perf 00:03:27.809 LINK nvmf 00:03:27.809 LINK test_dma 00:03:27.809 LINK abort 00:03:27.809 LINK dif 00:03:27.809 LINK reconnect 00:03:27.809 LINK bdevio 00:03:27.809 LINK spdk_dd 00:03:27.809 LINK arbitration 00:03:28.070 LINK pci_ut 00:03:28.070 LINK nvme_compliance 00:03:28.070 LINK nvme_manage 00:03:28.070 LINK accel_perf 00:03:28.070 LINK nvme_fuzz 00:03:28.070 LINK blobcli 00:03:28.070 LINK spdk_bdev 
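The long CXX test/cpp_headers/*.o run that dominates this stretch compiles one tiny translation unit per public SPDK header; the point is to prove that every include/spdk/*.h is self-contained and compiles cleanly as C++. A generic sketch of that pattern, run from the repo root (this is the general technique, not SPDK's actual build rules; the compiler name and include path are assumptions):

    # Feed "#include <spdk/NAME.h>" to the compiler, one header at a
    # time, and syntax-check it as C++ with no other includes present.
    for h in include/spdk/*.h; do
        printf '#include <spdk/%s>\n' "$(basename "$h")" |
            c++ -I include -x c++ -fsyntax-only - ||
            echo "not self-contained: $h"
    done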
00:03:28.070 LINK spdk_nvme 00:03:28.070 LINK llvm_vfio_fuzz 00:03:28.071 LINK vhost_fuzz 00:03:28.071 LINK mem_callbacks 00:03:28.071 LINK spdk_nvme_identify 00:03:28.330 LINK spdk_nvme_perf 00:03:28.330 LINK llvm_nvme_fuzz 00:03:28.330 LINK memory_ut 00:03:28.330 LINK bdevperf 00:03:28.589 LINK spdk_top 00:03:28.589 LINK cuse 00:03:29.157 LINK spdk_lock 00:03:29.157 LINK iscsi_fuzz 00:03:31.065 LINK esnap 00:03:31.325 00:03:31.325 real 0m23.704s 00:03:31.325 user 4m15.547s 00:03:31.325 sys 2m3.070s 00:03:31.325 10:50:29 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:03:31.325 10:50:29 -- common/autotest_common.sh@10 -- $ set +x 00:03:31.325 ************************************ 00:03:31.325 END TEST make 00:03:31.325 ************************************ 00:03:31.325 10:50:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:31.325 10:50:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:31.325 10:50:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:31.585 10:50:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:31.585 10:50:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:31.585 10:50:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:31.585 10:50:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:31.585 10:50:29 -- scripts/common.sh@335 -- # IFS=.-: 00:03:31.585 10:50:29 -- scripts/common.sh@335 -- # read -ra ver1 00:03:31.585 10:50:29 -- scripts/common.sh@336 -- # IFS=.-: 00:03:31.585 10:50:29 -- scripts/common.sh@336 -- # read -ra ver2 00:03:31.585 10:50:29 -- scripts/common.sh@337 -- # local 'op=<' 00:03:31.585 10:50:29 -- scripts/common.sh@339 -- # ver1_l=2 00:03:31.585 10:50:29 -- scripts/common.sh@340 -- # ver2_l=1 00:03:31.585 10:50:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:31.585 10:50:29 -- scripts/common.sh@343 -- # case "$op" in 00:03:31.585 10:50:29 -- scripts/common.sh@344 -- # : 1 00:03:31.585 10:50:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:31.585 10:50:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:31.585 10:50:29 -- scripts/common.sh@364 -- # decimal 1 00:03:31.585 10:50:29 -- scripts/common.sh@352 -- # local d=1 00:03:31.585 10:50:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:31.585 10:50:29 -- scripts/common.sh@354 -- # echo 1 00:03:31.585 10:50:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:31.585 10:50:29 -- scripts/common.sh@365 -- # decimal 2 00:03:31.585 10:50:29 -- scripts/common.sh@352 -- # local d=2 00:03:31.585 10:50:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:31.585 10:50:29 -- scripts/common.sh@354 -- # echo 2 00:03:31.585 10:50:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:31.585 10:50:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:31.585 10:50:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:31.585 10:50:29 -- scripts/common.sh@367 -- # return 0 00:03:31.585 10:50:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:31.585 10:50:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:31.585 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:31.585 --rc genhtml_branch_coverage=1 00:03:31.585 --rc genhtml_function_coverage=1 00:03:31.585 --rc genhtml_legend=1 00:03:31.585 --rc geninfo_all_blocks=1 00:03:31.585 --rc geninfo_unexecuted_blocks=1 00:03:31.585 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:31.585 ' 00:03:31.585 10:50:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:31.585 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:31.585 --rc genhtml_branch_coverage=1 00:03:31.585 --rc genhtml_function_coverage=1 00:03:31.585 --rc genhtml_legend=1 00:03:31.585 --rc geninfo_all_blocks=1 00:03:31.585 --rc geninfo_unexecuted_blocks=1 00:03:31.585 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:31.585 ' 00:03:31.585 10:50:29 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:31.585 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:31.585 --rc genhtml_branch_coverage=1 00:03:31.585 --rc genhtml_function_coverage=1 00:03:31.585 --rc genhtml_legend=1 00:03:31.585 --rc geninfo_all_blocks=1 00:03:31.585 --rc geninfo_unexecuted_blocks=1 00:03:31.585 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:31.585 ' 00:03:31.585 10:50:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:31.585 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:31.585 --rc genhtml_branch_coverage=1 00:03:31.585 --rc genhtml_function_coverage=1 00:03:31.585 --rc genhtml_legend=1 00:03:31.585 --rc geninfo_all_blocks=1 00:03:31.585 --rc geninfo_unexecuted_blocks=1 00:03:31.585 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:31.585 ' 00:03:31.585 10:50:29 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:31.585 10:50:29 -- nvmf/common.sh@7 -- # uname -s 00:03:31.585 10:50:29 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:31.585 10:50:29 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:31.585 10:50:29 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:31.585 10:50:29 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:31.585 10:50:29 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:31.585 10:50:29 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:31.585 10:50:29 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:31.585 10:50:29 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:31.585 10:50:29 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:31.585 10:50:29 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:31.585 10:50:29 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:31.585 10:50:29 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:31.585 10:50:29 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:31.585 10:50:29 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:31.585 10:50:30 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:31.585 10:50:30 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:31.585 10:50:30 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:31.585 10:50:30 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:31.585 10:50:30 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:31.585 10:50:30 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:31.585 10:50:30 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:31.585 10:50:30 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:31.585 10:50:30 -- paths/export.sh@5 -- # export PATH 00:03:31.585 10:50:30 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:31.585 10:50:30 -- nvmf/common.sh@46 -- # : 0 00:03:31.585 10:50:30 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:31.585 10:50:30 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:31.585 10:50:30 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:31.585 10:50:30 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:31.585 10:50:30 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:31.585 10:50:30 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:31.585 10:50:30 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:31.585 10:50:30 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:31.585 10:50:30 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:31.585 10:50:30 -- spdk/autotest.sh@32 -- # uname -s 00:03:31.585 10:50:30 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:31.585 10:50:30 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:31.585 10:50:30 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:31.585 10:50:30 -- 
spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:31.585 10:50:30 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:31.585 10:50:30 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:31.585 10:50:30 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:31.585 10:50:30 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:31.585 10:50:30 -- spdk/autotest.sh@48 -- # udevadm_pid=577653 00:03:31.585 10:50:30 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:31.585 10:50:30 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:31.585 10:50:30 -- spdk/autotest.sh@54 -- # echo 577655 00:03:31.585 10:50:30 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:31.585 10:50:30 -- spdk/autotest.sh@56 -- # echo 577656 00:03:31.585 10:50:30 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:31.585 10:50:30 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:03:31.585 10:50:30 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:31.585 10:50:30 -- spdk/autotest.sh@60 -- # echo 577657 00:03:31.585 10:50:30 -- spdk/autotest.sh@62 -- # echo 577658 00:03:31.585 10:50:30 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:31.585 10:50:30 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:31.585 10:50:30 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:31.585 10:50:30 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:31.585 10:50:30 -- common/autotest_common.sh@10 -- # set +x 00:03:31.585 10:50:30 -- spdk/autotest.sh@70 -- # create_test_list 00:03:31.585 10:50:30 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:31.585 10:50:30 -- common/autotest_common.sh@10 -- # set +x 00:03:31.585 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:31.585 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:31.585 10:50:30 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:31.585 10:50:30 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:31.585 10:50:30 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:31.585 10:50:30 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:31.585 10:50:30 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:31.585 10:50:30 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:31.585 10:50:30 -- common/autotest_common.sh@1450 -- # uname 00:03:31.585 10:50:30 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:03:31.585 10:50:30 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 
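The lt 1.15 2 trace at 10:50:29, which selects lcov-version-appropriate options before the coverage runs below, splits each version string on dots and dashes and compares component-wise as integers, padding missing components with zero. A self-contained sketch of that comparison; the handling of ==, <= and >= here is a simplification, not necessarily what scripts/common.sh does:

    cmp_versions() {
        local op=$2 IFS=.-
        local -a v1 v2
        read -ra v1 <<< "$1"
        read -ra v2 <<< "$3"
        local i n=$(( ${#v1[@]} > ${#v2[@]} ? ${#v1[@]} : ${#v2[@]} ))
        for (( i = 0; i < n; i++ )); do
            # Missing components count as 0; 10# forces base-10 (e.g. "08").
            local a=$(( 10#${v1[i]:-0} )) b=$(( 10#${v2[i]:-0} ))
            (( a > b )) && { [[ $op == '>' || $op == '>=' ]]; return; }
            (( a < b )) && { [[ $op == '<' || $op == '<=' ]]; return; }
        done
        [[ $op == '==' || $op == '<=' || $op == '>=' ]]
    }

    cmp_versions 1.15 '<' 2 && echo "lcov is older than 2.x"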
00:03:31.585 10:50:30 -- common/autotest_common.sh@1470 -- # uname 00:03:31.585 10:50:30 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:03:31.585 10:50:30 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:03:31.585 10:50:30 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:31.585 lcov: LCOV version 1.15 00:03:31.585 10:50:30 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:33.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:33.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:33.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:45.723 10:50:44 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:03:45.723 10:50:44 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:45.723 10:50:44 -- common/autotest_common.sh@10 -- # set +x 00:03:45.723 10:50:44 -- spdk/autotest.sh@89 -- # rm -f 00:03:45.723 10:50:44 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:49.931 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:49.931 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:49.931 10:50:48 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:03:49.931 10:50:48 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:49.931 10:50:48 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:49.931 10:50:48 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:49.931 10:50:48 -- 
common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:49.931 10:50:48 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:49.931 10:50:48 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:49.931 10:50:48 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:49.931 10:50:48 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:49.931 10:50:48 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:03:49.931 10:50:48 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:03:49.931 10:50:48 -- spdk/autotest.sh@108 -- # grep -v p 00:03:49.931 10:50:48 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:49.931 10:50:48 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:03:49.931 10:50:48 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:03:49.931 10:50:48 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:49.931 10:50:48 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:49.931 No valid GPT data, bailing 00:03:49.931 10:50:48 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:49.931 10:50:48 -- scripts/common.sh@393 -- # pt= 00:03:49.931 10:50:48 -- scripts/common.sh@394 -- # return 1 00:03:49.931 10:50:48 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:49.931 1+0 records in 00:03:49.931 1+0 records out 00:03:49.931 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.005685 s, 184 MB/s 00:03:49.931 10:50:48 -- spdk/autotest.sh@116 -- # sync 00:03:49.931 10:50:48 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:49.931 10:50:48 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:49.931 10:50:48 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:58.058 10:50:55 -- spdk/autotest.sh@122 -- # uname -s 00:03:58.058 10:50:55 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:58.058 10:50:55 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:58.058 10:50:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:58.058 10:50:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:58.058 10:50:55 -- common/autotest_common.sh@10 -- # set +x 00:03:58.058 ************************************ 00:03:58.058 START TEST setup.sh 00:03:58.058 ************************************ 00:03:58.058 10:50:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:58.058 * Looking for test storage... 
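The pre-test cleanup trace above (get_zoned_devs through the dd at 10:50:48) is a guard-then-wipe pattern: zoned namespaces are excluded because they reject plain overwrites, and any remaining whole NVMe namespace with no recognizable partition table gets its first MiB zeroed so stale metadata cannot leak between runs. A sketch of the same logic using only blkid; the SPDK-specific spdk-gpt.py GPT probe from the trace is left out here:

    # Needs root; destructive on purpose, like the trace above.
    for dev in /dev/nvme*n*; do
        [[ -b $dev && $dev != *p* ]] || continue   # whole namespaces only
        name=${dev#/dev/}
        zoned=/sys/block/$name/queue/zoned
        # Skip zoned namespaces: they cannot take random overwrites.
        [[ -e $zoned && $(<"$zoned") != none ]] && continue
        # Wipe only when blkid reports no partition-table type at all.
        if [[ -z $(blkid -s PTTYPE -o value "$dev") ]]; then
            dd if=/dev/zero of="$dev" bs=1M count=1
        fi
    done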
00:03:58.058 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:58.058 10:50:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:58.058 10:50:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:58.058 10:50:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:58.058 10:50:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:58.058 10:50:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:58.058 10:50:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:58.058 10:50:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:58.058 10:50:55 -- scripts/common.sh@335 -- # IFS=.-: 00:03:58.058 10:50:55 -- scripts/common.sh@335 -- # read -ra ver1 00:03:58.058 10:50:55 -- scripts/common.sh@336 -- # IFS=.-: 00:03:58.058 10:50:55 -- scripts/common.sh@336 -- # read -ra ver2 00:03:58.058 10:50:55 -- scripts/common.sh@337 -- # local 'op=<' 00:03:58.058 10:50:55 -- scripts/common.sh@339 -- # ver1_l=2 00:03:58.058 10:50:55 -- scripts/common.sh@340 -- # ver2_l=1 00:03:58.058 10:50:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:58.058 10:50:55 -- scripts/common.sh@343 -- # case "$op" in 00:03:58.058 10:50:55 -- scripts/common.sh@344 -- # : 1 00:03:58.058 10:50:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:58.058 10:50:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:58.058 10:50:55 -- scripts/common.sh@364 -- # decimal 1 00:03:58.058 10:50:55 -- scripts/common.sh@352 -- # local d=1 00:03:58.058 10:50:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:58.058 10:50:55 -- scripts/common.sh@354 -- # echo 1 00:03:58.058 10:50:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:58.058 10:50:55 -- scripts/common.sh@365 -- # decimal 2 00:03:58.058 10:50:55 -- scripts/common.sh@352 -- # local d=2 00:03:58.058 10:50:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:58.058 10:50:55 -- scripts/common.sh@354 -- # echo 2 00:03:58.058 10:50:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:58.058 10:50:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:58.058 10:50:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:58.058 10:50:55 -- scripts/common.sh@367 -- # return 0 00:03:58.058 10:50:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:58.058 10:50:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:58.058 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.058 --rc genhtml_branch_coverage=1 00:03:58.058 --rc genhtml_function_coverage=1 00:03:58.058 --rc genhtml_legend=1 00:03:58.058 --rc geninfo_all_blocks=1 00:03:58.058 --rc geninfo_unexecuted_blocks=1 00:03:58.058 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.058 ' 00:03:58.058 10:50:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:58.058 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.058 --rc genhtml_branch_coverage=1 00:03:58.058 --rc genhtml_function_coverage=1 00:03:58.058 --rc genhtml_legend=1 00:03:58.058 --rc geninfo_all_blocks=1 00:03:58.058 --rc geninfo_unexecuted_blocks=1 00:03:58.058 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.058 ' 00:03:58.058 10:50:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:58.058 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.058 --rc genhtml_branch_coverage=1 
00:03:58.058 --rc genhtml_function_coverage=1 00:03:58.058 --rc genhtml_legend=1 00:03:58.058 --rc geninfo_all_blocks=1 00:03:58.058 --rc geninfo_unexecuted_blocks=1 00:03:58.058 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.058 ' 00:03:58.058 10:50:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:58.058 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.058 --rc genhtml_branch_coverage=1 00:03:58.058 --rc genhtml_function_coverage=1 00:03:58.058 --rc genhtml_legend=1 00:03:58.058 --rc geninfo_all_blocks=1 00:03:58.058 --rc geninfo_unexecuted_blocks=1 00:03:58.058 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.058 ' 00:03:58.058 10:50:55 -- setup/test-setup.sh@10 -- # uname -s 00:03:58.058 10:50:55 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:58.058 10:50:55 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:58.058 10:50:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:58.058 10:50:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:58.058 10:50:55 -- common/autotest_common.sh@10 -- # set +x 00:03:58.058 ************************************ 00:03:58.058 START TEST acl 00:03:58.058 ************************************ 00:03:58.058 10:50:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:58.058 * Looking for test storage... 00:03:58.058 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:58.058 10:50:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:58.058 10:50:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:58.058 10:50:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:58.058 10:50:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:58.058 10:50:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:58.058 10:50:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:58.058 10:50:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:58.058 10:50:55 -- scripts/common.sh@335 -- # IFS=.-: 00:03:58.058 10:50:55 -- scripts/common.sh@335 -- # read -ra ver1 00:03:58.058 10:50:55 -- scripts/common.sh@336 -- # IFS=.-: 00:03:58.058 10:50:55 -- scripts/common.sh@336 -- # read -ra ver2 00:03:58.058 10:50:55 -- scripts/common.sh@337 -- # local 'op=<' 00:03:58.058 10:50:55 -- scripts/common.sh@339 -- # ver1_l=2 00:03:58.058 10:50:55 -- scripts/common.sh@340 -- # ver2_l=1 00:03:58.058 10:50:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:58.058 10:50:55 -- scripts/common.sh@343 -- # case "$op" in 00:03:58.058 10:50:55 -- scripts/common.sh@344 -- # : 1 00:03:58.058 10:50:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:58.058 10:50:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:58.058 10:50:55 -- scripts/common.sh@364 -- # decimal 1 00:03:58.058 10:50:55 -- scripts/common.sh@352 -- # local d=1 00:03:58.058 10:50:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:58.058 10:50:55 -- scripts/common.sh@354 -- # echo 1 00:03:58.058 10:50:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:58.058 10:50:55 -- scripts/common.sh@365 -- # decimal 2 00:03:58.058 10:50:55 -- scripts/common.sh@352 -- # local d=2 00:03:58.058 10:50:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:58.058 10:50:55 -- scripts/common.sh@354 -- # echo 2 00:03:58.058 10:50:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:58.058 10:50:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:58.058 10:50:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:58.059 10:50:55 -- scripts/common.sh@367 -- # return 0 00:03:58.059 10:50:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:58.059 10:50:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:58.059 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.059 --rc genhtml_branch_coverage=1 00:03:58.059 --rc genhtml_function_coverage=1 00:03:58.059 --rc genhtml_legend=1 00:03:58.059 --rc geninfo_all_blocks=1 00:03:58.059 --rc geninfo_unexecuted_blocks=1 00:03:58.059 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.059 ' 00:03:58.059 10:50:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:58.059 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.059 --rc genhtml_branch_coverage=1 00:03:58.059 --rc genhtml_function_coverage=1 00:03:58.059 --rc genhtml_legend=1 00:03:58.059 --rc geninfo_all_blocks=1 00:03:58.059 --rc geninfo_unexecuted_blocks=1 00:03:58.059 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.059 ' 00:03:58.059 10:50:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:58.059 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.059 --rc genhtml_branch_coverage=1 00:03:58.059 --rc genhtml_function_coverage=1 00:03:58.059 --rc genhtml_legend=1 00:03:58.059 --rc geninfo_all_blocks=1 00:03:58.059 --rc geninfo_unexecuted_blocks=1 00:03:58.059 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.059 ' 00:03:58.059 10:50:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:58.059 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:58.059 --rc genhtml_branch_coverage=1 00:03:58.059 --rc genhtml_function_coverage=1 00:03:58.059 --rc genhtml_legend=1 00:03:58.059 --rc geninfo_all_blocks=1 00:03:58.059 --rc geninfo_unexecuted_blocks=1 00:03:58.059 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:58.059 ' 00:03:58.059 10:50:55 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:58.059 10:50:55 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:58.059 10:50:55 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:58.059 10:50:55 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:58.059 10:50:55 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:58.059 10:50:55 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:58.059 10:50:55 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:58.059 10:50:55 -- common/autotest_common.sh@1659 -- # [[ -e 
00:03:58.059 10:50:55 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:58.059 10:50:55 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:58.059 10:50:55 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:58.059 10:50:55 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:58.059 10:50:55 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:58.059 10:50:55 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:58.059 10:50:55 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:58.059 10:50:55 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:58.059 10:50:55 -- common/autotest_common.sh@1660 -- # [[ none != none ]] (nvme0n1 is not zoned, so zoned_devs stays empty) 00:03:58.059 10:50:55 -- setup/acl.sh@12 -- # devs=() 00:03:58.059 10:50:55 -- setup/acl.sh@12 -- # declare -a devs 00:03:58.059 10:50:55 -- setup/acl.sh@13 -- # drivers=() 00:03:58.059 10:50:55 -- setup/acl.sh@13 -- # declare -A drivers 00:03:58.059 10:50:55 -- setup/acl.sh@51 -- # setup reset 00:03:58.059 10:50:55 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:58.059 10:50:55 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:01.345 10:50:59 -- setup/acl.sh@52 -- # collect_setup_devs 00:04:01.345 10:50:59 -- setup/acl.sh@16 -- # local dev driver 00:04:01.345 10:50:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:01.345 10:50:59 -- setup/acl.sh@15 -- # setup output status 00:04:01.345 10:50:59 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:01.345 10:50:59 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:04:03.874 Hugepages
00:04:03.874 node hugesize free / total
00:04:03.874 Type BDF Vendor Device NUMA Driver Device Block devices
(the hugepage-size lines, 1048576kB and 2048kB, fail the *:*:*.* BDF test and are skipped with continue; the sixteen ioatdma channels, 0000:00:04.0-0000:00:04.7 and 0000:80:04.0-0000:80:04.7, match the BDF test but fail [[ ioatdma == nvme ]] and are likewise skipped)
00:04:04.132 10:51:02 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:04.132 10:51:02 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:04.132 10:51:02 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:04.132 10:51:02 -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:04.132 10:51:02 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:04.132 10:51:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:04.132 10:51:02 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:04.132 10:51:02 -- setup/acl.sh@54 -- # run_test denied denied 00:04:04.132 10:51:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
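The collection pass above is just a read loop over `setup.sh status` output. A minimal sketch of its shape, reconstructed from the trace (the real loop lives in spdk/test/setup/acl.sh):

    declare -a devs
    declare -A drivers
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue              # keep only PCI BDFs, skip header/hugepage rows
        [[ $driver == nvme ]] || continue              # ioatdma channels are not test targets
        [[ $PCI_BLOCKED == *"$dev"* ]] && continue     # honor an explicit block list
        devs+=("$dev")
        drivers["$dev"]=$driver
    done < <(/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status)
    echo "${#devs[@]} device(s): ${devs[*]}"           # here: 1 device(s): 0000:d8:00.0

On this runner only the single NVMe controller at 0000:d8:00.0 survives the filters, which is why the (( 1 > 0 )) guard above passes and the denied/allowed subtests run.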
00:04:04.132 10:51:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:04.132 10:51:02 -- common/autotest_common.sh@10 -- # set +x
00:04:04.132 ************************************
00:04:04.132 START TEST denied
00:04:04.132 ************************************
00:04:04.132 10:51:02 -- common/autotest_common.sh@1114 -- # denied 00:04:04.132 10:51:02 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:04.132 10:51:02 -- setup/acl.sh@38 -- # setup output config 00:04:04.132 10:51:02 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:04.132 10:51:02 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:04.132 10:51:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:08.326 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0
00:04:08.326 10:51:06 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:08.326 10:51:06 -- setup/acl.sh@28 -- # local dev driver 00:04:08.326 10:51:06 -- setup/acl.sh@30 -- # for dev in "$@" 00:04:08.326 10:51:06 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:08.326 10:51:06 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:08.326 10:51:06 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:08.326 10:51:06 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:08.326 10:51:06 -- setup/acl.sh@41 -- # setup reset 00:04:08.326 10:51:06 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:08.326 10:51:06 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:12.514 00:04:12.514 real 0m8.347s 00:04:12.514 user 0m2.738s 00:04:12.514 sys 0m5.006s 00:04:12.514 10:51:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:12.514 10:51:11 -- common/autotest_common.sh@10 -- # set +x
00:04:12.514 ************************************
00:04:12.514 END TEST denied
00:04:12.514 ************************************
00:04:12.514 10:51:11 -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:12.514 10:51:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:12.514 10:51:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:12.514 10:51:11 -- common/autotest_common.sh@10 -- # set +x
00:04:12.514 ************************************
00:04:12.514 START TEST allowed
00:04:12.514 ************************************
00:04:12.514 10:51:11 -- common/autotest_common.sh@1114 -- # allowed 00:04:12.514 10:51:11 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:12.514 10:51:11 -- setup/acl.sh@45 -- # setup output config 00:04:12.514 10:51:11 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:12.514 10:51:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:12.514 10:51:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:17.786 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:04:17.786 10:51:16 -- setup/acl.sh@47 -- # verify 00:04:17.786 10:51:16 -- setup/acl.sh@28 -- # local dev driver 00:04:17.786 10:51:16 -- setup/acl.sh@48 -- # setup reset 00:04:17.786 10:51:16 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:17.786 10:51:16 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:21.071 00:04:21.071 real 0m8.411s 00:04:21.071 user 0m2.131s 00:04:21.071 sys 0m4.669s 00:04:21.071 10:51:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:21.071 10:51:19 -- common/autotest_common.sh@10 -- # set +x
00:04:21.071 ************************************
00:04:21.071 END TEST allowed
00:04:21.071 ************************************
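Both subtests drive the same two knobs, so they reduce to a pair of env-var invocations. Distilled from the trace above (same paths and grep patterns; run as root on a machine where 0000:d8:00.0 exists):

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    # denied: a blocked controller must be skipped by setup.sh config
    PCI_BLOCKED=' 0000:d8:00.0' "$SPDK/scripts/setup.sh" config \
        | grep 'Skipping denied controller at 0000:d8:00.0'
    "$SPDK/scripts/setup.sh" reset
    # allowed: an allowed controller must be rebound from nvme to vfio-pci
    PCI_ALLOWED=0000:d8:00.0 "$SPDK/scripts/setup.sh" config \
        | grep -E '0000:d8:00.0 .*: nvme -> .*'
    "$SPDK/scripts/setup.sh" reset    # hand the device back to the kernel nvme driver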
00:04:21.071 00:04:21.071 real 0m24.127s 00:04:21.071 user 0m7.403s 00:04:21.071 sys 0m14.756s 00:04:21.071 10:51:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:21.071 10:51:19 -- common/autotest_common.sh@10 -- # set +x
00:04:21.071 ************************************
00:04:21.071 END TEST acl
00:04:21.071 ************************************
00:04:21.071 10:51:19 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:21.071 10:51:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:21.071 10:51:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:21.071 10:51:19 -- common/autotest_common.sh@10 -- # set +x
00:04:21.071 ************************************
00:04:21.071 START TEST hugepages
00:04:21.071 ************************************
00:04:21.071 10:51:19 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:21.071 * Looking for test storage... 00:04:21.071 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:21.071 10:51:19 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:21.071 10:51:19 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:21.071 10:51:19 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:21.071 10:51:19 -- common/autotest_common.sh@1690 -- # lt 1.15 2 (same cmp_versions walk as in the acl test above; returns 0) 00:04:21.332 10:51:19 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:21.332 10:51:19 -- common/autotest_common.sh@1703/@1704 -- # export LCOV_OPTS and LCOV (the same four repeated option blocks as above; elided) 00:04:21.333 10:51:19 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:21.333 10:51:19 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:21.333 10:51:19 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:21.333 10:51:19 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:21.333 10:51:19 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:21.333 10:51:19 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:21.333 10:51:19 -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:21.333 10:51:19 -- setup/common.sh@18 -- # local node= 00:04:21.333 10:51:19 -- setup/common.sh@19 -- # local var val
00:04:21.333 10:51:19 -- setup/common.sh@20 -- # local mem_f mem 00:04:21.333 10:51:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.333 10:51:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.333 10:51:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.333 10:51:19 -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.333 10:51:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.333 10:51:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.333 10:51:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.333 10:51:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 39055060 kB' 'MemAvailable: 42780680 kB' 'Buffers: 9316 kB' 'Cached: 12817660 kB' 'SwapCached: 0 kB' 'Active: 9723748 kB' 'Inactive: 3688944 kB' 'Active(anon): 9306864 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589252 kB' 'Mapped: 153816 kB' 'Shmem: 8721148 kB' 'KReclaimable: 233316 kB' 'Slab: 901732 kB' 'SReclaimable: 233316 kB' 'SUnreclaim: 668416 kB' 'KernelStack: 21792 kB' 'PageTables: 7744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433340 kB' 'Committed_AS: 10588672 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214016 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
00:04:21.333-00:04:21.334 10:51:19 -- setup/common.sh@31/@32 (the read loop then walks every field of the snapshot, MemTotal through HugePages_Surp, testing each var against Hugepagesize and skipping it with continue, until Hugepagesize matches:) 00:04:21.334 10:51:19 -- setup/common.sh@33 -- # echo 2048 00:04:21.334 10:51:19 -- setup/common.sh@33 -- # return 0 00:04:21.334 10:51:19 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:21.334 10:51:19 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:21.334 10:51:19 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:21.334 10:51:19 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:21.334 10:51:19 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:21.334 10:51:19 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:21.334 10:51:19 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:21.334 10:51:19 -- setup/hugepages.sh@207 -- # get_nodes 00:04:21.334 10:51:19 -- setup/hugepages.sh@27 -- # local node 00:04:21.334 10:51:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:21.334 10:51:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:04:21.334 10:51:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:21.334 10:51:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:21.334 10:51:19 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:21.334 10:51:19 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
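get_meminfo, whose field-by-field scans are condensed above and below, amounts to a lookup over /proc/meminfo (or a node's own meminfo file). A standalone approximation, reconstructed from the trace rather than copied from test/setup/common.sh:

    shopt -s extglob
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        local -a mem
        local line var val _
        # A per-node query reads that node's meminfo, whose lines carry a "Node N " prefix.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix, if any
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    get_meminfo Hugepagesize      # prints 2048 on this runner (kB)
    get_meminfo HugePages_Free 0  # free 2 MiB pages on NUMA node 0

Note how the snapshot above explains the numbers that follow: Hugepagesize is 2048 kB, so default_hugepages becomes 2048 and the per-size sysfs path is hugepages-2048kB.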
00:04:21.334 10:51:19 -- setup/hugepages.sh@208 -- # clear_hp 00:04:21.334 10:51:19 -- setup/hugepages.sh@37 -- # local node hp 00:04:21.334 10:51:19 -- setup/hugepages.sh@39/@41 (for each of the two nodes, echo 0 into every per-node pool under /sys/devices/system/node/node$node/hugepages/hugepages-*/nr_hugepages; four writes in total) 00:04:21.334 10:51:19 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:21.334 10:51:19 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:21.334 10:51:19 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:21.334 10:51:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:21.334 10:51:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:21.334 10:51:19 -- common/autotest_common.sh@10 -- # set +x
00:04:21.334 ************************************
00:04:21.334 START TEST default_setup
00:04:21.334 ************************************
00:04:21.334 10:51:19 -- common/autotest_common.sh@1114 -- # default_setup 00:04:21.334 10:51:19 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:21.334 10:51:19 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:21.334 10:51:19 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:21.334 10:51:19 -- setup/hugepages.sh@51 -- # shift 00:04:21.334 10:51:19 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:21.334 10:51:19 -- setup/hugepages.sh@52 -- # local node_ids 00:04:21.334 10:51:19 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:21.334 10:51:19 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:21.334 10:51:19 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:21.334 10:51:19 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:21.334 10:51:19 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:21.334 10:51:19 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:21.334 10:51:19 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:21.334 10:51:19 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:21.334 10:51:19 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:21.334 10:51:19 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:21.334 10:51:19 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:21.334 10:51:19 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:21.334 10:51:19 -- setup/hugepages.sh@73 -- # return 0 00:04:21.334 10:51:19 -- setup/hugepages.sh@137 -- # setup output 00:04:21.334 10:51:19 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.334 10:51:19 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
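clear_hp and the subsequent get_test_nr_hugepages are per-node sysfs writes at heart. A sketch using the values visible above (two nodes, 2 MiB pages, a 2097152 kB request on node 0); requires root, and the paths are the standard kernel sysfs layout:

    # clear_hp: zero every per-node hugepage pool (both 2048kB and 1048576kB sizes)
    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"
        done
    done
    # default_setup's request: 2097152 kB / 2048 kB per page = 1024 pages on node 0
    echo 1024 > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
    cat /proc/sys/vm/nr_hugepages   # global counter should now read 1024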
00:04:24.622-00:04:24.882 (setup.sh rebinds all sixteen ioatdma channels, 0000:00:04.0-0000:00:04.7 and 0000:80:04.0-0000:80:04.7, each 8086 2021, from ioatdma to vfio-pci)
00:04:26.796 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:04:26.796 10:51:25 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:26.796 10:51:25 -- setup/hugepages.sh@89 -- # local node 00:04:26.796 10:51:25 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:26.796 10:51:25 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:26.796 10:51:25 -- setup/hugepages.sh@92 -- # local surp 00:04:26.796 10:51:25 -- setup/hugepages.sh@93 -- # local resv 00:04:26.796 10:51:25 -- setup/hugepages.sh@94 -- # local anon 00:04:26.796 10:51:25 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:26.796 10:51:25 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:26.796 10:51:25 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:26.796 10:51:25 -- setup/common.sh@18 -- # local node= 00:04:26.796 10:51:25 -- setup/common.sh@19 -- # local var val 00:04:26.796 10:51:25 -- setup/common.sh@20 -- # local mem_f mem 00:04:26.796 10:51:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.796 10:51:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.796 10:51:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.796 10:51:25 -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.796 10:51:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.796 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.796 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.796 10:51:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41204568 kB' 'MemAvailable: 44929824 kB' 'Buffers: 9316 kB' 'Cached: 12817788 kB' 'SwapCached: 0 kB' 'Active: 9725996 kB' 'Inactive: 3688944 kB' 'Active(anon): 9309112 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591196 kB' 'Mapped: 153760 kB' 'Shmem: 8721276 kB' 'KReclaimable: 232588 kB' 'Slab: 900096 kB' 'SReclaimable: 232588 kB' 'SUnreclaim: 667508 kB' 'KernelStack: 21888 kB' 'PageTables: 8280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10591756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
00:04:26.796-00:04:26.797 10:51:25 -- setup/common.sh@31/@32 (per-field scan as before, this time against AnonHugePages; every other field is skipped with continue until AnonHugePages matches:) 00:04:26.797 10:51:25 -- setup/common.sh@33 -- # echo 0 00:04:26.797 10:51:25 -- setup/common.sh@33 -- # return 0 00:04:26.797 10:51:25 -- setup/hugepages.sh@97 -- # anon=0
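The log ends while verify_nr_hugepages is still sampling meminfo. From the locals it declares (surp, resv, anon) and the get_meminfo calls visible here, its shape is roughly the following; the final check is an assumption for illustration, not something visible in this capture (get_meminfo is the helper sketched earlier):

    anon=$(get_meminfo AnonHugePages)    # THP-backed anon memory, not part of the pool
    surp=$(get_meminfo HugePages_Surp)   # pages allocated beyond the configured pool
    resv=$(get_meminfo HugePages_Rsvd)   # reserved but not yet faulted pages
    total=$(get_meminfo HugePages_Total)
    free=$(get_meminfo HugePages_Free)
    echo "total=$total free=$free rsvd=$resv surp=$surp anon=$anon"
    (( total - surp == 1024 )) || echo "nr_hugepages drifted"   # assumed assertion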
00:04:26.797 10:51:25 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:26.797 10:51:25 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:26.797 10:51:25 -- setup/common.sh@18 -- # local node= 00:04:26.797 10:51:25 -- setup/common.sh@19 -- # local var val 00:04:26.797 10:51:25 -- setup/common.sh@20 -- # local mem_f mem 00:04:26.797 10:51:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.797 10:51:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.797 10:51:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.797 10:51:25 -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.797 10:51:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.797 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.797 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.797 10:51:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41213496 kB' 'MemAvailable: 44938736 kB' 'Buffers: 9316 kB' 'Cached: 12817792 kB' 'SwapCached: 0 kB' 'Active: 9725404 kB' 'Inactive: 3688944 kB' 'Active(anon): 9308520 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590588 kB' 'Mapped: 153648 kB' 'Shmem: 8721280 kB' 'KReclaimable: 232556 kB' 'Slab: 900208 kB' 'SReclaimable: 232556 kB' 'SUnreclaim: 667652 kB' 'KernelStack: 22032 kB' 'PageTables: 8112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10591768 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
00:04:26.797-00:04:26.798 10:51:25 -- setup/common.sh@31/@32 (per-field scan for HugePages_Surp, as before; the captured log ends mid-scan here)
continue 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.798 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.798 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # continue 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.799 10:51:25 -- setup/common.sh@33 -- # echo 0 00:04:26.799 10:51:25 -- setup/common.sh@33 -- # return 0 00:04:26.799 10:51:25 -- setup/hugepages.sh@99 -- # surp=0 00:04:26.799 10:51:25 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:26.799 10:51:25 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:26.799 10:51:25 -- setup/common.sh@18 -- # local node= 00:04:26.799 10:51:25 -- setup/common.sh@19 -- # local var val 00:04:26.799 10:51:25 -- 
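The wall of xtrace triplets above is setup/common.sh's get_meminfo helper scanning /proc/meminfo line by line until the requested key matches; the backslash-riddled right-hand side appears because xtrace re-prints the [[ ]] pattern with every character escaped to force a literal (non-glob) match. A minimal standalone sketch of the same lookup, assuming a quoted comparison is an acceptable stand-in for the escaped pattern (hypothetical helper name, not the SPDK script itself):

  #!/usr/bin/env bash
  # Sketch: look up one key in /proc/meminfo the way the traced loop does.
  get_meminfo_value() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
      # Quoting "$get" gives the literal match the original achieves by
      # escaping each pattern character (\H\u\g\e...).
      [[ $var == "$get" ]] || continue   # non-matching keys just 'continue'
      echo "$val"                        # numeric value; the 'kB' unit lands in $_
      return 0
    done < /proc/meminfo
    return 1
  }
  get_meminfo_value HugePages_Surp   # prints 0 on this test node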
00:04:26.799 10:51:25 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:26.799 10:51:25 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:26.799 10:51:25 -- setup/common.sh@18 -- # local node= 00:04:26.799 10:51:25 -- setup/common.sh@19 -- # local var val 00:04:26.799 10:51:25 -- setup/common.sh@20 -- # local mem_f mem 00:04:26.799 10:51:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.799 10:51:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.799 10:51:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.799 10:51:25 -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.799 10:51:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.799 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.799 10:51:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41210440 kB' 'MemAvailable: 44935680 kB' 'Buffers: 9316 kB' 'Cached: 12817804 kB' 'SwapCached: 0 kB' 'Active: 9725816 kB' 'Inactive: 3688944 kB' 'Active(anon): 9308932 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591016 kB' 'Mapped: 153648 kB' 'Shmem: 8721292 kB' 'KReclaimable: 232556 kB' 'Slab: 900184 kB' 'SReclaimable: 232556 kB' 'SUnreclaim: 667628 kB' 'KernelStack: 22032 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10591784 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214496 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
00:04:26.799 10:51:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.799 10:51:25 -- setup/common.sh@32 -- # continue
[... the same xtrace triplet repeats for every key from MemFree through HugePages_Free; none match ...]
00:04:26.800 10:51:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.800 10:51:25 -- setup/common.sh@33 -- # echo 0 00:04:26.800 10:51:25 -- setup/common.sh@33 -- # return 0 00:04:26.800 10:51:25 -- setup/hugepages.sh@100 -- # resv=0
00:04:26.800 10:51:25 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:26.800 nr_hugepages=1024
00:04:26.800 10:51:25 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:26.800 resv_hugepages=0
00:04:26.800 10:51:25 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:26.800 surplus_hugepages=0
00:04:26.800 10:51:25 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:26.800 anon_hugepages=0
00:04:26.800 10:51:25 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:26.800 10:51:25 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
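The (( 1024 == nr_hugepages + surp + resv )) check above is the pool invariant the suite leans on: the kernel's HugePages_Total must equal the requested page count plus surplus plus reserved pages. A hedged, self-contained version of the same sanity check (variable names are mine; get_meminfo_value is the helper sketched earlier, and nr_hugepages is read back from procfs here rather than taken from a test argument as the script does):

  # Sketch: hugepage pool bookkeeping check.
  total=$(get_meminfo_value HugePages_Total)    # 1024 in this run
  surp=$(get_meminfo_value HugePages_Surp)      # 0
  resv=$(get_meminfo_value HugePages_Rsvd)      # 0
  nr=$(cat /proc/sys/vm/nr_hugepages)
  if (( total == nr + surp + resv )); then
    echo "hugepage pool consistent: $total pages"
  else
    echo "mismatch: total=$total nr=$nr surp=$surp resv=$resv" >&2
  fi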
00:04:26.800 10:51:25 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:26.800 10:51:25 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:26.800 10:51:25 -- setup/common.sh@18 -- # local node= 00:04:26.800 10:51:25 -- setup/common.sh@19 -- # local var val 00:04:26.800 10:51:25 -- setup/common.sh@20 -- # local mem_f mem 00:04:26.800 10:51:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.800 10:51:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.800 10:51:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.800 10:51:25 -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.800 10:51:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.800 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.800 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.800 10:51:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41210316 kB' 'MemAvailable: 44935556 kB' 'Buffers: 9316 kB' 'Cached: 12817816 kB' 'SwapCached: 0 kB' 'Active: 9725392 kB' 'Inactive: 3688944 kB' 'Active(anon): 9308508 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590536 kB' 'Mapped: 153648 kB' 'Shmem: 8721304 kB' 'KReclaimable: 232556 kB' 'Slab: 900184 kB' 'SReclaimable: 232556 kB' 'SUnreclaim: 667628 kB' 'KernelStack: 21984 kB' 'PageTables: 8216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10591800 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214400 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
00:04:26.800 10:51:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.800 10:51:25 -- setup/common.sh@32 -- # continue
[... the same xtrace triplet repeats for every key from MemFree through Unaccepted; none match ...]
00:04:26.802 10:51:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.802 10:51:25 -- setup/common.sh@33 -- # echo 1024 00:04:26.802 10:51:25 -- setup/common.sh@33 -- # return 0 00:04:26.802 10:51:25 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:26.802 10:51:25 -- setup/hugepages.sh@112 -- # get_nodes 00:04:26.802 10:51:25 -- setup/hugepages.sh@27 -- # local node 00:04:26.802 10:51:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:26.802 10:51:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:26.802 10:51:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:26.802 10:51:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:26.802 10:51:25 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:26.802 10:51:25 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
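get_nodes above walks the /sys/devices/system/node/node+([0-9]) glob and records each node's hugepage count in nodes_sys: on this box node0 holds all 1024 pages and node1 none. The same per-node figures can be read straight from sysfs; a small sketch under the assumption of the 2048 kB default hugepage size this host reports (the traced script parses each node's meminfo file instead of these counters):

  # Sketch: enumerate NUMA nodes and their 2 MB hugepage pools.
  declare -A nodes_sys
  for node in /sys/devices/system/node/node[0-9]*; do
    [[ -d $node ]] || continue
    id=${node##*node}                   # "node0" -> "0"
    nodes_sys[$id]=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
  done
  for id in "${!nodes_sys[@]}"; do
    echo "node$id: ${nodes_sys[$id]} hugepages"   # node0: 1024, node1: 0
  done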
00:04:26.802 10:51:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:26.802 10:51:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:26.802 10:51:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:26.802 10:51:25 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:26.802 10:51:25 -- setup/common.sh@18 -- # local node=0 00:04:26.802 10:51:25 -- setup/common.sh@19 -- # local var val 00:04:26.802 10:51:25 -- setup/common.sh@20 -- # local mem_f mem 00:04:26.802 10:51:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.802 10:51:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:26.802 10:51:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:26.802 10:51:25 -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.802 10:51:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.802 10:51:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:26.802 10:51:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:26.802 10:51:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25361512 kB' 'MemUsed: 7223856 kB' 'SwapCached: 0 kB' 'Active: 3408792 kB' 'Inactive: 184604 kB' 'Active(anon): 3237888 kB' 'Inactive(anon): 0 kB' 'Active(file): 170904 kB' 'Inactive(file): 184604 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3318456 kB' 'Mapped: 62092 kB' 'AnonPages: 278104 kB' 'Shmem: 2962948 kB' 'KernelStack: 12216 kB' 'PageTables: 4592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121128 kB' 'Slab: 436480 kB' 'SReclaimable: 121128 kB' 'SUnreclaim: 315352 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:26.802 10:51:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.802 10:51:25 -- setup/common.sh@32 -- # continue
[... the same xtrace triplet repeats for every node0 meminfo key from MemFree through HugePages_Free; none match ...]
00:04:26.803 10:51:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.803 10:51:25 -- setup/common.sh@33 -- # echo 0 00:04:26.803 10:51:25 -- setup/common.sh@33 -- # return 0 00:04:26.803 10:51:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:26.803 10:51:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:26.803 10:51:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:26.803 10:51:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
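The sorted_t[nodes_test[node]]=1 / sorted_s[nodes_sys[node]]=1 assignments above use a compact bash idiom: storing each value as an associative-array key deduplicates it, so the verifier can later compare the set of distinct per-node counts against what it expects. The idiom in isolation (array contents are hypothetical):

  # Sketch: dedupe values by using them as associative-array keys.
  declare -A sorted_t
  nodes_test=(1024 0)                 # hypothetical per-node results
  for node in "${!nodes_test[@]}"; do
    sorted_t[${nodes_test[node]}]=1   # key set = distinct values
  done
  echo "distinct counts: ${!sorted_t[*]}"   # e.g. "0 1024" (order unspecified)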
00:04:26.803 10:51:25 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:26.803 node0=1024 expecting 1024
00:04:26.803 10:51:25 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:26.803
00:04:26.803 real 0m5.322s
00:04:26.803 user 0m1.460s
00:04:26.803 sys 0m2.377s
00:04:26.803 10:51:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:26.803 10:51:25 -- common/autotest_common.sh@10 -- # set +x
00:04:26.803 ************************************
00:04:26.803 END TEST default_setup
00:04:26.803 ************************************
00:04:26.803 10:51:25 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:26.803 10:51:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:26.803 10:51:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:26.803 10:51:25 -- common/autotest_common.sh@10 -- # set +x
00:04:26.803 ************************************
00:04:26.803 START TEST per_node_1G_alloc
00:04:26.803 ************************************
00:04:26.803 10:51:25 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc 00:04:26.803 10:51:25 -- setup/hugepages.sh@143 -- # local IFS=, 00:04:26.803 10:51:25 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:26.803 10:51:25 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:26.803 10:51:25 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:26.803 10:51:25 -- setup/hugepages.sh@51 -- # shift 00:04:26.803 10:51:25 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:26.803 10:51:25 -- setup/hugepages.sh@52 -- # local node_ids 00:04:26.803 10:51:25 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:26.803 10:51:25 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:26.803 10:51:25 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:26.803 10:51:25 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:26.803 10:51:25 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:26.803 10:51:25 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:26.803 10:51:25 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:26.803 10:51:25 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:26.803 10:51:25 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:26.803 10:51:25 -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:26.803 10:51:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:26.803 10:51:25 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:26.803 10:51:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:26.803 10:51:25 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:26.803 10:51:25 -- setup/hugepages.sh@73 -- # return 0 00:04:26.803 10:51:25 -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:26.803 10:51:25 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:26.803 10:51:25 -- setup/hugepages.sh@146 -- # setup output 00:04:26.803 10:51:25 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:26.803 10:51:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
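get_test_nr_hugepages 1048576 0 1 above converts a 1 GB request (1048576 kB) into per-node page counts: with the 2048 kB hugepage size this host reports, that is 512 pages for each of nodes 0 and 1, exported to setup.sh as NRHUGE=512 HUGENODE=0,1. The arithmetic as a sketch (variable names mirror the trace but are reconstructed, not the script itself):

  # Sketch: size-in-kB -> hugepage count, fanned out per node.
  size_kb=1048576                 # 1 GB request
  default_hugepages=2048          # kB per huge page (Hugepagesize above)
  nr_hugepages=$(( size_kb / default_hugepages ))   # 512
  node_ids=(0 1)
  declare -a nodes_test
  for id in "${node_ids[@]}"; do
    nodes_test[id]=$nr_hugepages  # 512 pages requested on each node
  done
  echo "NRHUGE=$nr_hugepages HUGENODE=$(IFS=,; echo "${node_ids[*]}")"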
(8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:30.093 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:30.093 10:51:28 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:30.093 10:51:28 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:30.093 10:51:28 -- setup/hugepages.sh@89 -- # local node 00:04:30.093 10:51:28 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:30.093 10:51:28 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:30.093 10:51:28 -- setup/hugepages.sh@92 -- # local surp 00:04:30.093 10:51:28 -- setup/hugepages.sh@93 -- # local resv 00:04:30.093 10:51:28 -- setup/hugepages.sh@94 -- # local anon 00:04:30.093 10:51:28 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:30.093 10:51:28 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:30.093 10:51:28 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:30.093 10:51:28 -- setup/common.sh@18 -- # local node= 00:04:30.093 10:51:28 -- setup/common.sh@19 -- # local var val 00:04:30.093 10:51:28 -- setup/common.sh@20 -- # local mem_f mem 00:04:30.093 10:51:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.093 10:51:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.093 10:51:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.093 10:51:28 -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.093 10:51:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.093 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.093 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.094 10:51:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41258088 kB' 'MemAvailable: 44983328 kB' 'Buffers: 9316 kB' 'Cached: 12817908 kB' 'SwapCached: 0 kB' 'Active: 9726004 kB' 'Inactive: 3688944 kB' 'Active(anon): 9309120 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591036 kB' 'Mapped: 153772 kB' 'Shmem: 8721396 kB' 'KReclaimable: 232556 kB' 'Slab: 900484 kB' 'SReclaimable: 232556 kB' 'SUnreclaim: 667928 kB' 'KernelStack: 21792 kB' 'PageTables: 7740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10588096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 
'DirectMap1G: 58720256 kB' 00:04:30.094 [xtrace condensed: setup/common.sh@31-32 iterate over the /proc/meminfo keys (MemTotal through VmallocChunk), continuing past each until AnonHugePages matches] 10:51:28 --
setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.358 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.358 10:51:28 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.358 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.358 10:51:28 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.358 10:51:28 -- setup/common.sh@33 -- # echo 0 00:04:30.358 10:51:28 -- setup/common.sh@33 -- # return 0 00:04:30.358 10:51:28 -- setup/hugepages.sh@97 -- # anon=0 00:04:30.358 10:51:28 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:30.358 10:51:28 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.358 10:51:28 -- setup/common.sh@18 -- # local node= 00:04:30.358 10:51:28 -- setup/common.sh@19 -- # local var val 00:04:30.358 10:51:28 -- setup/common.sh@20 -- # local mem_f mem 00:04:30.358 10:51:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.358 10:51:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.358 10:51:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.358 10:51:28 -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.358 10:51:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.358 10:51:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41258024 kB' 'MemAvailable: 44983264 kB' 'Buffers: 9316 kB' 'Cached: 12817908 kB' 'SwapCached: 0 kB' 'Active: 9725480 kB' 'Inactive: 3688944 kB' 'Active(anon): 9308596 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590428 kB' 'Mapped: 153644 kB' 'Shmem: 8721396 kB' 'KReclaimable: 232556 kB' 'Slab: 900492 kB' 'SReclaimable: 232556 kB' 'SUnreclaim: 667936 kB' 'KernelStack: 21776 kB' 'PageTables: 7664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10588108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:30.358 10:51:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.358 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.358 10:51:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.358 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.358 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.358 
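Every get_meminfo call traced here snapshots /proc/meminfo with mapfile and then walks it field by field under IFS=': ' until the requested key matches, which is why each non-matching key shows up as a bare continue. A rough standalone equivalent of that lookup (simplified, without the per-node /sys path handling the real helper has):

    #!/usr/bin/env bash
    # Print the value of a single /proc/meminfo key, e.g. HugePages_Surp.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # Skip every key until the requested one is reached.
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    get_meminfo HugePages_Surp    # prints 0 on this box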
[xtrace condensed: the scan continues over the remaining keys (MemAvailable through HugePages_Rsvd) until HugePages_Surp matches] 10:51:28 -- setup/common.sh@33 -- # echo 0 00:04:30.360 10:51:28 -- setup/common.sh@33 -- # return 0 00:04:30.360 10:51:28 -- setup/hugepages.sh@99 -- # surp=0 00:04:30.360 10:51:28 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:30.360 10:51:28 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:30.360 10:51:28 -- setup/common.sh@18 -- # local node= 00:04:30.360 10:51:28 -- setup/common.sh@19 -- # local var val 00:04:30.360 10:51:28 -- setup/common.sh@20 -- # local mem_f mem 00:04:30.360 10:51:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.360 10:51:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.360 10:51:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.360 10:51:28 -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.360 10:51:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.360 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.360 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.360 10:51:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41257548 kB' 'MemAvailable: 44982788 kB' 'Buffers: 9316 kB' 'Cached: 12817920 kB' 'SwapCached: 0 kB' 'Active: 9726148 kB' 'Inactive: 3688944 kB' 'Active(anon): 9309264 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591144 kB' 'Mapped: 153644 kB' 'Shmem: 8721408 kB' 'KReclaimable: 232556 kB' 'Slab: 900492 kB' 'SReclaimable: 232556 kB' 'SUnreclaim: 667936 kB' 'KernelStack: 21840 kB' 'PageTables: 7896 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10599780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:30.360
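With anon, surp and resv in hand, the verification that follows is plain arithmetic: the HugePages_Total the kernel reports must equal the requested page count plus any surplus and reserved pages. A hedged sketch of that check (1024 is this run's target, not a constant of the script):

    #!/usr/bin/env bash
    # Check the kernel hugepage pool against the requested size.
    expected=1024
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    if (( total == expected + surp + resv )); then
        echo "hugepage pool OK: ${total} pages"
    else
        echo "mismatch: total=${total} surp=${surp} resv=${resv}" >&2
    fi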
[xtrace condensed: setup/common.sh@31-32 scan the /proc/meminfo keys (MemTotal through HugePages_Free) and continue past each until HugePages_Rsvd matches] 10:51:28 -- setup/common.sh@33 -- # echo 0 00:04:30.361 10:51:28 -- setup/common.sh@33 -- # return 0 00:04:30.361 10:51:28 -- setup/hugepages.sh@100 -- # resv=0 00:04:30.361 10:51:28 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:30.361 nr_hugepages=1024 00:04:30.361 10:51:28 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:30.361 resv_hugepages=0 00:04:30.361 10:51:28 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:30.361 surplus_hugepages=0 00:04:30.361 10:51:28 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:30.361 anon_hugepages=0 00:04:30.361 10:51:28 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:30.361 10:51:28 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:30.361 10:51:28 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:30.361 10:51:28 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:30.361 10:51:28 -- setup/common.sh@18 -- # local node= 00:04:30.361 10:51:28 -- setup/common.sh@19 -- # local var val 00:04:30.361 10:51:28 -- setup/common.sh@20 -- # local mem_f mem 00:04:30.361 10:51:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.361 10:51:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.361 10:51:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.361 10:51:28 -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.361 10:51:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.361 10:51:28 -- setup/common.sh@16 -- #
printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41257684 kB' 'MemAvailable: 44982924 kB' 'Buffers: 9316 kB' 'Cached: 12817952 kB' 'SwapCached: 0 kB' 'Active: 9724832 kB' 'Inactive: 3688944 kB' 'Active(anon): 9307948 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589696 kB' 'Mapped: 153644 kB' 'Shmem: 8721440 kB' 'KReclaimable: 232556 kB' 'Slab: 900496 kB' 'SReclaimable: 232556 kB' 'SUnreclaim: 667940 kB' 'KernelStack: 21712 kB' 'PageTables: 7440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10587900 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.361 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.361 10:51:28 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:30.361 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.361 [xtrace condensed: the scan continues over the remaining keys (Active(anon) through FilePmdMapped), continuing past each] 10:51:28 -- setup/common.sh@32 -- # [[ CmaTotal ==
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.362 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.362 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.362 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.362 10:51:28 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.362 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.362 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.362 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.363 10:51:28 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.363 10:51:28 -- setup/common.sh@32 -- # continue 00:04:30.363 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.363 10:51:28 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.363 10:51:28 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.363 10:51:28 -- setup/common.sh@33 -- # echo 1024 00:04:30.363 10:51:28 -- setup/common.sh@33 -- # return 0 00:04:30.363 10:51:28 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:30.363 10:51:28 -- setup/hugepages.sh@112 -- # get_nodes 00:04:30.363 10:51:28 -- setup/hugepages.sh@27 -- # local node 00:04:30.363 10:51:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:30.363 10:51:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:30.363 10:51:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:30.363 10:51:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:30.363 10:51:28 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:30.363 10:51:28 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:30.363 10:51:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:30.363 10:51:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:30.363 10:51:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:30.363 10:51:28 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.363 10:51:28 -- setup/common.sh@18 -- # local node=0 00:04:30.363 10:51:28 -- setup/common.sh@19 -- # local var val 00:04:30.363 10:51:28 -- setup/common.sh@20 -- # local mem_f mem 00:04:30.363 10:51:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.363 10:51:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:30.363 10:51:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:30.363 10:51:28 -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.363 10:51:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.363 10:51:28 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.363 10:51:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26435008 kB' 'MemUsed: 6150360 kB' 'SwapCached: 0 kB' 'Active: 3409456 kB' 'Inactive: 184604 kB' 'Active(anon): 3238552 kB' 'Inactive(anon): 0 kB' 'Active(file): 170904 kB' 'Inactive(file): 184604 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3318464 kB' 'Mapped: 62088 kB' 'AnonPages: 278740 kB' 'Shmem: 2962956 kB' 'KernelStack: 11832 kB' 'PageTables: 3748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121128 kB' 'Slab: 436688 kB' 'SReclaimable: 121128 kB' 'SUnreclaim: 315560 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 
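The check traced at setup/hugepages.sh@110 above is the hugetlb accounting identity: the kernel's current HugePages_Total must equal the page count the test requested plus any surplus and reserved pages. A minimal standalone sketch of that check (illustrative, not the verbatim hugepages.sh source; meminfo_field is a hypothetical helper):

#!/usr/bin/env bash
# Hedged sketch of the HugePages accounting check seen in the trace above.
nr_hugepages=1024   # the value this test run requested earlier
meminfo_field() { awk -v f="$1" '$1 == f":" {print $2}' /proc/meminfo; }
total=$(meminfo_field HugePages_Total)
surp=$(meminfo_field HugePages_Surp)
resv=$(meminfo_field HugePages_Rsvd)
# The trace evaluated (( 1024 == nr_hugepages + surp + resv )) and passed.
(( total == nr_hugepages + surp + resv )) && echo "hugepage accounting consistent"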
00:04:30.363 10:51:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:30.363 10:51:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:30.363 10:51:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:30.363 10:51:28 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:30.363 10:51:28 -- setup/common.sh@18 -- # local node=0
00:04:30.363 10:51:28 -- setup/common.sh@19 -- # local var val
00:04:30.363 10:51:28 -- setup/common.sh@20 -- # local mem_f mem
00:04:30.363 10:51:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:30.363 10:51:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:30.363 10:51:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:30.363 10:51:28 -- setup/common.sh@28 -- # mapfile -t mem
00:04:30.363 10:51:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:30.363 10:51:28 -- setup/common.sh@31 -- # IFS=': '
00:04:30.363 10:51:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26435008 kB' 'MemUsed: 6150360 kB' 'SwapCached: 0 kB' 'Active: 3409456 kB' 'Inactive: 184604 kB' 'Active(anon): 3238552 kB' 'Inactive(anon): 0 kB' 'Active(file): 170904 kB' 'Inactive(file): 184604 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3318464 kB' 'Mapped: 62088 kB' 'AnonPages: 278740 kB' 'Shmem: 2962956 kB' 'KernelStack: 11832 kB' 'PageTables: 3748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121128 kB' 'Slab: 436688 kB' 'SReclaimable: 121128 kB' 'SUnreclaim: 315560 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:30.363 10:51:28 -- setup/common.sh@31 -- # read -r var val _
[xtrace loop condensed: setup/common.sh@31-32 scanned every node0 meminfo field above, MemTotal through HugePages_Free, against HugePages_Surp; all non-matching fields hit 'continue']
00:04:30.364 10:51:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:30.364 10:51:28 -- setup/common.sh@33 -- # echo 0
00:04:30.364 10:51:28 -- setup/common.sh@33 -- # return 0
00:04:30.364 10:51:28 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
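The scans condensed above all follow the same shape: get_meminfo loads the relevant meminfo file into an array, then walks it line by line until the requested field matches. A runnable sketch reconstructed from the trace (illustrative, not the verbatim setup/common.sh; the function name is mine):

#!/usr/bin/env bash
# get_meminfo_sketch FIELD -- print the value of FIELD from /proc/meminfo.
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching field is skipped -- these are the long
        # 'continue' runs visible in the xtrace above.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < /proc/meminfo
    return 1
}

get_meminfo_sketch HugePages_Surp   # printed 0 for both nodes in this run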
00:04:30.364 10:51:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:30.364 10:51:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:30.364 10:51:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:30.364 10:51:28 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:30.364 10:51:28 -- setup/common.sh@18 -- # local node=1
00:04:30.364 10:51:28 -- setup/common.sh@19 -- # local var val
00:04:30.364 10:51:28 -- setup/common.sh@20 -- # local mem_f mem
00:04:30.364 10:51:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:30.364 10:51:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:30.364 10:51:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:30.364 10:51:28 -- setup/common.sh@28 -- # mapfile -t mem
00:04:30.364 10:51:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:30.364 10:51:28 -- setup/common.sh@31 -- # IFS=': '
00:04:30.364 10:51:28 -- setup/common.sh@31 -- # read -r var val _
00:04:30.364 10:51:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698408 kB' 'MemFree: 14823692 kB' 'MemUsed: 12874716 kB' 'SwapCached: 0 kB' 'Active: 6315652 kB' 'Inactive: 3504340 kB' 'Active(anon): 6069672 kB' 'Inactive(anon): 0 kB' 'Active(file): 245980 kB' 'Inactive(file): 3504340 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9508828 kB' 'Mapped: 91556 kB' 'AnonPages: 311216 kB' 'Shmem: 5758508 kB' 'KernelStack: 9880 kB' 'PageTables: 3692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111428 kB' 'Slab: 463808 kB' 'SReclaimable: 111428 kB' 'SUnreclaim: 352380 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace loop condensed: setup/common.sh@31-32 scanned every node1 meminfo field above, MemTotal through HugePages_Free, against HugePages_Surp; all non-matching fields hit 'continue']
00:04:30.365 10:51:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:30.365 10:51:28 -- setup/common.sh@33 -- # echo 0
00:04:30.365 10:51:28 -- setup/common.sh@33 -- # return 0
00:04:30.365 10:51:28 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:30.365 10:51:28 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:30.365 10:51:28 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:30.365 10:51:28 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:30.365 10:51:28 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:30.365 node0=512 expecting 512
00:04:30.365 10:51:28 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:30.365 10:51:28 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:30.365 10:51:28 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:30.365 10:51:28 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:30.365 node1=512 expecting 512
00:04:30.365 10:51:28 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:30.365
00:04:30.365 real	0m3.674s
00:04:30.365 user	0m1.368s
00:04:30.365 sys	0m2.360s
00:04:30.365 10:51:28 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:30.365 10:51:28 -- common/autotest_common.sh@10 -- # set +x
00:04:30.365 ************************************
00:04:30.365 END TEST per_node_1G_alloc
00:04:30.365 ************************************
00:04:30.365 10:51:28 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:30.365 10:51:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:30.365 10:51:28 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:30.365 10:51:28 -- common/autotest_common.sh@10 -- # set +x
00:04:30.365 ************************************
00:04:30.365 START TEST even_2G_alloc
00:04:30.365 ************************************
00:04:30.365 10:51:28 -- common/autotest_common.sh@1114 -- # even_2G_alloc
00:04:30.365 10:51:28 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:30.365 10:51:28 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:30.365 10:51:28 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:30.365 10:51:28 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
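The get_test_nr_hugepages trace above reduces to simple arithmetic: the requested size 2097152 appears to be in kB (2 GiB, matching the 'Hugetlb: 2097152 kB' value printed later), so dividing by the 2048 kB default hugepage size yields nr_hugepages=1024, and the even per-node split assigns 512 pages to each of the two NUMA nodes. A hedged sketch of that calculation:

#!/usr/bin/env bash
# Sketch of the size -> page-count arithmetic seen in the trace above.
size=2097152              # requested pool size in kB (2 GiB), per the trace
default_hugepages=2048    # default hugepage size in kB on this x86_64 box
no_nodes=2                # NUMA nodes discovered by get_nodes
nr_hugepages=$(( size / default_hugepages ))
echo "nr_hugepages=$nr_hugepages"               # -> 1024
echo "per-node=$(( nr_hugepages / no_nodes ))"  # -> 512 (HUGE_EVEN_ALLOC=yes)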
00:04:30.365 10:51:28 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:30.365 10:51:28 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:30.365 10:51:28 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:30.365 10:51:28 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:30.365 10:51:28 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:30.365 10:51:28 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:30.365 10:51:28 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:30.365 10:51:28 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:30.365 10:51:28 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:30.365 10:51:28 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:30.365 10:51:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:30.365 10:51:28 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:30.365 10:51:28 -- setup/hugepages.sh@83 -- # : 512
00:04:30.365 10:51:28 -- setup/hugepages.sh@84 -- # : 1
00:04:30.365 10:51:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:30.365 10:51:28 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:30.365 10:51:28 -- setup/hugepages.sh@83 -- # : 0
00:04:30.365 10:51:28 -- setup/hugepages.sh@84 -- # : 0
00:04:30.365 10:51:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:30.365 10:51:28 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:30.365 10:51:28 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:30.365 10:51:28 -- setup/hugepages.sh@153 -- # setup output
00:04:30.365 10:51:28 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:30.365 10:51:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:33.650 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:33.650 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:33.650 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:33.650 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:33.650 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:33.650 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:33.650 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:33.650 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:33.912 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:33.912 10:51:32 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:33.912 10:51:32 -- setup/hugepages.sh@89 -- # local node
00:04:33.912 10:51:32 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:33.912 10:51:32 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:33.912 10:51:32 -- setup/hugepages.sh@92 -- # local surp
00:04:33.912 10:51:32 -- setup/hugepages.sh@93 -- # local resv
00:04:33.912 10:51:32 -- setup/hugepages.sh@94 -- # local anon
00:04:33.912 10:51:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
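The @96 test above inspects the kernel's transparent-hugepage mode: verify_nr_hugepages only samples an AnonHugePages baseline when THP is not forced off, i.e. when the bracketed sysfs selection is not "[never]". A hedged sketch of that gate (illustrative, not the verbatim hugepages.sh):

#!/usr/bin/env bash
# Sketch of the THP gate traced at setup/hugepages.sh@96.
thp_mode=$(< /sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
if [[ $thp_mode != *"[never]"* ]]; then
    # THP can inflate AnonHugePages, so record a baseline before checking.
    echo "THP enabled ($thp_mode); sampling AnonHugePages baseline"
fi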
00:04:33.912 10:51:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:33.912 10:51:32 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:33.912 10:51:32 -- setup/common.sh@18 -- # local node=
00:04:33.912 10:51:32 -- setup/common.sh@19 -- # local var val
00:04:33.912 10:51:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:33.912 10:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:33.912 10:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:33.912 10:51:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:33.912 10:51:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:33.912 10:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:33.912 10:51:32 -- setup/common.sh@31 -- # IFS=': '
00:04:33.912 10:51:32 -- setup/common.sh@31 -- # read -r var val _
00:04:33.912 10:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41270368 kB' 'MemAvailable: 44995608 kB' 'Buffers: 9316 kB' 'Cached: 12818048 kB' 'SwapCached: 0 kB' 'Active: 9727488 kB' 'Inactive: 3688944 kB' 'Active(anon): 9310604 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592092 kB' 'Mapped: 152552 kB' 'Shmem: 8721536 kB' 'KReclaimable: 232556 kB' 'Slab: 900104 kB' 'SReclaimable: 232556 kB' 'SUnreclaim: 667548 kB' 'KernelStack: 21808 kB' 'PageTables: 7732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10586112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[xtrace loop condensed: setup/common.sh@31-32 scanned every field above, MemTotal through HardwareCorrupted, against AnonHugePages; all non-matching fields hit 'continue']
00:04:33.913 10:51:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:33.913 10:51:32 -- setup/common.sh@33 -- # echo 0
00:04:33.913 10:51:32 -- setup/common.sh@33 -- # return 0
00:04:33.913 10:51:32 -- setup/hugepages.sh@97 -- # anon=0
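Note how the same get_meminfo helper serves both the per-node reads earlier (where /sys/devices/system/node/nodeN/meminfo exists and every line carries a "Node N " prefix) and the system-wide read here (where node is empty, the node/node/meminfo existence test fails, and the read falls through to /proc/meminfo). A hedged sketch of that source selection, mirroring the traced @22-@29 steps:

#!/usr/bin/env bash
# Sketch of get_meminfo's file selection and prefix stripping (illustrative).
shopt -s extglob                     # needed for the +([0-9]) pattern below
node=${1:-}                          # empty -> system-wide read
mem_f=/proc/meminfo
if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
fi
mapfile -t mem < "$mem_f"
mem=("${mem[@]#Node +([0-9]) }")     # "Node 0 MemTotal: ..." -> "MemTotal: ..."
printf '%s\n' "${mem[@]:0:3}"        # show the first few normalized lines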
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 10:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41270668 kB' 'MemAvailable: 44995904 kB' 'Buffers: 9316 kB' 'Cached: 12818052 kB' 'SwapCached: 0 kB' 'Active: 9727192 kB' 'Inactive: 3688944 kB' 'Active(anon): 9310308 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591884 kB' 'Mapped: 152520 kB' 'Shmem: 8721540 kB' 'KReclaimable: 232548 kB' 'Slab: 900080 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667532 kB' 'KernelStack: 21712 kB' 'PageTables: 7452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10584976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # continue 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # continue 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # continue 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # continue 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # continue 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # continue 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # continue 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:33.914 10:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:33.914 10:51:32 -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
[xtrace elided: setup/common.sh@31-32 scans each remaining /proc/meminfo key (Active(anon) through HugePages_Rsvd) against HugePages_Surp; none match, every iteration hits continue]
00:04:33.915 10:51:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:33.915 10:51:32 -- setup/common.sh@33 -- # echo 0
00:04:33.915 10:51:32 -- setup/common.sh@33 -- # return 0
00:04:33.915 10:51:32 -- setup/hugepages.sh@99 -- # surp=0
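The query just traced, and the three that follow, all use the same helper pattern: setup/common.sh walks the meminfo file one 'Key: value' line at a time (IFS=': ' read -r var val _) and echoes the value of the one key that matches the requested name. A minimal, self-contained sketch of that pattern, hypothetical rather than the SPDK helper itself:

# Hypothetical sketch of the get_meminfo pattern traced in this log:
# split each "Key: value" line of /proc/meminfo on ':' plus whitespace
# and print the value of the requested key.
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < /proc/meminfo
}
get_meminfo_sketch HugePages_Surp   # prints 0 on this run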
00:04:33.915 10:51:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:33.915 10:51:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:33.915 10:51:32 -- setup/common.sh@18 -- # local node=
00:04:33.915 10:51:32 -- setup/common.sh@19 -- # local var val
00:04:33.915 10:51:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:33.915 10:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:33.915 10:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:33.915 10:51:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:33.915 10:51:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:33.915 10:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:33.915 10:51:32 -- setup/common.sh@31 -- # IFS=': '
00:04:33.915 10:51:32 -- setup/common.sh@31 -- # read -r var val _
00:04:33.915 10:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41271108 kB' 'MemAvailable: 44996344 kB' 'Buffers: 9316 kB' 'Cached: 12818056 kB' 'SwapCached: 0 kB' 'Active: 9726324 kB' 'Inactive: 3688944 kB' 'Active(anon): 9309440 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591468 kB' 'Mapped: 152444 kB' 'Shmem: 8721544 kB' 'KReclaimable: 232548 kB' 'Slab: 900084 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667536 kB' 'KernelStack: 21824 kB' 'PageTables: 7400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10584624 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[xtrace elided: per-key scan of the snapshot against HugePages_Rsvd; MemTotal through HugePages_Free all miss and hit continue]
00:04:33.916 10:51:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:33.916 10:51:32 -- setup/common.sh@33 -- # echo 0
00:04:33.916 10:51:32 -- setup/common.sh@33 -- # return 0
00:04:33.916 10:51:32 -- setup/hugepages.sh@100 -- # resv=0
00:04:33.916 10:51:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:33.916 nr_hugepages=1024
00:04:33.916 10:51:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:33.916 resv_hugepages=0
00:04:33.916 10:51:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:33.916 surplus_hugepages=0
00:04:33.916 10:51:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:33.916 anon_hugepages=0
00:04:33.916 10:51:32 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:33.916 10:51:32 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
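The two arithmetic guards just traced encode the test's accounting invariant: the page count the test requested must match what the kernel reports, once surplus and reserved pages are folded in (here 1024 == 1024 + 0 + 0). A hedged stand-alone rendering of that check; the awk call and variable names are illustrative, not the test's own code:

# Sketch of the accounting invariant: the kernel's HugePages_Total must
# equal the requested count plus surplus plus reserved pages.
nr_hugepages=1024 surp=0 resv=0
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
(( total == nr_hugepages + surp + resv )) && echo "hugepage accounting consistent"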
00:04:33.916 10:51:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:33.916 10:51:32 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:33.916 10:51:32 -- setup/common.sh@18 -- # local node=
00:04:33.916 10:51:32 -- setup/common.sh@19 -- # local var val
00:04:33.917 10:51:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:33.917 10:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:33.917 10:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:33.917 10:51:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:33.917 10:51:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:33.917 10:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:33.917 10:51:32 -- setup/common.sh@31 -- # IFS=': '
00:04:33.917 10:51:32 -- setup/common.sh@31 -- # read -r var val _
00:04:33.917 10:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41272248 kB' 'MemAvailable: 44997484 kB' 'Buffers: 9316 kB' 'Cached: 12818076 kB' 'SwapCached: 0 kB' 'Active: 9726720 kB' 'Inactive: 3688944 kB' 'Active(anon): 9309836 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591804 kB' 'Mapped: 152444 kB' 'Shmem: 8721564 kB' 'KReclaimable: 232548 kB' 'Slab: 900084 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667536 kB' 'KernelStack: 21872 kB' 'PageTables: 7832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10586152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[xtrace elided: per-key scan of the snapshot against HugePages_Total; MemTotal through Unaccepted all miss and hit continue]
00:04:33.918 10:51:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:33.918 10:51:32 -- setup/common.sh@33 -- # echo 1024
00:04:33.918 10:51:32 -- setup/common.sh@33 -- # return 0
00:04:33.918 10:51:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
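get_nodes, traced next, discovers the NUMA layout by globbing sysfs (note the extglob pattern node+([0-9]) in the trace) and records each node's current hugepage count. A stand-alone sketch under the same assumptions; the real helper fills nodes_sys via get_meminfo, and awk here is only for brevity:

# Sketch of the get_nodes step: enumerate NUMA nodes from sysfs and
# record each node's HugePages_Total in an associative array.
shopt -s extglob nullglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    nodes_sys[${node##*node}]=$(awk '/HugePages_Total/ {print $NF}' "$node/meminfo")
done
echo "no_nodes=${#nodes_sys[@]}"   # 2 on this rig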
00:04:33.918 10:51:32 -- setup/hugepages.sh@112 -- # get_nodes
00:04:33.918 10:51:32 -- setup/hugepages.sh@27 -- # local node
00:04:33.918 10:51:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:33.918 10:51:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:33.918 10:51:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:33.918 10:51:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:33.918 10:51:32 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:33.918 10:51:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:33.918 10:51:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:34.178 10:51:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:34.178 10:51:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:34.178 10:51:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:34.178 10:51:32 -- setup/common.sh@18 -- # local node=0
00:04:34.178 10:51:32 -- setup/common.sh@19 -- # local var val
00:04:34.178 10:51:32 -- setup/common.sh@20 -- # local mem_f mem
00:04:34.178 10:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:34.178 10:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:34.178 10:51:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:34.178 10:51:32 -- setup/common.sh@28 -- # mapfile -t mem
00:04:34.178 10:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:34.178 10:51:32 -- setup/common.sh@31 -- # IFS=': '
00:04:34.178 10:51:32 -- setup/common.sh@31 -- # read -r var val _
00:04:34.178 10:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26437284 kB' 'MemUsed: 6148084 kB' 'SwapCached: 0 kB' 'Active: 3412280 kB' 'Inactive: 184604 kB' 'Active(anon): 3241376 kB' 'Inactive(anon): 0 kB' 'Active(file): 170904 kB' 'Inactive(file): 184604 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3318524 kB' 'Mapped: 61476 kB' 'AnonPages: 281696 kB' 'Shmem: 2963016 kB' 'KernelStack: 11848 kB' 'PageTables: 3788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121120 kB' 'Slab: 436300 kB' 'SReclaimable: 121120 kB' 'SUnreclaim: 315180 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: node0 per-key scan against HugePages_Surp; MemTotal through HugePages_Free all miss and hit continue]
00:04:34.180 10:51:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.180 10:51:32 -- setup/common.sh@33 -- # echo 0
00:04:34.180 10:51:32 -- setup/common.sh@33 -- # return 0
00:04:34.180 10:51:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:34.180 10:51:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:34.180 10:51:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
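The node-scoped call just traced, and the node1 call that follows, differ from the global query only in their source file: mem_f switches to /sys/devices/system/node/nodeN/meminfo, whose lines carry a "Node N " prefix that is stripped before the same key scan. A hypothetical equivalent of that variant, not the SPDK helper itself:

# Hypothetical per-node get_meminfo: read the node's sysfs meminfo,
# drop the "Node N " prefix each line carries (same extglob strip as
# mem=("${mem[@]#Node +([0-9]) }") in the trace), then match the key.
shopt -s extglob
node_meminfo() {
    local node=$1 get=$2 line var val _
    while IFS= read -r line; do
        line=${line#Node +([0-9]) }
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "/sys/devices/system/node/node${node}/meminfo"
}
node_meminfo 1 HugePages_Surp   # prints 0 on this run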
kB' 'KReclaimable: 111428 kB' 'Slab: 463788 kB' 'SReclaimable: 111428 kB' 'SUnreclaim: 352360 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: setup/common.sh@32 compares each key of the per-node snapshot above (MemTotal, MemFree, MemUsed, ..., HugePages_Free) against HugePages_Surp and skips every non-match with 'continue' ...]
00:04:34.181 10:51:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.181 10:51:32 -- setup/common.sh@33 -- # echo 0
00:04:34.181 10:51:32 -- setup/common.sh@33 -- # return 0
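The blocks elided above are the bash xtrace of setup/common.sh's get_meminfo helper, which is at heart a field scan of /proc/meminfo. A minimal standalone sketch of the same idea, with the helper name and structure assumed from the trace rather than copied from the script:

    # Sketch: look up one key in /proc/meminfo the way the traced loop does.
    # IFS=': ' splits "HugePages_Surp:        0" into var=HugePages_Surp, val=0;
    # every non-matching key is skipped with 'continue', exactly the lines
    # that dominate the trace.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done </proc/meminfo
        return 1
    }
    # Usage: get_meminfo_sketch HugePages_Surp   # prints 0 on this runner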
00:04:34.181 10:51:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:34.181 10:51:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:34.181 10:51:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:34.181 10:51:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:34.181 10:51:32 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:34.181 node0=512 expecting 512
00:04:34.181 10:51:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:34.181 10:51:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:34.181 10:51:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:34.181 10:51:32 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:34.181 node1=512 expecting 512
00:04:34.181 10:51:32 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:34.181
00:04:34.181 real    0m3.651s
00:04:34.181 user    0m1.399s
00:04:34.181 sys     0m2.323s
00:04:34.181 10:51:32 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:34.181 10:51:32 -- common/autotest_common.sh@10 -- # set +x
00:04:34.181 ************************************
00:04:34.181 END TEST even_2G_alloc
00:04:34.181 ************************************
00:04:34.181 10:51:32 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:34.181 10:51:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:34.181 10:51:32 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:34.181 10:51:32 -- common/autotest_common.sh@10 -- # set +x
00:04:34.181 ************************************
00:04:34.181 START TEST odd_alloc
00:04:34.181 ************************************
00:04:34.181 10:51:32 -- common/autotest_common.sh@1114 -- # odd_alloc
00:04:34.181 10:51:32 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:34.181 10:51:32 -- setup/hugepages.sh@49 -- # local size=2098176
00:04:34.181 10:51:32 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:34.181 10:51:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:34.181 10:51:32 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:04:34.181 10:51:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:34.181 10:51:32 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:34.181 10:51:32 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:34.181 10:51:32 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:34.181 10:51:32 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:34.181 10:51:32 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:34.181 10:51:32 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:34.181 10:51:32 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:34.181 10:51:32 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:34.182 10:51:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:34.182 10:51:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:34.182 10:51:32 -- setup/hugepages.sh@83 -- # : 513
00:04:34.182 10:51:32 -- setup/hugepages.sh@84 -- # : 1
00:04:34.182 10:51:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:34.182 10:51:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:34.182 10:51:32 -- setup/hugepages.sh@83 -- # : 0
00:04:34.182 10:51:32 -- setup/hugepages.sh@84 -- # : 0
00:04:34.182 10:51:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:34.182 10:51:32 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:34.182 10:51:32 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
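Two computations show up in the trace above: get_test_nr_hugepages converts the requested 2098176 kB into 2048 kB pages, rounding the half page up to nr_hugepages=1025, and get_test_nr_hugepages_per_node spreads that odd total over the two NUMA nodes as node1=512 and node0=513. A hedged sketch of the arithmetic, with a hypothetical helper name since the real logic is visible here only through xtrace:

    # Sketch: split an odd hugepage total across NUMA nodes; reproduces the
    # observed assignment (node1=512 first, then node0=513, for 1025 pages
    # on 2 nodes). Loop order and helper name are assumptions.
    split_hugepages_sketch() {
        local total=$1 nodes=$2 i n
        for ((i = nodes - 1; i >= 0; i--)); do
            n=$((total / nodes))
            ((i < total % nodes)) && n=$((n + 1))  # one node absorbs the odd page
            echo "node$i=$n"
        done
    }
    # split_hugepages_sketch 1025 2   ->  node1=512, node0=513
    # Page count: (2098176 kB + 2048 kB - 1) / 2048 kB = 1025 pages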
00:04:34.182 10:51:32 -- setup/hugepages.sh@160 -- # setup output
00:04:34.182 10:51:32 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:34.182 10:51:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:37.467 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:37.467 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:37.467 10:51:35 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:37.467 10:51:35 -- setup/hugepages.sh@89 -- # local node
00:04:37.467 10:51:35 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:37.467 10:51:35 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:37.467 10:51:35 -- setup/hugepages.sh@92 -- # local surp
00:04:37.467 10:51:35 -- setup/hugepages.sh@93 -- # local resv
00:04:37.467 10:51:35 -- setup/hugepages.sh@94 -- # local anon
00:04:37.468 10:51:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:37.468 10:51:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:37.468 10:51:35 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:37.468 10:51:35 -- setup/common.sh@18 -- # local node=
00:04:37.468 10:51:35 -- setup/common.sh@19 -- # local var val
00:04:37.468 10:51:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:37.468 10:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:37.468 10:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:37.468 10:51:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:37.468 10:51:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:37.468 10:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:37.468 10:51:35 -- setup/common.sh@31 -- # IFS=': '
00:04:37.468 10:51:35 -- setup/common.sh@31 -- # read -r var val _
00:04:37.468 10:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41284956 kB' 'MemAvailable: 45010192 kB' 'Buffers: 9316 kB' 'Cached: 12818176 kB' 'SwapCached: 0 kB' 'Active: 9728024 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311140 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592816 kB' 'Mapped: 152488 kB' 'Shmem: 8721664 kB' 'KReclaimable: 232548 kB' 'Slab: 899484 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 666936 kB' 'KernelStack: 21856 kB' 'PageTables: 7644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480892 kB' 'Committed_AS: 10586764 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[... xtrace elided: each key of the snapshot above is compared against AnonHugePages and skipped with 'continue' until AnonHugePages itself matches ...]
00:04:37.469 10:51:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:37.469 10:51:35 -- setup/common.sh@33 -- # echo 0
00:04:37.469 10:51:35 -- setup/common.sh@33 -- # return 0
00:04:37.469 10:51:35 -- setup/hugepages.sh@97 -- # anon=0
00:04:37.469 10:51:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:37.469 10:51:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:37.469 10:51:35 -- setup/common.sh@18 -- # local node=
00:04:37.469 10:51:35 -- setup/common.sh@19 -- # local var val
00:04:37.469 10:51:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:37.469 10:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:37.469 10:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:37.469 10:51:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:37.469 10:51:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:37.469 10:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:37.469 10:51:35 -- setup/common.sh@31 -- # IFS=': '
00:04:37.469 10:51:35 -- setup/common.sh@31 -- # read -r var val _
00:04:37.469 10:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41283496 kB' 'MemAvailable: 45008732 kB' 'Buffers: 9316 kB' 'Cached: 12818180 kB' 'SwapCached: 0 kB' 'Active: 9728032 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311148 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592856 kB' 'Mapped: 152456 kB' 'Shmem: 8721668 kB' 'KReclaimable: 232548 kB' 'Slab: 899484 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 666936 kB' 'KernelStack: 21920 kB' 'PageTables: 8112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480892 kB' 'Committed_AS: 10588264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214480 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
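Worth noting in the get_meminfo preambles above: the helper can read either the global /proc/meminfo or a per-node file, and the mem=("${mem[@]#Node +([0-9]) }") step strips the "Node N " prefix that per-node meminfo lines carry. Here node= is empty, so the existence check at common.sh@23 fails and the global file is used. A sketch of that source selection, inferred from the traced lines rather than taken from the script:

    # Sketch: pick the meminfo source the way common.sh@22-29 appears to.
    # Per-node files live at /sys/devices/system/node/node$N/meminfo and
    # prefix every line with "Node N ", hence the +([0-9]) strip (extglob).
    pick_meminfo_sketch() {
        local node=$1 mem_f=/proc/meminfo
        local -a mem
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        shopt -s extglob                    # required for the +([0-9]) pattern
        mapfile -t mem <"$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")    # no-op when reading the global file
        printf '%s\n' "${mem[@]}"
    }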
[... xtrace elided: the HugePages_Surp scan walks the snapshot keys again (MemTotal through HugePages_Rsvd), skipping each non-match with 'continue' ...]
00:04:37.470 10:51:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:37.470 10:51:35 -- setup/common.sh@33 -- # echo 0
00:04:37.470 10:51:35 -- setup/common.sh@33 -- # return 0
00:04:37.470 10:51:35 -- setup/hugepages.sh@99 -- # surp=0
00:04:37.470 10:51:35 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:37.470 10:51:35 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:37.470 10:51:35 -- setup/common.sh@18 -- # local node=
00:04:37.470 10:51:35 -- setup/common.sh@19 -- # local var val
00:04:37.470 10:51:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:37.470 10:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:37.470 10:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:37.470 10:51:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:37.470 10:51:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:37.470 10:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:37.470 10:51:35 -- setup/common.sh@31 -- # IFS=': '
00:04:37.470 10:51:36 -- setup/common.sh@31 -- # read -r var val _
00:04:37.470 10:51:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41281828 kB' 'MemAvailable: 45007064 kB' 'Buffers: 9316 kB' 'Cached: 12818180 kB' 'SwapCached: 0 kB' 'Active: 9727592 kB' 'Inactive: 3688944 kB' 'Active(anon): 9310708 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592480 kB' 'Mapped: 152456 kB' 'Shmem: 8721668 kB' 'KReclaimable: 232548 kB' 'Slab: 899612 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667064 kB' 'KernelStack: 21872 kB' 'PageTables: 7652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480892 kB' 'Committed_AS: 10586792 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[... xtrace elided: the HugePages_Rsvd scan walks the same keys, 'continue' on every non-match ...]
00:04:37.471 10:51:36 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:37.471 10:51:36 -- setup/common.sh@33 -- # echo 0
00:04:37.471 10:51:36 -- setup/common.sh@33 -- # return 0
00:04:37.471 10:51:36 -- setup/hugepages.sh@100 -- # resv=0
00:04:37.471 10:51:36 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:37.471 nr_hugepages=1025
00:04:37.472 10:51:36 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:37.472 resv_hugepages=0
00:04:37.472 10:51:36 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:37.472 surplus_hugepages=0
00:04:37.472 10:51:36 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:37.472 anon_hugepages=0
00:04:37.472 10:51:36 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:37.472 10:51:36 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
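The checks at hugepages.sh@107-109 are the core of verify_nr_hugepages: the kernel's HugePages_Total must equal the requested count plus the surplus and reserved pages just read back (1025, 0, and 0 here). A restatement of that traced arithmetic as a sketch, reusing the get_meminfo_sketch helper sketched earlier (the FAIL messages are illustrative, not the script's output):

    # Sketch: the consistency check the trace just performed.
    nr_hugepages=1025 surp=0 resv=0 anon=0
    total=$(get_meminfo_sketch HugePages_Total)              # 1025 on this runner
    ((total == nr_hugepages + surp + resv)) || echo "FAIL: surplus/reserved pages drifted"
    ((total == nr_hugepages)) || echo "FAIL: kernel exposes a different hugepage count"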
00:04:37.472 10:51:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:37.472 10:51:36 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:37.472 10:51:36 -- setup/common.sh@18 -- # local node=
00:04:37.472 10:51:36 -- setup/common.sh@19 -- # local var val
00:04:37.472 10:51:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:37.472 10:51:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:37.472 10:51:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:37.472 10:51:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:37.472 10:51:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:37.472 10:51:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': '
00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _
00:04:37.472 10:51:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41281536 kB' 'MemAvailable: 45006772 kB' 'Buffers: 9316 kB' 'Cached: 12818204 kB' 'SwapCached: 0 kB' 'Active: 9727532 kB' 'Inactive: 3688944 kB' 'Active(anon): 9310648 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592768 kB' 'Mapped: 152456 kB' 'Shmem: 8721692 kB' 'KReclaimable: 232548 kB' 'Slab: 899704 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667156 kB' 'KernelStack: 21888 kB' 'PageTables: 8000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480892 kB' 'Committed_AS: 10585292 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[... xtrace continues: the HugePages_Total key scan begins over these keys ...]
val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.472 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.472 10:51:36 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 
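For readers following the xtrace above: the get_meminfo helper in setup/common.sh scans a meminfo file key by key until it reaches the requested field, which is why every meminfo key is tested once in the trace before HugePages_Total finally matches. A minimal standalone sketch of that pattern, built from the commands visible in the trace (mapfile, the "Node N " prefix strip, IFS=': ' parsing) rather than the script's actual source:

    #!/usr/bin/env bash
    shopt -s extglob                        # the +([0-9]) pattern below needs extglob
    get_meminfo() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        # Per-node queries read the node-local copy, as the trace does for node 0/1.
        [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")    # node files prefix each key with "Node N "
        local line var val _
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    get_meminfo HugePages_Total             # prints 1025 on the machine traced here
    get_meminfo HugePages_Surp 0            # node 0 surplus, queried further below
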
00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 
10:51:36 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.473 10:51:36 -- setup/common.sh@33 -- # echo 1025 00:04:37.473 10:51:36 -- setup/common.sh@33 -- # return 0 00:04:37.473 10:51:36 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:37.473 10:51:36 -- setup/hugepages.sh@112 -- # get_nodes 00:04:37.473 10:51:36 -- setup/hugepages.sh@27 -- # local node 00:04:37.473 10:51:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:37.473 10:51:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:37.473 10:51:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:37.473 10:51:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:37.473 10:51:36 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:37.473 10:51:36 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:37.473 10:51:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:37.473 10:51:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:37.473 10:51:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:37.473 10:51:36 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:37.473 10:51:36 -- setup/common.sh@18 -- # local node=0 00:04:37.473 10:51:36 -- setup/common.sh@19 -- # local var val 00:04:37.473 10:51:36 -- setup/common.sh@20 -- # local mem_f mem 00:04:37.473 10:51:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.473 10:51:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:37.473 10:51:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:37.473 10:51:36 -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.473 10:51:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.473 10:51:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26448160 kB' 'MemUsed: 6137208 kB' 'SwapCached: 0 kB' 'Active: 3413428 kB' 'Inactive: 184604 kB' 'Active(anon): 3242524 kB' 'Inactive(anon): 0 kB' 'Active(file): 170904 kB' 'Inactive(file): 
184604 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3318604 kB' 'Mapped: 61488 kB' 'AnonPages: 282632 kB' 'Shmem: 2963096 kB' 'KernelStack: 11816 kB' 'PageTables: 3696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121120 kB' 'Slab: 435920 kB' 'SReclaimable: 121120 kB' 'SUnreclaim: 314800 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.473 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.473 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.733 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.733 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 
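The per-node pass running here comes from get_nodes in setup/hugepages.sh: it globs the sysfs node directories, records each node's hugepage count (nodes_sys[0]=512, nodes_sys[1]=513 in this run), then re-queries HugePages_Surp for each node. A sketch under the assumption that the per-node counts are read back through the get_meminfo helper sketched earlier (the traced values do match each node's HugePages_Total):

    shopt -s extglob
    declare -A nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do   # same glob as the trace
        n=${node##*node}
        nodes_sys[$n]=$(get_meminfo HugePages_Total "$n")   # assumed source of 512/513
    done
    echo "no_nodes=${#nodes_sys[@]}"                        # 2 on this machine
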
00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@33 -- # echo 0 00:04:37.734 10:51:36 -- setup/common.sh@33 -- # return 0 00:04:37.734 10:51:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:37.734 10:51:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:37.734 10:51:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:37.734 10:51:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:37.734 10:51:36 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:37.734 10:51:36 -- setup/common.sh@18 -- # local node=1 00:04:37.734 10:51:36 -- setup/common.sh@19 -- # local var val 00:04:37.734 10:51:36 -- setup/common.sh@20 -- # local mem_f mem 00:04:37.734 10:51:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.734 10:51:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:37.734 10:51:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:37.734 10:51:36 -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.734 10:51:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698408 kB' 'MemFree: 14835048 kB' 'MemUsed: 12863360 kB' 'SwapCached: 0 kB' 'Active: 6313356 kB' 'Inactive: 3504340 kB' 'Active(anon): 6067376 kB' 'Inactive(anon): 0 kB' 'Active(file): 245980 kB' 'Inactive(file): 3504340 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9508932 kB' 'Mapped: 90968 kB' 'AnonPages: 308908 kB' 'Shmem: 5758612 kB' 'KernelStack: 9928 kB' 'PageTables: 3820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111428 kB' 'Slab: 463824 kB' 'SReclaimable: 111428 kB' 'SUnreclaim: 352396 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 
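The arithmetic being exercised here: surplus pages are counted inside HugePages_Total, so the verifier adds resv and each node's surplus before comparing against the expected per-node split. A standalone sketch with this run's values (variable names mirror the trace, but the snippet is illustrative, not the script itself, and reuses the get_meminfo sketch above):

    nr_hugepages=1025 surp=0 resv=0                     # odd_alloc's request
    (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv )) \
        || echo 'global hugepage count mismatch'
    declare -A nodes_test=([0]=512 [1]=513)
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))  # +0 here
    done
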
00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.734 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.734 10:51:36 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:37.734 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # continue 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.735 10:51:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.735 10:51:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.735 10:51:36 -- setup/common.sh@33 -- # echo 0 00:04:37.735 10:51:36 -- setup/common.sh@33 -- # return 0 00:04:37.735 10:51:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:37.735 10:51:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:37.735 10:51:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:37.735 10:51:36 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:37.735 node0=512 expecting 513 00:04:37.735 10:51:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:37.735 10:51:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:37.735 10:51:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:37.735 10:51:36 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:37.735 node1=513 expecting 512 00:04:37.735 10:51:36 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:37.735 00:04:37.735 real 0m3.511s 00:04:37.735 user 0m1.381s 00:04:37.735 sys 0m2.204s 00:04:37.735 10:51:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:37.735 10:51:36 -- common/autotest_common.sh@10 -- # set +x 00:04:37.735 ************************************ 00:04:37.735 END TEST odd_alloc 00:04:37.735 ************************************ 00:04:37.735 10:51:36 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:37.735 10:51:36 -- common/autotest_common.sh@1087 -- 
# '[' 2 -le 1 ']' 00:04:37.735 10:51:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:37.735 10:51:36 -- common/autotest_common.sh@10 -- # set +x 00:04:37.735 ************************************ 00:04:37.735 START TEST custom_alloc 00:04:37.735 ************************************ 00:04:37.735 10:51:36 -- common/autotest_common.sh@1114 -- # custom_alloc 00:04:37.735 10:51:36 -- setup/hugepages.sh@167 -- # local IFS=, 00:04:37.735 10:51:36 -- setup/hugepages.sh@169 -- # local node 00:04:37.735 10:51:36 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:37.735 10:51:36 -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:37.735 10:51:36 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:37.735 10:51:36 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:37.735 10:51:36 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:37.735 10:51:36 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:37.735 10:51:36 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:37.735 10:51:36 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:37.735 10:51:36 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:37.735 10:51:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:37.735 10:51:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:37.735 10:51:36 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:37.735 10:51:36 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:37.735 10:51:36 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:37.735 10:51:36 -- setup/hugepages.sh@83 -- # : 256 00:04:37.735 10:51:36 -- setup/hugepages.sh@84 -- # : 1 00:04:37.735 10:51:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:37.735 10:51:36 -- setup/hugepages.sh@83 -- # : 0 00:04:37.735 10:51:36 -- setup/hugepages.sh@84 -- # : 0 00:04:37.735 10:51:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:37.735 10:51:36 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:37.735 10:51:36 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:37.735 10:51:36 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:37.735 10:51:36 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:37.735 10:51:36 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:37.735 10:51:36 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:37.735 10:51:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:37.735 10:51:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:37.735 10:51:36 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:37.735 10:51:36 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:37.735 10:51:36 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:37.735 10:51:36 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:37.736 10:51:36 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 
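custom_alloc sizes two pools here: requests of 1048576 kB and 2097152 kB are each divided by the 2048 kB Hugepagesize reported in meminfo, yielding the 512- and 1024-page pools that get joined into the HUGENODE string printed just below. A sketch of that conversion and join (illustrative, matching the traced values, not the script's source):

    default_hugepages=2048                       # kB, the Hugepagesize in the trace
    for size in 1048576 2097152; do              # the two requests above
        (( size >= default_hugepages )) || continue
        echo "size=${size}kB -> $(( size / default_hugepages )) pages"   # 512, 1024
    done
    nodes_hp=([0]=512 [1]=1024)
    HUGENODE=()
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    (IFS=,; echo "${HUGENODE[*]}")               # nodes_hp[0]=512,nodes_hp[1]=1024
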
00:04:37.736 10:51:36 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:37.736 10:51:36 -- setup/hugepages.sh@78 -- # return 0 00:04:37.736 10:51:36 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:37.736 10:51:36 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:37.736 10:51:36 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:37.736 10:51:36 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:37.736 10:51:36 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:37.736 10:51:36 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:37.736 10:51:36 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:37.736 10:51:36 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:37.736 10:51:36 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:37.736 10:51:36 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:37.736 10:51:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:37.736 10:51:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:37.736 10:51:36 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:37.736 10:51:36 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:37.736 10:51:36 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:37.736 10:51:36 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:37.736 10:51:36 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:37.736 10:51:36 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:37.736 10:51:36 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:37.736 10:51:36 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:37.736 10:51:36 -- setup/hugepages.sh@78 -- # return 0 00:04:37.736 10:51:36 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:37.736 10:51:36 -- setup/hugepages.sh@187 -- # setup output 00:04:37.736 10:51:36 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:37.736 10:51:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:41.023 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:41.023 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:41.023 10:51:39 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:41.023 10:51:39 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:41.023 10:51:39 -- 
setup/hugepages.sh@89 -- # local node 00:04:41.023 10:51:39 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:41.023 10:51:39 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:41.023 10:51:39 -- setup/hugepages.sh@92 -- # local surp 00:04:41.023 10:51:39 -- setup/hugepages.sh@93 -- # local resv 00:04:41.023 10:51:39 -- setup/hugepages.sh@94 -- # local anon 00:04:41.023 10:51:39 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:41.023 10:51:39 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:41.023 10:51:39 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:41.023 10:51:39 -- setup/common.sh@18 -- # local node= 00:04:41.023 10:51:39 -- setup/common.sh@19 -- # local var val 00:04:41.023 10:51:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:41.023 10:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.023 10:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:41.023 10:51:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:41.023 10:51:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.023 10:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 40245228 kB' 'MemAvailable: 43970464 kB' 'Buffers: 9316 kB' 'Cached: 12818312 kB' 'SwapCached: 0 kB' 'Active: 9728240 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311356 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592884 kB' 'Mapped: 152548 kB' 'Shmem: 8721800 kB' 'KReclaimable: 232548 kB' 'Slab: 899848 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667300 kB' 'KernelStack: 21776 kB' 'PageTables: 7596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957628 kB' 'Committed_AS: 10583836 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 
-- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.023 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.023 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.024 10:51:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.024 10:51:39 -- setup/common.sh@33 -- # echo 0 00:04:41.024 10:51:39 -- setup/common.sh@33 -- # return 0 00:04:41.024 10:51:39 -- setup/hugepages.sh@97 -- # anon=0 00:04:41.024 10:51:39 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:41.024 10:51:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 
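Before counting hugepages, verify_nr_hugepages decides whether anonymous huge pages could skew the numbers: the "always [madvise] never" string tested at hugepages.sh@96 above is the transparent-hugepage policy, and AnonHugePages is only consulted when that policy is not [never]. A sketch, assuming the policy string comes from the usual sysfs location:

    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # assumed source of the string
    anon=0
    [[ $thp != *'[never]'* ]] && anon=$(get_meminfo AnonHugePages)
    echo "anon=${anon} kB"                                 # 0 kB in this run
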
00:04:41.024 10:51:39 -- setup/common.sh@18 -- # local node= 00:04:41.024 10:51:39 -- setup/common.sh@19 -- # local var val 00:04:41.024 10:51:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:41.024 10:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.024 10:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:41.024 10:51:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:41.024 10:51:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.024 10:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.024 10:51:39 -- setup/common.sh@31 -- # read -r var val _
00:04:41.024 10:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 40239072 kB' 'MemAvailable: 43964308 kB' 'Buffers: 9316 kB' 'Cached: 12818316 kB' 'SwapCached: 0 kB' 'Active: 9732724 kB' 'Inactive: 3688944 kB' 'Active(anon): 9315840 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597356 kB' 'Mapped: 152968 kB' 'Shmem: 8721804 kB' 'KReclaimable: 232548 kB' 'Slab: 899836 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667288 kB' 'KernelStack: 21760 kB' 'PageTables: 7508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957628 kB' 'Committed_AS: 10589008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[... identical compare-and-continue trace repeats for every field from MemTotal through HugePages_Rsvd, none matching \H\u\g\e\P\a\g\e\s\_\S\u\r\p ...]
00:04:41.025 10:51:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.025 10:51:39 -- setup/common.sh@33 -- # echo 0 00:04:41.025 10:51:39 -- setup/common.sh@33 -- # return 0
00:04:41.025 10:51:39 -- setup/hugepages.sh@99 -- # surp=0 00:04:41.025 10:51:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:41.025 10:51:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
[... same get_meminfo prologue as above (local node=, mem_f=/proc/meminfo, mapfile -t mem, "Node N " prefix strip) ...]
00:04:41.025 10:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 40239700 kB' 'MemAvailable: 43964936 kB' 'Buffers: 9316 kB' 'Cached: 12818328 kB' 'SwapCached: 0 kB' 'Active: 9728216 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311332 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592888 kB' 'Mapped: 153260 kB' 'Shmem: 8721816 kB' 'KReclaimable: 232548 kB' 'Slab: 899836 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667288 kB' 'KernelStack: 21760 kB' 'PageTables: 7524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957628 kB' 'Committed_AS: 10584128 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[... identical compare-and-continue trace repeats for every field until \H\u\g\e\P\a\g\e\s\_\R\s\v\d matches ...]
00:04:41.288 10:51:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.288 10:51:39 -- setup/common.sh@33 -- # echo 0 00:04:41.288 10:51:39 -- setup/common.sh@33 -- # return 0
00:04:41.288 10:51:39 -- setup/hugepages.sh@100 -- # resv=0 00:04:41.288 10:51:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:04:41.288 nr_hugepages=1536
00:04:41.288 10:51:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:41.288 resv_hugepages=0
00:04:41.288 10:51:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:41.288 surplus_hugepages=0
00:04:41.288 10:51:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:41.288 anon_hugepages=0
00:04:41.288 10:51:39 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:41.288 10:51:39 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:41.288 10:51:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:41.289 10:51:39 -- setup/common.sh@17 -- # local get=HugePages_Total
[... same get_meminfo prologue as above ...]
00:04:41.289 10:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 40236836 kB' 'MemAvailable: 43962072 kB' 'Buffers: 9316 kB' 'Cached: 12818340 kB' 'SwapCached: 0 kB' 'Active: 9733256 kB' 'Inactive: 3688944 kB' 'Active(anon): 9316372 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597880 kB' 'Mapped: 152968 kB' 'Shmem: 8721828 kB' 'KReclaimable: 232548 kB' 'Slab: 899836 kB' 'SReclaimable: 232548 kB' 'SUnreclaim: 667288 kB' 'KernelStack: 21760 kB' 'PageTables: 7508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957628 kB' 'Committed_AS: 10589036 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214308 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[... identical compare-and-continue trace repeats for every field until \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l matches ...]
00:04:41.290 10:51:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:41.290 10:51:39 -- setup/common.sh@33 -- # echo 1536
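At hugepages.sh@107-@110 the script verifies the kernel actually delivered the requested pages: HugePages_Total (1536 here) must equal the requested count plus surplus and reserved pages, each read back through get_meminfo. A standalone restatement of that check, using the variable names from the trace and the get_meminfo sketch above:

# Consistency check as traced from setup/hugepages.sh: pages the kernel
# reports allocated must equal requested + surplus + reserved.
nr_hugepages=1536                  # requested count (nr_hugepages=1536 above)
anon=$(get_meminfo AnonHugePages)  # 0 here; reported but not part of the sum
surp=$(get_meminfo HugePages_Surp) # 0
resv=$(get_meminfo HugePages_Rsvd) # 0

if (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv )); then
	echo "hugepage accounting consistent"
else
	echo "hugepage accounting mismatch" >&2
fi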
00:04:41.290 10:51:39 -- setup/common.sh@33 -- # return 0 00:04:41.290 10:51:39 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:41.290 10:51:39 -- setup/hugepages.sh@112 -- # get_nodes 00:04:41.290 10:51:39 -- setup/hugepages.sh@27 -- # local node 00:04:41.290 10:51:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:41.290 10:51:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:41.290 10:51:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:41.290 10:51:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:41.290 10:51:39 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:41.290 10:51:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:41.290 10:51:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:41.290 10:51:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:41.290 10:51:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:41.290 10:51:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:41.290 10:51:39 -- setup/common.sh@18 -- # local node=0 00:04:41.290 10:51:39 -- setup/common.sh@19 -- # local var val 00:04:41.290 10:51:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:41.290 10:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.290 10:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:41.290 10:51:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:41.290 10:51:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.290 10:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.290 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.290 10:51:39 -- setup/common.sh@31 -- # read -r var val _
00:04:41.290 10:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26454260 kB' 'MemUsed: 6131108 kB' 'SwapCached: 0 kB' 'Active: 3415284 kB' 'Inactive: 184604 kB' 'Active(anon): 3244380 kB' 'Inactive(anon): 0 kB' 'Active(file): 170904 kB' 'Inactive(file): 184604 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3318688 kB' 'Mapped: 61496 kB' 'AnonPages: 284408 kB' 'Shmem: 2963180 kB' 'KernelStack: 11816 kB' 'PageTables: 3648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121120 kB' 'Slab: 435852 kB' 'SReclaimable: 121120 kB' 'SUnreclaim: 314732 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... identical compare-and-continue trace repeats over the node0 meminfo fields until \H\u\g\e\P\a\g\e\s\_\S\u\r\p matches ...]
00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@33 -- # echo 0 00:04:41.291 10:51:39 -- setup/common.sh@33 -- # return 0
00:04:41.291 10:51:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:41.291 10:51:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:41.291 10:51:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:41.291 10:51:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:41.291 10:51:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:41.291 10:51:39 -- setup/common.sh@18 -- # local node=1 00:04:41.291 10:51:39 -- setup/common.sh@19 -- # local var val 00:04:41.291 10:51:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:41.291 10:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.291 10:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:41.291 10:51:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:41.291 10:51:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.291 10:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698408
kB' 'MemFree: 13789924 kB' 'MemUsed: 13908484 kB' 'SwapCached: 0 kB' 'Active: 6312248 kB' 'Inactive: 3504340 kB' 'Active(anon): 6066268 kB' 'Inactive(anon): 0 kB' 'Active(file): 245980 kB' 'Inactive(file): 3504340 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9508984 kB' 'Mapped: 90968 kB' 'AnonPages: 307768 kB' 'Shmem: 5758664 kB' 'KernelStack: 9928 kB' 'PageTables: 3808 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111428 kB' 'Slab: 463984 kB' 'SReclaimable: 111428 kB' 'SUnreclaim: 352556 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.291 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.291 10:51:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 
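The wall of IFS=': ' / read / continue entries above and below is a single call to setup/common.sh's get_meminfo helper: it loads one meminfo view into an array and skips field after field until it reaches the requested key (here HugePages_Surp on node 1, which comes back 0). A condensed sketch of that helper, reconstructed from this xtrace; illustrative only, not the verbatim SPDK source:

get_meminfo() {
    # Sketch reconstructed from the trace (common.sh@17-33); not verbatim.
    local get=$1 node=$2     # e.g. get_meminfo HugePages_Surp 1
    local var val _ line
    local mem_f=/proc/meminfo mem
    # With a node argument, prefer the per-node sysfs view when it exists
    # (common.sh@23/@24 in the trace).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every field with "Node <N> "; strip it.
    # The +([0-9]) pattern needs extglob (common.sh@29).
    shopt -s extglob
    mem=("${mem[@]#Node +([0-9]) }")
    # Scan "Key: value kB" pairs until the requested key matches
    # (common.sh@31/@32); each skipped key is one continue in the log.
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"          # common.sh@33: echo 0 for HugePages_Surp here
        return 0
    done
    return 1
}

Because the scan visits every meminfo field in order, a lookup of one key emits dozens of near-identical trace entries; the run above ends at HugePages_Surp with echo 0, which the caller folds into its per-node tally via (( nodes_test[node] += 0 )).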
00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:41.292 10:51:39 -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:41.292 10:51:39 -- setup/common.sh@32 -- # continue
00:04:41.292 10:51:39 -- setup/common.sh@31 -- # IFS=': '
00:04:41.292 10:51:39 -- setup/common.sh@31 -- # read -r var val _
00:04:41.292 10:51:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:41.292 10:51:39 -- setup/common.sh@33 -- # echo 0
00:04:41.292 10:51:39 -- setup/common.sh@33 -- # return 0
00:04:41.292 10:51:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:41.292 10:51:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:41.292 10:51:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:41.292 10:51:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:41.292 10:51:39 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:41.292 node0=512 expecting 512
00:04:41.292 10:51:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:41.292 10:51:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:41.292 10:51:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:41.292 10:51:39 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:41.292 node1=1024 expecting 1024
00:04:41.292 10:51:39 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:41.292
00:04:41.292 real 0m3.576s
00:04:41.292 user 0m1.364s
00:04:41.292 sys 0m2.281s
00:04:41.292 10:51:39 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:41.292 10:51:39 -- common/autotest_common.sh@10 -- # set +x
00:04:41.292 ************************************
00:04:41.292 END TEST custom_alloc
00:04:41.292 ************************************
00:04:41.292 10:51:39 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:41.292 10:51:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:41.292 10:51:39 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:41.292 10:51:39 -- common/autotest_common.sh@10 -- # set +x
00:04:41.292 ************************************
00:04:41.292 START TEST no_shrink_alloc
00:04:41.292 ************************************
00:04:41.292 10:51:39 -- common/autotest_common.sh@1114 -- # no_shrink_alloc
00:04:41.292 10:51:39 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:41.292 10:51:39 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:41.292 10:51:39 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:41.292 10:51:39 -- setup/hugepages.sh@51 -- # shift
00:04:41.292 10:51:39 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:41.292 10:51:39 -- setup/hugepages.sh@52 -- # local node_ids
00:04:41.292 10:51:39 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:41.292 10:51:39 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:41.292 10:51:39 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:41.292 10:51:39 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:41.292 10:51:39 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:41.292 10:51:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:41.292 10:51:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:41.292 10:51:39 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:41.292 10:51:39 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:41.292 10:51:39 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:41.292 10:51:39 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:41.292 10:51:39 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:41.292 10:51:39 -- setup/hugepages.sh@73 -- # return 0
00:04:41.292 10:51:39 -- setup/hugepages.sh@198 -- # setup output
00:04:41.292 10:51:39 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:41.292 10:51:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:44.578 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:44.578 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:44.839 10:51:43 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:44.839 10:51:43 -- setup/hugepages.sh@89 -- # local node
00:04:44.839 10:51:43 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:44.839 10:51:43 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:44.839 10:51:43 -- setup/hugepages.sh@92 -- # local surp
00:04:44.839 10:51:43 -- setup/hugepages.sh@93 -- # local resv
00:04:44.839 10:51:43 -- setup/hugepages.sh@94 -- # local anon
00:04:44.839 10:51:43 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:44.839 10:51:43 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:44.839 10:51:43 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:44.839 10:51:43 -- setup/common.sh@18 -- # local node=
00:04:44.839 10:51:43 -- setup/common.sh@19 -- # local var val
00:04:44.839 10:51:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:44.839 10:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:44.839 10:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:44.839 10:51:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:44.839 10:51:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:44.839 10:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:44.839 10:51:43 -- setup/common.sh@31 -- # IFS=': '
00:04:44.839 10:51:43 -- setup/common.sh@31 -- # read -r var val _
00:04:44.840 10:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41306144 kB' 'MemAvailable: 45031376 kB' 'Buffers: 9316 kB' 'Cached: 12818440 kB' 'SwapCached: 0 kB' 'Active: 9728312 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311428 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592940 kB' 'Mapped: 152544 kB' 'Shmem: 
8721928 kB' 'KReclaimable: 232540 kB' 'Slab: 899496 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 666956 kB' 'KernelStack: 21792 kB' 'PageTables: 7636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10583520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r 
var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 
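The scan in progress here is the same field-by-field loop, now reading the global /proc/meminfo for AnonHugePages on behalf of verify_nr_hugepages; earlier in the trace, get_test_nr_hugepages turned the requested 2097152 kB into nr_hugepages=1024 (2097152 kB / 2048 kB per 2 MB hugepage). A rough sketch of the verification flow as it can be read off the xtrace, offered as a hedged reconstruction rather than the verbatim setup/hugepages.sh:

verify_nr_hugepages() {
    # Hedged reconstruction from the trace (hugepages.sh@89-110).
    local node sorted_t sorted_s surp resv anon total
    # nr_hugepages is a global set by get_test_nr_hugepages (1024 here);
    # defaulted locally so the sketch stands alone.
    local nr_hugepages=${nr_hugepages:-1024}

    # Count anonymous hugepages only if THP is not disabled outright;
    # the traced host reports "always [madvise] never" (hugepages.sh@96).
    if [[ $(</sys/kernel/mm/transparent_hugepage/enabled) != *'[never]'* ]]; then
        anon=$(get_meminfo AnonHugePages)
    fi
    anon=${anon:-0}

    surp=$(get_meminfo HugePages_Surp)   # hugepages.sh@99
    resv=$(get_meminfo HugePages_Rsvd)   # hugepages.sh@100

    echo "nr_hugepages=$nr_hugepages"    # 1024 in this run
    echo "resv_hugepages=$resv"          # 0
    echo "surplus_hugepages=$surp"       # 0
    echo "anon_hugepages=$anon"          # 0

    # Every page in the kernel's pool must be accounted for; in the log
    # HugePages_Total is already expanded, so the checks appear as
    # (( 1024 == nr_hugepages + surp + resv )) and (( 1024 == nr_hugepages )).
    total=$(get_meminfo HugePages_Total)
    (( total == nr_hugepages + surp + resv )) && (( total == nr_hugepages ))
}

Each get_meminfo call repeats the full meminfo scan, so the remainder of this trace is the tail of the AnonHugePages lookup followed by two more runs of the same continue loop (HugePages_Surp, then HugePages_Rsvd) before the nr_hugepages/resv_hugepages/surplus_hugepages/anon_hugepages summary lines appear.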
00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- 
setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.840 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.840 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.841 10:51:43 -- setup/common.sh@33 -- # echo 0 00:04:44.841 10:51:43 -- setup/common.sh@33 -- # return 0 00:04:44.841 10:51:43 -- setup/hugepages.sh@97 -- # anon=0 00:04:44.841 10:51:43 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:44.841 10:51:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:44.841 10:51:43 -- setup/common.sh@18 -- # local node= 00:04:44.841 10:51:43 -- setup/common.sh@19 -- # local var val 00:04:44.841 10:51:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:44.841 10:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:44.841 10:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:44.841 10:51:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:44.841 10:51:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:44.841 10:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41307404 kB' 'MemAvailable: 45032636 kB' 'Buffers: 9316 kB' 'Cached: 12818440 kB' 'SwapCached: 0 kB' 'Active: 9729456 kB' 'Inactive: 3688944 kB' 'Active(anon): 9312572 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 594160 kB' 'Mapped: 152544 kB' 'Shmem: 8721928 kB' 'KReclaimable: 232540 kB' 'Slab: 899440 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 666900 kB' 'KernelStack: 21840 kB' 'PageTables: 7760 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10584680 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214224 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 
1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 
10:51:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.841 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.841 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 
10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 
10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.842 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.842 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read 
-r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.843 10:51:43 -- setup/common.sh@33 -- # echo 0 00:04:44.843 10:51:43 -- setup/common.sh@33 -- # return 0 00:04:44.843 10:51:43 -- setup/hugepages.sh@99 -- # surp=0 00:04:44.843 10:51:43 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:44.843 10:51:43 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:44.843 10:51:43 -- setup/common.sh@18 -- # local node= 00:04:44.843 10:51:43 -- setup/common.sh@19 -- # local var val 00:04:44.843 10:51:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:44.843 10:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:44.843 10:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:44.843 10:51:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:44.843 10:51:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:44.843 10:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41307252 kB' 'MemAvailable: 45032484 kB' 'Buffers: 9316 kB' 'Cached: 12818456 kB' 'SwapCached: 0 kB' 'Active: 9729012 kB' 'Inactive: 3688944 kB' 'Active(anon): 9312128 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593668 kB' 'Mapped: 152464 kB' 'Shmem: 8721944 kB' 'KReclaimable: 232540 kB' 'Slab: 899412 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 666872 kB' 'KernelStack: 21856 kB' 'PageTables: 7440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10586952 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue 00:04:44.843 10:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:44.843 10:51:43 -- 
setup/common.sh@31 -- # read -r var val _
00:04:44.843 10:51:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:44.843 10:51:43 -- setup/common.sh@32 -- # continue
[trace condensed: setup/common.sh@31-32 repeat the same IFS=': ' / read -r var val _ / compare / continue cycle for every remaining /proc/meminfo key, Cached through HugePages_Free; none matches HugePages_Rsvd]
00:04:44.845 10:51:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:44.845 10:51:43 -- setup/common.sh@33 -- # echo 0
00:04:44.845 10:51:43 -- setup/common.sh@33 -- # return 0
00:04:44.845 10:51:43 -- setup/hugepages.sh@100 -- # resv=0
00:04:44.845 10:51:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:04:44.845 10:51:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:04:44.845 10:51:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:04:44.845 10:51:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:04:44.845 10:51:43 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:44.845 10:51:43 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
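The @17-@33 lines above all come from one small helper, get_meminfo in setup/common.sh. A minimal sketch of what the trace shows it doing, reconstructed from the xtrace output alone (an approximation, not the shipped source):

    #!/usr/bin/env bash
    shopt -s extglob  # needed for the +([0-9]) pattern below

    # Return one field (e.g. HugePages_Rsvd) from /proc/meminfo, or from the
    # per-node copy under sysfs when a node number is passed as $2.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f mem line

        mem_f=/proc/meminfo
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem <"$mem_f"
        # Per-node files prefix each line with "Node <n> "; strip that.
        mem=("${mem[@]#Node +([0-9]) }")

        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] || continue
            echo "$val"   # the "echo 0" / "return 0" pair seen in the trace
            return 0
        done
        return 1
    }

So "get_meminfo HugePages_Rsvd" prints 0 here, and the long runs of "[[ <key> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue" in the log are simply this loop walking every meminfo key until it reaches the one requested.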
00:04:44.845 10:51:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:44.845 10:51:43 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:44.845 10:51:43 -- setup/common.sh@18 -- # local node=
00:04:44.845 10:51:43 -- setup/common.sh@19 -- # local var val
00:04:44.845 10:51:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:44.845 10:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:44.845 10:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:44.845 10:51:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:44.845 10:51:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:44.845 10:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:44.845 10:51:43 -- setup/common.sh@31 -- # IFS=': '
00:04:44.845 10:51:43 -- setup/common.sh@31 -- # read -r var val _
00:04:44.845 10:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41308488 kB' 'MemAvailable: 45033720 kB' 'Buffers: 9316 kB' 'Cached: 12818468 kB' 'SwapCached: 0 kB' 'Active: 9728744 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311860 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593308 kB' 'Mapped: 152464 kB' 'Shmem: 8721956 kB' 'KReclaimable: 232540 kB' 'Slab: 899404 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 666864 kB' 'KernelStack: 21744 kB' 'PageTables: 7776 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10588112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[trace condensed: setup/common.sh@31-32 scan MemTotal through Unaccepted against HugePages_Total; none matches]
00:04:44.847 10:51:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:44.847 10:51:43 -- setup/common.sh@33 -- # echo 1024
00:04:44.847 10:51:43 -- setup/common.sh@33 -- # return 0
00:04:44.847 10:51:43 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:44.847 10:51:43 -- setup/hugepages.sh@112 -- # get_nodes
00:04:44.847 10:51:43 -- setup/hugepages.sh@27 -- # local node
00:04:44.847 10:51:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:44.847 10:51:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:44.847 10:51:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:44.847 10:51:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:44.847 10:51:43 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:44.847 10:51:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
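A second small helper surfaces here: get_nodes (hugepages.sh@27-33) discovers NUMA nodes by globbing sysfs and records each node's hugepage count, 1024 on node0 and 0 on node1 in this run. A hedged sketch of the pattern as the trace shows it; the array names come from the log, but the right-hand side of the assignment is an assumption, since xtrace only prints the already-expanded values:

    shopt -s extglob
    declare -a nodes_sys

    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
            # ${node##*node} strips the sysfs path down to the bare node number.
            # Assumed source of the 1024/0 values seen expanded in the trace:
            nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
        done
        no_nodes=${#nodes_sys[@]}
        (( no_nodes > 0 ))  # the @33 guard: at least one NUMA node found
    }

With the per-node counts in hand, the script then queries each node's HugePages_Surp through the same get_meminfo helper, this time with an explicit node argument, which is why the @23/@24 lines below switch mem_f to the node0 sysfs file.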
00:04:44.847 10:51:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:44.847 10:51:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:44.847 10:51:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:44.847 10:51:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:44.847 10:51:43 -- setup/common.sh@18 -- # local node=0
00:04:44.847 10:51:43 -- setup/common.sh@19 -- # local var val
00:04:44.847 10:51:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:44.847 10:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:44.847 10:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:44.847 10:51:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:44.847 10:51:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:44.847 10:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:44.847 10:51:43 -- setup/common.sh@31 -- # IFS=': '
00:04:44.847 10:51:43 -- setup/common.sh@31 -- # read -r var val _
00:04:44.848 10:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25423676 kB' 'MemUsed: 7161692 kB' 'SwapCached: 0 kB' 'Active: 3418724 kB' 'Inactive: 184604 kB' 'Active(anon): 3247820 kB' 'Inactive(anon): 0 kB' 'Active(file): 170904 kB' 'Inactive(file): 184604 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3318784 kB' 'Mapped: 61496 kB' 'AnonPages: 287952 kB' 'Shmem: 2963276 kB' 'KernelStack: 11928 kB' 'PageTables: 4020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121120 kB' 'Slab: 435312 kB' 'SReclaimable: 121120 kB' 'SUnreclaim: 314192 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[trace condensed: setup/common.sh@31-32 scan the node0 keys MemTotal through HugePages_Free against HugePages_Surp; none matches]
00:04:44.848 10:51:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:44.848 10:51:43 -- setup/common.sh@33 -- # echo 0
00:04:44.848 10:51:43 -- setup/common.sh@33 -- # return 0
00:04:44.848 10:51:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:44.848 10:51:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:44.848 10:51:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:44.848 10:51:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:44.848 10:51:43 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:04:44.848 10:51:43 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:44.849 10:51:43 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:44.849 10:51:43 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:44.849 10:51:43 -- setup/hugepages.sh@202 -- # setup output
00:04:44.849 10:51:43 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:44.849 10:51:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:49.041 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:49.041 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:49.041 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:49.041 10:51:46 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:49.041 10:51:46 -- setup/hugepages.sh@89 -- # local node
00:04:49.041 10:51:46 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:49.041 10:51:46 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:49.041 10:51:46 -- setup/hugepages.sh@92 -- # local surp
00:04:49.041 10:51:46 -- setup/hugepages.sh@93 -- # local resv
00:04:49.041 10:51:46 -- setup/hugepages.sh@94 -- # local anon
00:04:49.041 10:51:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
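Two things happen above. First, hugepages.sh@202 re-runs scripts/setup.sh with NRHUGE=512 and CLEAR_HUGE=no; since node0 already holds 1024 pages, the script keeps them, as the INFO line records. Second, the @96 test gates the anon-hugepage check on transparent hugepage support: the left-hand side of the comparison ("always [madvise] never") is the content of the kernel's THP control file. A hedged sketch of that gate; the sysfs path and the thp variable name are inferred, only the pattern test and the anon variable appear in the trace:

    # Read the THP mode; the bracketed word is the active setting.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # assumed source file
    if [[ $thp != *"[never]"* ]]; then
        # THP not hard-disabled, so AnonHugePages may be non-zero; read it.
        anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
    else
        anon=0
    fi

That AnonHugePages read is exactly the get_meminfo call traced next.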
00:04:49.041 10:51:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:49.041 10:51:46 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:49.041 10:51:46 -- setup/common.sh@18 -- # local node=
00:04:49.041 10:51:46 -- setup/common.sh@19 -- # local var val
00:04:49.041 10:51:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:49.041 10:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:49.041 10:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:49.041 10:51:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:49.041 10:51:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:49.041 10:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:49.041 10:51:46 -- setup/common.sh@31 -- # IFS=': '
00:04:49.041 10:51:46 -- setup/common.sh@31 -- # read -r var val _
00:04:49.041 10:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41332936 kB' 'MemAvailable: 45058168 kB' 'Buffers: 9316 kB' 'Cached: 12818564 kB' 'SwapCached: 0 kB' 'Active: 9728908 kB' 'Inactive: 3688944 kB' 'Active(anon): 9312024 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593220 kB' 'Mapped: 152516 kB' 'Shmem: 8722052 kB' 'KReclaimable: 232540 kB' 'Slab: 899972 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 667432 kB' 'KernelStack: 21728 kB' 'PageTables: 7392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10584160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[trace condensed: setup/common.sh@31-32 scan MemTotal through HardwareCorrupted against AnonHugePages; none matches]
00:04:49.042 10:51:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:49.042 10:51:46 -- setup/common.sh@33 -- # echo 0
00:04:49.042 10:51:46 -- setup/common.sh@33 -- # return 0
00:04:49.042 10:51:46 -- setup/hugepages.sh@97 -- # anon=0
00:04:49.042 10:51:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:49.042 10:51:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:49.042 10:51:46 -- setup/common.sh@18 -- # local node=
00:04:49.042 10:51:46 -- setup/common.sh@19 -- # local var val
00:04:49.042 10:51:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:49.042 10:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:49.043 10:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:49.043 10:51:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:49.043 10:51:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:49.043 10:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:49.043 10:51:46 -- setup/common.sh@31 -- # IFS=': '
00:04:49.043 10:51:46 -- setup/common.sh@31 -- # read -r var val _
00:04:49.043 10:51:46 --
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41332224 kB' 'MemAvailable: 45057456 kB' 'Buffers: 9316 kB' 'Cached: 12818568 kB' 'SwapCached: 0 kB' 'Active: 9728824 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311940 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592772 kB' 'Mapped: 152472 kB' 'Shmem: 8722056 kB' 'KReclaimable: 232540 kB' 'Slab: 900104 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 667564 kB' 'KernelStack: 21744 kB' 'PageTables: 7460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10584172 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB'
[trace condensed: setup/common.sh@31-32 scan MemTotal through WritebackTmp against HugePages_Surp; none matches]
00:04:49.043 10:51:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.043
10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.043 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.043 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.043 10:51:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.043 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.043 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.043 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.043 10:51:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.043 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.043 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.043 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read 
-r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.044 10:51:46 -- setup/common.sh@33 -- # echo 0 00:04:49.044 10:51:46 -- setup/common.sh@33 -- # return 0 00:04:49.044 10:51:46 -- setup/hugepages.sh@99 -- # surp=0 00:04:49.044 10:51:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:49.044 10:51:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:49.044 10:51:46 -- setup/common.sh@18 -- # local node= 00:04:49.044 10:51:46 -- setup/common.sh@19 -- # local var val 00:04:49.044 10:51:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:49.044 10:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:49.044 10:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:49.044 10:51:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:49.044 10:51:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:49.044 10:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41330964 kB' 'MemAvailable: 45056196 kB' 'Buffers: 9316 kB' 'Cached: 12818592 kB' 'SwapCached: 0 kB' 'Active: 9728524 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311640 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592772 kB' 'Mapped: 152472 kB' 'Shmem: 8722080 kB' 'KReclaimable: 232540 kB' 'Slab: 900104 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 667564 kB' 'KernelStack: 21744 kB' 'PageTables: 7460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10584188 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
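The scan condensed above is the core of get_meminfo: read the memory stats file line by line with IFS=': ' and emit the value of the one requested field. A minimal standalone sketch of that lookup, assuming plain /proc/meminfo input (meminfo_get is an illustrative name, not the exact setup/common.sh helper):

    #!/usr/bin/env bash
    # Sketch: print the value of one /proc/meminfo field, mirroring the
    # IFS=': ' read / compare / continue cycle seen in the trace above.
    meminfo_get() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # non-matching keys fall through
            echo "$val"
            return 0
        done < /proc/meminfo
        return 1   # field not present
    }

    meminfo_get HugePages_Surp   # on this host: 0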
00:04:49.044 10:51:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:49.044 10:51:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:49.044 10:51:46 -- setup/common.sh@18 -- # local node= 00:04:49.044 10:51:46 -- setup/common.sh@19 -- # local var val 00:04:49.044 10:51:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:49.044 10:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:49.044 10:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:49.044 10:51:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:49.044 10:51:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:49.044 10:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.044 10:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.044 10:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41330964 kB' 'MemAvailable: 45056196 kB' 'Buffers: 9316 kB' 'Cached: 12818592 kB' 'SwapCached: 0 kB' 'Active: 9728524 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311640 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592772 kB' 'Mapped: 152472 kB' 'Shmem: 8722080 kB' 'KReclaimable: 232540 kB' 'Slab: 900104 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 667564 kB' 'KernelStack: 21744 kB' 'PageTables: 7460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10584188 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:49.044 10:51:46 -- setup/common.sh@32 -- # continue 00:04:49.044 [xtrace condensed: identical compare-and-continue cycle for every field, MemFree through HugePages_Free] 00:04:49.045 10:51:47 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:49.045 10:51:47 -- setup/common.sh@33 -- # echo 0 00:04:49.045 10:51:47 -- setup/common.sh@33 -- # return 0 00:04:49.045 10:51:47 -- setup/hugepages.sh@100 -- # resv=0 00:04:49.045 10:51:47 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:49.045 nr_hugepages=1024 10:51:47 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:49.045 resv_hugepages=0 10:51:47 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:49.045 surplus_hugepages=0 10:51:47 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:49.045 anon_hugepages=0 10:51:47 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:49.045 10:51:47 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
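At this point the harness has pulled HugePages_Surp and HugePages_Rsvd one field at a time and echoed the summary values above. For reference, the same counters can be read in one shot outside the harness; a quick illustrative check:

    # Read the hugepage counters the test just extracted, directly from
    # /proc/meminfo; the expected output mirrors this run's dump above.
    grep -E '^HugePages_(Total|Free|Rsvd|Surp):' /proc/meminfo
    # HugePages_Total:    1024
    # HugePages_Free:     1024
    # HugePages_Rsvd:        0
    # HugePages_Surp:        0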
00:04:49.045 10:51:47 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:49.045 10:51:47 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:49.045 10:51:47 -- setup/common.sh@18 -- # local node= 00:04:49.045 10:51:47 -- setup/common.sh@19 -- # local var val 00:04:49.045 10:51:47 -- setup/common.sh@20 -- # local mem_f mem 00:04:49.045 10:51:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:49.045 10:51:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:49.045 10:51:47 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:49.045 10:51:47 -- setup/common.sh@28 -- # mapfile -t mem 00:04:49.045 10:51:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:49.045 10:51:47 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.045 10:51:47 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.046 10:51:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283776 kB' 'MemFree: 41330876 kB' 'MemAvailable: 45056108 kB' 'Buffers: 9316 kB' 'Cached: 12818592 kB' 'SwapCached: 0 kB' 'Active: 9728868 kB' 'Inactive: 3688944 kB' 'Active(anon): 9311984 kB' 'Inactive(anon): 0 kB' 'Active(file): 416884 kB' 'Inactive(file): 3688944 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593160 kB' 'Mapped: 152472 kB' 'Shmem: 8722080 kB' 'KReclaimable: 232540 kB' 'Slab: 900104 kB' 'SReclaimable: 232540 kB' 'SUnreclaim: 667564 kB' 'KernelStack: 21760 kB' 'PageTables: 7508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481916 kB' 'Committed_AS: 10584204 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 513396 kB' 'DirectMap2M: 10706944 kB' 'DirectMap1G: 58720256 kB' 00:04:49.046 10:51:47 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.046 10:51:47 -- setup/common.sh@32 -- # continue 00:04:49.046 [xtrace condensed: identical compare-and-continue cycle for every field, MemFree through Unaccepted] 00:04:49.047 10:51:47 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:49.047 10:51:47 -- setup/common.sh@33 -- # echo 1024 00:04:49.047 10:51:47 -- setup/common.sh@33 -- # return 0 00:04:49.047 10:51:47 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
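The check above is the pool-accounting invariant the test enforces: the kernel's reported total must equal the pages the test requested plus any surplus and reserved pages. A worked instance with this run's values (variable names are illustrative):

    # Consistency check with the values reported in this run's trace.
    nr_hugepages=1024   # pages the test asked for
    surp=0              # HugePages_Surp
    resv=0              # HugePages_Rsvd
    total=1024          # HugePages_Total
    (( total == nr_hugepages + surp + resv )) && echo 'pool accounting consistent'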
00:04:49.047 10:51:47 -- setup/hugepages.sh@112 -- # get_nodes 00:04:49.047 10:51:47 -- setup/hugepages.sh@27 -- # local node 00:04:49.047 10:51:47 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:49.047 10:51:47 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:49.047 10:51:47 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:49.047 10:51:47 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:49.047 10:51:47 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:49.047 10:51:47 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:49.047 10:51:47 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:49.047 10:51:47 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:49.047 10:51:47 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:49.047 10:51:47 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:49.047 10:51:47 -- setup/common.sh@18 -- # local node=0 00:04:49.047 10:51:47 -- setup/common.sh@19 -- # local var val 00:04:49.047 10:51:47 -- setup/common.sh@20 -- # local mem_f mem 00:04:49.047 10:51:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:49.047 10:51:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:49.047 10:51:47 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:49.047 10:51:47 -- setup/common.sh@28 -- # mapfile -t mem 00:04:49.047 10:51:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:49.047 10:51:47 -- setup/common.sh@31 -- # IFS=': ' 00:04:49.047 10:51:47 -- setup/common.sh@31 -- # read -r var val _ 00:04:49.047 10:51:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25438012 kB' 'MemUsed: 7147356 kB' 'SwapCached: 0 kB' 'Active: 3421060 kB' 'Inactive: 184604 kB' 'Active(anon): 3250156 kB' 'Inactive(anon): 0 kB' 'Active(file): 170904 kB' 'Inactive(file): 184604 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3318900 kB' 'Mapped: 61504 kB' 'AnonPages: 290020 kB' 'Shmem: 2963392 kB' 'KernelStack: 11880 kB' 'PageTables: 3888 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121120 kB' 'Slab: 435968 kB' 'SReclaimable: 121120 kB' 'SUnreclaim: 314848 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:49.047 10:51:47 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.047 10:51:47 -- setup/common.sh@32 -- # continue 00:04:49.047 [xtrace condensed: identical compare-and-continue cycle for every field of node0's meminfo, MemFree through HugePages_Free] 00:04:49.048 10:51:47 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:49.048 10:51:47 -- setup/common.sh@33 -- # echo 0 00:04:49.048 10:51:47 -- setup/common.sh@33 -- # return 0 00:04:49.048 10:51:47 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:49.048 10:51:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:49.048 10:51:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:49.048 10:51:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:49.048 10:51:47 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:49.048 node0=1024 expecting 1024 10:51:47 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:49.048 00:04:49.048 real 0m7.261s 00:04:49.048 user 0m2.763s 00:04:49.048 sys 0m4.636s 00:04:49.048 10:51:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:49.048 10:51:47 -- common/autotest_common.sh@10 -- # set +x 00:04:49.048 ************************************ 00:04:49.048 END TEST no_shrink_alloc 00:04:49.048 ************************************
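The cleanup step that follows (clear_hp) returns every NUMA node's hugepage pool to zero through sysfs. A minimal sketch of that pattern, assuming the standard /sys/devices/system/node layout (clear_all_hp is an illustrative name, and the writes require root):

    # Zero each node's hugepage pool for every supported page size,
    # mirroring the nested node/hp loops in the clear_hp trace below.
    clear_all_hp() {
        local node hp
        for node in /sys/devices/system/node/node[0-9]*; do
            for hp in "$node"/hugepages/hugepages-*; do
                echo 0 > "$hp/nr_hugepages"   # release this node/size pool
            done
        done
    }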
00:04:49.048 10:51:47 -- setup/hugepages.sh@217 -- # clear_hp 00:04:49.048 10:51:47 -- setup/hugepages.sh@37 -- # local node hp 00:04:49.048 10:51:47 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:49.048 10:51:47 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:49.048 10:51:47 -- setup/hugepages.sh@41 -- # echo 0 00:04:49.048 10:51:47 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:49.048 10:51:47 -- setup/hugepages.sh@41 -- # echo 0 00:04:49.048 10:51:47 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:49.048 10:51:47 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:49.048 10:51:47 -- setup/hugepages.sh@41 -- # echo 0 00:04:49.048 10:51:47 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:49.048 10:51:47 -- setup/hugepages.sh@41 -- # echo 0 00:04:49.048 10:51:47 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:49.048 10:51:47 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:49.048 00:04:49.048 real 0m27.534s 00:04:49.048 user 0m9.981s 00:04:49.048 sys 0m16.542s 00:04:49.048 10:51:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:49.048 10:51:47 -- common/autotest_common.sh@10 -- # set +x 00:04:49.048 ************************************ 00:04:49.048 END TEST hugepages 00:04:49.048 ************************************
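Before the driver test body runs, autotest_common.sh vets the installed lcov with a dotted-version comparison (the cmp_versions trace below decides that 1.15 < 2). A self-contained sketch of that split-and-compare idea, assuming purely numeric version components (version_lt is an illustrative name, not the scripts/common.sh function):

    # Sketch: dotted-version "less than", in the spirit of the
    # cmp_versions trace below.
    version_lt() {
        local -a ver1 ver2
        IFS=.- read -ra ver1 <<< "$1"
        IFS=.- read -ra ver2 <<< "$2"
        local i a b
        for ((i = 0; i < ${#ver1[@]} || i < ${#ver2[@]}; i++)); do
            a=${ver1[i]:-0} b=${ver2[i]:-0}   # missing components count as 0
            (( a < b )) && return 0
            (( a > b )) && return 1
        done
        return 1   # equal is not "less than"
    }

    version_lt 1.15 2 && echo '1.15 < 2'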
ver1_l : ver2_l) )) 00:04:49.048 10:51:47 -- scripts/common.sh@364 -- # decimal 1 00:04:49.048 10:51:47 -- scripts/common.sh@352 -- # local d=1 00:04:49.048 10:51:47 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:49.048 10:51:47 -- scripts/common.sh@354 -- # echo 1 00:04:49.048 10:51:47 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:49.048 10:51:47 -- scripts/common.sh@365 -- # decimal 2 00:04:49.048 10:51:47 -- scripts/common.sh@352 -- # local d=2 00:04:49.048 10:51:47 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:49.048 10:51:47 -- scripts/common.sh@354 -- # echo 2 00:04:49.048 10:51:47 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:49.048 10:51:47 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:49.048 10:51:47 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:49.048 10:51:47 -- scripts/common.sh@367 -- # return 0 00:04:49.048 10:51:47 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:49.048 10:51:47 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:49.048 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:49.048 --rc genhtml_branch_coverage=1 00:04:49.048 --rc genhtml_function_coverage=1 00:04:49.048 --rc genhtml_legend=1 00:04:49.048 --rc geninfo_all_blocks=1 00:04:49.048 --rc geninfo_unexecuted_blocks=1 00:04:49.048 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:49.048 ' 00:04:49.048 10:51:47 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:49.048 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:49.048 --rc genhtml_branch_coverage=1 00:04:49.048 --rc genhtml_function_coverage=1 00:04:49.048 --rc genhtml_legend=1 00:04:49.048 --rc geninfo_all_blocks=1 00:04:49.048 --rc geninfo_unexecuted_blocks=1 00:04:49.049 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:49.049 ' 00:04:49.049 10:51:47 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:49.049 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:49.049 --rc genhtml_branch_coverage=1 00:04:49.049 --rc genhtml_function_coverage=1 00:04:49.049 --rc genhtml_legend=1 00:04:49.049 --rc geninfo_all_blocks=1 00:04:49.049 --rc geninfo_unexecuted_blocks=1 00:04:49.049 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:49.049 ' 00:04:49.049 10:51:47 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:49.049 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:49.049 --rc genhtml_branch_coverage=1 00:04:49.049 --rc genhtml_function_coverage=1 00:04:49.049 --rc genhtml_legend=1 00:04:49.049 --rc geninfo_all_blocks=1 00:04:49.049 --rc geninfo_unexecuted_blocks=1 00:04:49.049 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:49.049 ' 00:04:49.049 10:51:47 -- setup/driver.sh@68 -- # setup reset 00:04:49.049 10:51:47 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:49.049 10:51:47 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:54.328 10:51:52 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:54.328 10:51:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:54.328 10:51:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:54.328 10:51:52 -- common/autotest_common.sh@10 -- # set +x 00:04:54.328 ************************************ 00:04:54.328 START TEST guess_driver 
00:04:54.328 ************************************ 00:04:54.328 10:51:52 -- common/autotest_common.sh@1114 -- # guess_driver 00:04:54.328 10:51:52 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:54.328 10:51:52 -- setup/driver.sh@47 -- # local fail=0 00:04:54.328 10:51:52 -- setup/driver.sh@49 -- # pick_driver 00:04:54.328 10:51:52 -- setup/driver.sh@36 -- # vfio 00:04:54.328 10:51:52 -- setup/driver.sh@21 -- # local iommu_grups 00:04:54.328 10:51:52 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:54.328 10:51:52 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:54.328 10:51:52 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:54.328 10:51:52 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:54.328 10:51:52 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:54.328 10:51:52 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:54.328 10:51:52 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:54.328 10:51:52 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:54.328 10:51:52 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:54.328 10:51:52 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:54.328 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:54.328 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:54.328 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:54.328 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:54.328 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:54.328 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:54.328 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:54.328 10:51:52 -- setup/driver.sh@30 -- # return 0 00:04:54.328 10:51:52 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:54.329 10:51:52 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:54.329 10:51:52 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:54.329 10:51:52 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:54.329 Looking for driver=vfio-pci 00:04:54.329 10:51:52 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:54.329 10:51:52 -- setup/driver.sh@45 -- # setup output config 00:04:54.329 10:51:52 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:54.329 10:51:52 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- 
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.976 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.976 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.976 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:57.257 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:57.257 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:57.257 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:57.257 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:57.257 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:57.257 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:57.257 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:57.257 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:57.257 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:57.257 10:51:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:57.257 10:51:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:57.257 10:51:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:58.714 10:51:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:58.714 10:51:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:58.714 10:51:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:58.714 10:51:57 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:58.714 10:51:57 -- setup/driver.sh@65 -- # setup reset 00:04:58.714 10:51:57 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:58.714 10:51:57 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:03.997 00:05:03.997 real 0m9.439s 00:05:03.997 user 0m2.449s 00:05:03.997 sys 0m4.771s 00:05:03.997 10:52:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:03.997 10:52:01 -- common/autotest_common.sh@10 
-- # set +x 00:05:03.997 ************************************ 00:05:03.998 END TEST guess_driver 00:05:03.998 ************************************ 00:05:03.998 00:05:03.998 real 0m14.517s 00:05:03.998 user 0m3.936s 00:05:03.998 sys 0m7.612s 00:05:03.998 10:52:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:03.998 10:52:01 -- common/autotest_common.sh@10 -- # set +x 00:05:03.998 ************************************ 00:05:03.998 END TEST driver 00:05:03.998 ************************************ 00:05:03.998 10:52:01 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:03.998 10:52:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:03.998 10:52:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:03.998 10:52:01 -- common/autotest_common.sh@10 -- # set +x 00:05:03.998 ************************************ 00:05:03.998 START TEST devices 00:05:03.998 ************************************ 00:05:03.998 10:52:01 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:03.998 * Looking for test storage... 00:05:03.998 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:03.998 10:52:01 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:03.998 10:52:01 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:03.998 10:52:01 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:03.998 10:52:01 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:03.998 10:52:01 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:03.998 10:52:01 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:03.998 10:52:01 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:03.998 10:52:01 -- scripts/common.sh@335 -- # IFS=.-: 00:05:03.998 10:52:01 -- scripts/common.sh@335 -- # read -ra ver1 00:05:03.998 10:52:01 -- scripts/common.sh@336 -- # IFS=.-: 00:05:03.998 10:52:01 -- scripts/common.sh@336 -- # read -ra ver2 00:05:03.998 10:52:01 -- scripts/common.sh@337 -- # local 'op=<' 00:05:03.998 10:52:01 -- scripts/common.sh@339 -- # ver1_l=2 00:05:03.998 10:52:01 -- scripts/common.sh@340 -- # ver2_l=1 00:05:03.998 10:52:01 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:03.998 10:52:01 -- scripts/common.sh@343 -- # case "$op" in 00:05:03.998 10:52:01 -- scripts/common.sh@344 -- # : 1 00:05:03.998 10:52:01 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:03.998 10:52:01 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:03.998 10:52:01 -- scripts/common.sh@364 -- # decimal 1 00:05:03.998 10:52:01 -- scripts/common.sh@352 -- # local d=1 00:05:03.998 10:52:01 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:03.998 10:52:01 -- scripts/common.sh@354 -- # echo 1 00:05:03.998 10:52:01 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:03.998 10:52:01 -- scripts/common.sh@365 -- # decimal 2 00:05:03.998 10:52:01 -- scripts/common.sh@352 -- # local d=2 00:05:03.998 10:52:01 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:03.998 10:52:01 -- scripts/common.sh@354 -- # echo 2 00:05:03.998 10:52:01 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:03.998 10:52:01 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:03.998 10:52:01 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:03.998 10:52:01 -- scripts/common.sh@367 -- # return 0 00:05:03.998 10:52:01 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:03.998 10:52:01 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:03.998 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.998 --rc genhtml_branch_coverage=1 00:05:03.998 --rc genhtml_function_coverage=1 00:05:03.998 --rc genhtml_legend=1 00:05:03.998 --rc geninfo_all_blocks=1 00:05:03.998 --rc geninfo_unexecuted_blocks=1 00:05:03.998 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.998 ' 00:05:03.998 10:52:01 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:03.998 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.998 --rc genhtml_branch_coverage=1 00:05:03.998 --rc genhtml_function_coverage=1 00:05:03.998 --rc genhtml_legend=1 00:05:03.998 --rc geninfo_all_blocks=1 00:05:03.998 --rc geninfo_unexecuted_blocks=1 00:05:03.998 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.998 ' 00:05:03.998 10:52:01 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:03.998 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.998 --rc genhtml_branch_coverage=1 00:05:03.998 --rc genhtml_function_coverage=1 00:05:03.998 --rc genhtml_legend=1 00:05:03.998 --rc geninfo_all_blocks=1 00:05:03.998 --rc geninfo_unexecuted_blocks=1 00:05:03.998 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.998 ' 00:05:03.998 10:52:01 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:03.998 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.998 --rc genhtml_branch_coverage=1 00:05:03.998 --rc genhtml_function_coverage=1 00:05:03.998 --rc genhtml_legend=1 00:05:03.998 --rc geninfo_all_blocks=1 00:05:03.998 --rc geninfo_unexecuted_blocks=1 00:05:03.998 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.998 ' 00:05:03.998 10:52:01 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:03.998 10:52:01 -- setup/devices.sh@192 -- # setup reset 00:05:03.998 10:52:01 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:03.998 10:52:01 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:07.286 10:52:05 -- setup/devices.sh@194 -- # get_zoned_devs 00:05:07.286 10:52:05 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:05:07.286 10:52:05 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:05:07.286 10:52:05 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:05:07.286 10:52:05 
-- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:05:07.286 10:52:05 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:05:07.286 10:52:05 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:05:07.286 10:52:05 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:07.286 10:52:05 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:05:07.286 10:52:05 -- setup/devices.sh@196 -- # blocks=() 00:05:07.286 10:52:05 -- setup/devices.sh@196 -- # declare -a blocks 00:05:07.286 10:52:05 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:07.286 10:52:05 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:07.286 10:52:05 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:07.286 10:52:05 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:07.286 10:52:05 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:07.286 10:52:05 -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:07.286 10:52:05 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:07.286 10:52:05 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:07.286 10:52:05 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:07.286 10:52:05 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:05:07.286 10:52:05 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:07.286 No valid GPT data, bailing 00:05:07.286 10:52:05 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:07.286 10:52:05 -- scripts/common.sh@393 -- # pt= 00:05:07.286 10:52:05 -- scripts/common.sh@394 -- # return 1 00:05:07.286 10:52:05 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:07.286 10:52:05 -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:07.286 10:52:05 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:07.286 10:52:05 -- setup/common.sh@80 -- # echo 1600321314816 00:05:07.286 10:52:05 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:07.286 10:52:05 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:07.286 10:52:05 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:07.286 10:52:05 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:07.286 10:52:05 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:07.286 10:52:05 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:07.286 10:52:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:07.286 10:52:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:07.286 10:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:07.286 ************************************ 00:05:07.286 START TEST nvme_mount 00:05:07.286 ************************************ 00:05:07.286 10:52:05 -- common/autotest_common.sh@1114 -- # nvme_mount 00:05:07.286 10:52:05 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:07.286 10:52:05 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:07.286 10:52:05 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:07.286 10:52:05 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:07.286 10:52:05 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:07.286 10:52:05 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:07.286 10:52:05 -- setup/common.sh@40 -- # local part_no=1 00:05:07.286 10:52:05 -- setup/common.sh@41 -- # 
local size=1073741824 00:05:07.286 10:52:05 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:07.286 10:52:05 -- setup/common.sh@44 -- # parts=() 00:05:07.286 10:52:05 -- setup/common.sh@44 -- # local parts 00:05:07.286 10:52:05 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:07.286 10:52:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.286 10:52:05 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:07.286 10:52:05 -- setup/common.sh@46 -- # (( part++ )) 00:05:07.286 10:52:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.286 10:52:05 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:07.286 10:52:05 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:07.286 10:52:05 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:08.664 Creating new GPT entries in memory. 00:05:08.664 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:08.664 other utilities. 00:05:08.664 10:52:06 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:08.664 10:52:06 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:08.664 10:52:06 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:08.664 10:52:06 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:08.664 10:52:06 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:09.601 Creating new GPT entries in memory. 00:05:09.601 The operation has completed successfully. 00:05:09.601 10:52:07 -- setup/common.sh@57 -- # (( part++ )) 00:05:09.601 10:52:07 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:09.601 10:52:07 -- setup/common.sh@62 -- # wait 611414 00:05:09.601 10:52:07 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.601 10:52:07 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:09.601 10:52:07 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.601 10:52:07 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:09.601 10:52:07 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:09.601 10:52:07 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.601 10:52:08 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:09.601 10:52:08 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:09.601 10:52:08 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:09.601 10:52:08 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.601 10:52:08 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:09.601 10:52:08 -- setup/devices.sh@53 -- # local found=0 00:05:09.601 10:52:08 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:09.601 10:52:08 -- setup/devices.sh@56 -- # : 00:05:09.601 10:52:08 -- setup/devices.sh@59 -- # local pci status 00:05:09.601 10:52:08 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.601 10:52:08 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:09.601 10:52:08 -- setup/devices.sh@47 -- # setup output config 00:05:09.601 10:52:08 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:09.601 10:52:08 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:12.888 10:52:11 -- setup/devices.sh@63 -- # found=1 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.888 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.888 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.889 10:52:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:12.889 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.889 10:52:11 -- 
setup/devices.sh@66 -- # (( found == 1 )) 00:05:12.889 10:52:11 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:12.889 10:52:11 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:12.889 10:52:11 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:12.889 10:52:11 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:12.889 10:52:11 -- setup/devices.sh@110 -- # cleanup_nvme 00:05:12.889 10:52:11 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:12.889 10:52:11 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:12.889 10:52:11 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:12.889 10:52:11 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:12.889 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:12.889 10:52:11 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:12.889 10:52:11 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:13.148 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:13.148 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:13.148 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:13.148 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:13.148 10:52:11 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:13.148 10:52:11 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:13.148 10:52:11 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.148 10:52:11 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:13.148 10:52:11 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:13.148 10:52:11 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.148 10:52:11 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:13.148 10:52:11 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:13.148 10:52:11 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:13.148 10:52:11 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.148 10:52:11 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:13.148 10:52:11 -- setup/devices.sh@53 -- # local found=0 00:05:13.148 10:52:11 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:13.148 10:52:11 -- setup/devices.sh@56 -- # : 00:05:13.148 10:52:11 -- setup/devices.sh@59 -- # local pci status 00:05:13.148 10:52:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.148 10:52:11 -- setup/devices.sh@47 -- # 
PCI_ALLOWED=0000:d8:00.0 00:05:13.148 10:52:11 -- setup/devices.sh@47 -- # setup output config 00:05:13.148 10:52:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:13.148 10:52:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:16.444 10:52:14 -- setup/devices.sh@63 -- # found=1 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:16.444 10:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.444 10:52:14 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:16.444 10:52:14 -- setup/devices.sh@68 -- # [[ -n 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:16.444 10:52:14 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:16.444 10:52:14 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:16.445 10:52:14 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:16.445 10:52:14 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:16.445 10:52:15 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:16.445 10:52:15 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:16.445 10:52:15 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:16.445 10:52:15 -- setup/devices.sh@50 -- # local mount_point= 00:05:16.445 10:52:15 -- setup/devices.sh@51 -- # local test_file= 00:05:16.445 10:52:15 -- setup/devices.sh@53 -- # local found=0 00:05:16.445 10:52:15 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:16.445 10:52:15 -- setup/devices.sh@59 -- # local pci status 00:05:16.445 10:52:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.445 10:52:15 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:16.445 10:52:15 -- setup/devices.sh@47 -- # setup output config 00:05:16.445 10:52:15 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:16.445 10:52:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:19.746 10:52:17 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:17 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:19.746 10:52:17 -- setup/devices.sh@63 -- # found=1 00:05:19.746 10:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 
10:52:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:19.746 10:52:18 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:19.746 10:52:18 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:19.746 10:52:18 -- setup/devices.sh@68 -- # return 0 00:05:19.746 10:52:18 -- setup/devices.sh@128 -- # cleanup_nvme 00:05:19.746 10:52:18 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:19.746 10:52:18 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:19.746 10:52:18 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:19.746 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:19.746 00:05:19.746 real 0m12.392s 00:05:19.746 user 0m3.692s 00:05:19.746 sys 0m6.674s 00:05:19.746 10:52:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:19.746 10:52:18 -- common/autotest_common.sh@10 -- # set +x 00:05:19.746 ************************************ 00:05:19.746 END TEST nvme_mount 00:05:19.746 ************************************ 00:05:19.746 10:52:18 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:19.746 10:52:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:19.746 10:52:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:19.746 10:52:18 -- common/autotest_common.sh@10 -- # set +x 00:05:19.746 ************************************ 00:05:19.746 START TEST dm_mount 00:05:19.746 ************************************ 00:05:19.746 10:52:18 -- common/autotest_common.sh@1114 -- # dm_mount 00:05:19.746 10:52:18 -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:19.746 10:52:18 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:19.746 10:52:18 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:19.746 10:52:18 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:19.746 10:52:18 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:19.746 10:52:18 -- setup/common.sh@40 -- # local part_no=2 00:05:19.746 10:52:18 -- setup/common.sh@41 -- # local size=1073741824 00:05:19.746 10:52:18 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:19.746 10:52:18 -- setup/common.sh@44 -- # parts=() 00:05:19.746 10:52:18 -- setup/common.sh@44 -- # local parts 00:05:19.746 10:52:18 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:19.746 10:52:18 -- setup/common.sh@46 -- # (( part <= part_no )) 
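For reference, the partition sizing that the partition_drive trace below performs reduces to a few lines of shell arithmetic. The following is a condensed, illustrative sketch of that logic (1 GiB per partition, 512-byte sectors), not a verbatim copy of setup/common.sh:

#!/usr/bin/env bash
# Condensed sketch of partition_drive's sector arithmetic (illustrative only).
# Assumes 512-byte logical blocks, as the (( size /= 512 )) step in the trace does.
disk=nvme0n1
part_no=2
size=1073741824                  # 1 GiB per partition, in bytes
(( size /= 512 ))                # bytes -> 512-byte sectors: 2097152

part_start=0 part_end=0
for (( part = 1; part <= part_no; part++ )); do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))   # first usable GPT sector is 2048
    (( part_end = part_start + size - 1 ))
    echo "sgdisk /dev/$disk --new=$part:$part_start:$part_end"
done

Run as-is it prints --new=1:2048:2099199 and --new=2:2099200:4196351, the same ranges the flock'd sgdisk calls below pass for the two 1 GiB partitions.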
00:05:19.746 10:52:18 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:19.746 10:52:18 -- setup/common.sh@46 -- # (( part++ )) 00:05:19.746 10:52:18 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:19.746 10:52:18 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:19.746 10:52:18 -- setup/common.sh@46 -- # (( part++ )) 00:05:19.746 10:52:18 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:19.746 10:52:18 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:19.746 10:52:18 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:19.746 10:52:18 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:20.686 Creating new GPT entries in memory. 00:05:20.686 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:20.686 other utilities. 00:05:20.686 10:52:19 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:20.686 10:52:19 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:20.686 10:52:19 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:20.686 10:52:19 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:20.686 10:52:19 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:22.066 Creating new GPT entries in memory. 00:05:22.066 The operation has completed successfully. 00:05:22.066 10:52:20 -- setup/common.sh@57 -- # (( part++ )) 00:05:22.066 10:52:20 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:22.066 10:52:20 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:22.066 10:52:20 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:22.066 10:52:20 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:23.004 The operation has completed successfully. 
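The sgdisk calls above run under flock on the whole-disk node, so concurrent writers of the partition table are serialized, and sync_dev_uevents.sh keeps the test from touching the partitions until the kernel has announced them. A simplified stand-in for that pattern is sketched below; it polls for the partition block nodes with a timeout rather than listening for uevents the way SPDK's helper does, so it is illustrative only:

#!/usr/bin/env bash
# Illustrative only: serialize sgdisk with flock, then wait for the kernel
# to publish the new partition nodes (SPDK's sync_dev_uevents.sh listens for
# uevents instead of polling).
disk=/dev/nvme0n1

flock "$disk" sgdisk "$disk" --new=1:2048:2099199
flock "$disk" sgdisk "$disk" --new=2:2099200:4196351

for part in "${disk}p1" "${disk}p2"; do
    for (( i = 0; i < 100; i++ )); do      # ~10 s timeout
        [[ -b $part ]] && break
        sleep 0.1
    done
    [[ -b $part ]] || { echo "timed out waiting for $part" >&2; exit 1; }
done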
00:05:23.004 10:52:21 -- setup/common.sh@57 -- # (( part++ )) 00:05:23.004 10:52:21 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:23.004 10:52:21 -- setup/common.sh@62 -- # wait 615914 00:05:23.004 10:52:21 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:23.004 10:52:21 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:23.004 10:52:21 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:23.004 10:52:21 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:23.004 10:52:21 -- setup/devices.sh@160 -- # for t in {1..5} 00:05:23.004 10:52:21 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:23.004 10:52:21 -- setup/devices.sh@161 -- # break 00:05:23.004 10:52:21 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:23.004 10:52:21 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:23.004 10:52:21 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:23.004 10:52:21 -- setup/devices.sh@166 -- # dm=dm-0 00:05:23.004 10:52:21 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:23.004 10:52:21 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:23.004 10:52:21 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:23.004 10:52:21 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:23.004 10:52:21 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:23.004 10:52:21 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:23.004 10:52:21 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:23.004 10:52:21 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:23.004 10:52:21 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:23.004 10:52:21 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:23.004 10:52:21 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:23.004 10:52:21 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:23.004 10:52:21 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:23.004 10:52:21 -- setup/devices.sh@53 -- # local found=0 00:05:23.004 10:52:21 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:23.004 10:52:21 -- setup/devices.sh@56 -- # : 00:05:23.004 10:52:21 -- setup/devices.sh@59 -- # local pci status 00:05:23.004 10:52:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.004 10:52:21 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:23.004 10:52:21 -- setup/devices.sh@47 -- # setup output config 00:05:23.004 10:52:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:23.004 10:52:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:26.296 10:52:24 -- 
setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:26.296 10:52:24 -- setup/devices.sh@63 -- # found=1 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:26.296 10:52:24 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:26.296 10:52:24 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:26.296 10:52:24 -- setup/devices.sh@73 -- # 
[[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:26.296 10:52:24 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:26.296 10:52:24 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:26.296 10:52:24 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:26.296 10:52:24 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:26.296 10:52:24 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:26.296 10:52:24 -- setup/devices.sh@50 -- # local mount_point= 00:05:26.296 10:52:24 -- setup/devices.sh@51 -- # local test_file= 00:05:26.296 10:52:24 -- setup/devices.sh@53 -- # local found=0 00:05:26.296 10:52:24 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:26.296 10:52:24 -- setup/devices.sh@59 -- # local pci status 00:05:26.296 10:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:26.296 10:52:24 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:26.296 10:52:24 -- setup/devices.sh@47 -- # setup output config 00:05:26.296 10:52:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:26.296 10:52:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:29.590 10:52:27 -- setup/devices.sh@63 -- # found=1 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.590 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.590 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.591 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.591 10:52:27 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.591 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.591 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.591 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.591 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.591 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.591 10:52:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:29.591 10:52:27 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:29.591 10:52:27 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:29.591 10:52:27 -- setup/devices.sh@68 -- # return 0 00:05:29.591 10:52:27 -- setup/devices.sh@187 -- # cleanup_dm 00:05:29.591 10:52:27 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:29.591 10:52:27 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:29.591 10:52:27 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:29.591 10:52:27 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:29.591 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:29.591 10:52:27 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:29.591 00:05:29.591 real 0m9.535s 00:05:29.591 user 0m2.210s 00:05:29.591 sys 0m4.233s 00:05:29.591 10:52:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:29.591 10:52:27 -- common/autotest_common.sh@10 -- # set +x 00:05:29.591 ************************************ 00:05:29.591 END TEST dm_mount 00:05:29.591 ************************************ 00:05:29.591 10:52:27 -- setup/devices.sh@1 -- # cleanup 00:05:29.591 10:52:27 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:29.591 10:52:27 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:29.591 10:52:27 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:29.591 10:52:27 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:29.591 10:52:27 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:29.591 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:29.591 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:29.591 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:29.591 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:29.591 10:52:28 -- setup/devices.sh@12 -- # cleanup_dm 00:05:29.591 10:52:28 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:29.591 10:52:28 -- setup/devices.sh@36 -- # [[ -L 
/dev/mapper/nvme_dm_test ]] 00:05:29.591 10:52:28 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:29.591 10:52:28 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:29.591 10:52:28 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:29.591 10:52:28 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:29.591 00:05:29.591 real 0m26.403s 00:05:29.591 user 0m7.494s 00:05:29.591 sys 0m13.722s 00:05:29.591 10:52:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:29.591 10:52:28 -- common/autotest_common.sh@10 -- # set +x 00:05:29.591 ************************************ 00:05:29.591 END TEST devices 00:05:29.591 ************************************ 00:05:29.591 00:05:29.591 real 1m32.968s 00:05:29.591 user 0m28.979s 00:05:29.591 sys 0m52.908s 00:05:29.591 10:52:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:29.591 10:52:28 -- common/autotest_common.sh@10 -- # set +x 00:05:29.591 ************************************ 00:05:29.591 END TEST setup.sh 00:05:29.591 ************************************ 00:05:29.850 10:52:28 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:33.141 Hugepages 00:05:33.141 node hugesize free / total 00:05:33.141 node0 1048576kB 0 / 0 00:05:33.141 node0 2048kB 2048 / 2048 00:05:33.141 node1 1048576kB 0 / 0 00:05:33.141 node1 2048kB 0 / 0 00:05:33.141 00:05:33.141 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:33.141 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:33.141 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:33.141 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:33.141 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:33.141 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:33.141 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:33.141 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:33.141 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:33.141 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:33.141 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:33.141 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:33.141 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:33.141 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:33.141 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:33.141 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:33.141 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:33.141 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:33.141 10:52:31 -- spdk/autotest.sh@128 -- # uname -s 00:05:33.141 10:52:31 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:05:33.141 10:52:31 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:05:33.141 10:52:31 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:35.678 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 
0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:35.678 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:37.061 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:37.321 10:52:35 -- common/autotest_common.sh@1527 -- # sleep 1 00:05:38.259 10:52:36 -- common/autotest_common.sh@1528 -- # bdfs=() 00:05:38.259 10:52:36 -- common/autotest_common.sh@1528 -- # local bdfs 00:05:38.259 10:52:36 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:05:38.259 10:52:36 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:05:38.259 10:52:36 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:38.259 10:52:36 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:38.259 10:52:36 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:38.260 10:52:36 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:38.260 10:52:36 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:38.519 10:52:36 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:38.519 10:52:36 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:38.519 10:52:36 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:41.812 Waiting for block devices as requested 00:05:41.812 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:41.812 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:41.812 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:41.812 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:41.812 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:41.812 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:41.812 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:42.072 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:42.072 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:42.072 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:42.331 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:42.331 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:42.331 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:42.590 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:42.590 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:42.590 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:42.850 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:42.850 10:52:41 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:05:42.850 10:52:41 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:42.850 10:52:41 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:05:42.850 10:52:41 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:05:42.850 10:52:41 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:42.850 10:52:41 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:42.850 10:52:41 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:42.850 10:52:41 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:05:42.850 10:52:41 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:05:42.850 10:52:41 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:05:42.850 10:52:41 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:42.850 
10:52:41 -- common/autotest_common.sh@1540 -- # grep oacs 00:05:42.850 10:52:41 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:42.850 10:52:41 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:05:42.850 10:52:41 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:05:42.850 10:52:41 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:05:42.850 10:52:41 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:05:42.850 10:52:41 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:05:42.850 10:52:41 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:05:42.850 10:52:41 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:05:42.850 10:52:41 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:05:42.850 10:52:41 -- common/autotest_common.sh@1552 -- # continue 00:05:42.850 10:52:41 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:05:42.850 10:52:41 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:42.850 10:52:41 -- common/autotest_common.sh@10 -- # set +x 00:05:43.109 10:52:41 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:05:43.109 10:52:41 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:43.109 10:52:41 -- common/autotest_common.sh@10 -- # set +x 00:05:43.109 10:52:41 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:46.402 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:46.402 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:46.403 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:46.403 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:46.403 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:48.314 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:48.314 10:52:46 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:05:48.314 10:52:46 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:48.314 10:52:46 -- common/autotest_common.sh@10 -- # set +x 00:05:48.314 10:52:46 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:05:48.314 10:52:46 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:05:48.314 10:52:46 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:05:48.314 10:52:46 -- common/autotest_common.sh@1572 -- # bdfs=() 00:05:48.314 10:52:46 -- common/autotest_common.sh@1572 -- # local bdfs 00:05:48.314 10:52:46 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:05:48.314 10:52:46 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:48.314 10:52:46 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:48.314 10:52:46 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:48.314 10:52:46 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:48.314 10:52:46 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:48.314 10:52:46 -- 
common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:48.314 10:52:46 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:48.314 10:52:46 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:05:48.314 10:52:46 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:48.314 10:52:46 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:05:48.314 10:52:46 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:48.314 10:52:46 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:05:48.314 10:52:46 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:05:48.314 10:52:46 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:05:48.314 10:52:46 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=625549 00:05:48.314 10:52:46 -- common/autotest_common.sh@1593 -- # waitforlisten 625549 00:05:48.314 10:52:46 -- common/autotest_common.sh@829 -- # '[' -z 625549 ']' 00:05:48.314 10:52:46 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.314 10:52:46 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:48.314 10:52:46 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.314 10:52:46 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:48.314 10:52:46 -- common/autotest_common.sh@10 -- # set +x 00:05:48.314 10:52:46 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.315 [2024-12-16 10:52:46.692789] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
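The get_nvme_bdfs_by_id filter traced above keeps only controllers whose PCI device ID matches 0x0a54, by reading the device attribute the kernel exposes under /sys/bus/pci/devices/<bdf>/. A minimal standalone sketch of the same check, walking sysfs directly instead of parsing gen_nvme.sh output (0x010802 is the PCI class code for an NVMe controller):

  # Print the BDF of every NVMe controller whose PCI device ID matches
  # the first argument (defaults to 0x0a54, the ID checked in this run).
  want_id=${1:-0x0a54}
  for dev in /sys/bus/pci/devices/*; do
      [[ $(cat "$dev/class") == 0x010802 ]] || continue      # NVMe class code
      [[ $(cat "$dev/device") == "$want_id" ]] && basename "$dev"
  done

On this machine it would print 0000:d8:00.0, matching the bdfs+=($bdf) step in the trace.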
00:05:48.315 [2024-12-16 10:52:46.692860] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid625549 ] 00:05:48.315 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.315 [2024-12-16 10:52:46.760635] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.315 [2024-12-16 10:52:46.797992] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:48.315 [2024-12-16 10:52:46.798100] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.883 10:52:47 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:48.883 10:52:47 -- common/autotest_common.sh@862 -- # return 0 00:05:48.883 10:52:47 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:05:48.883 10:52:47 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:05:48.883 10:52:47 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:52.175 nvme0n1 00:05:52.175 10:52:50 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:52.175 [2024-12-16 10:52:50.632377] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:52.175 request: 00:05:52.175 { 00:05:52.175 "nvme_ctrlr_name": "nvme0", 00:05:52.175 "password": "test", 00:05:52.175 "method": "bdev_nvme_opal_revert", 00:05:52.175 "req_id": 1 00:05:52.175 } 00:05:52.175 Got JSON-RPC error response 00:05:52.175 response: 00:05:52.175 { 00:05:52.175 "code": -32602, 00:05:52.175 "message": "Invalid parameters" 00:05:52.175 } 00:05:52.175 10:52:50 -- common/autotest_common.sh@1599 -- # true 00:05:52.175 10:52:50 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:05:52.175 10:52:50 -- common/autotest_common.sh@1603 -- # killprocess 625549 00:05:52.175 10:52:50 -- common/autotest_common.sh@936 -- # '[' -z 625549 ']' 00:05:52.175 10:52:50 -- common/autotest_common.sh@940 -- # kill -0 625549 00:05:52.175 10:52:50 -- common/autotest_common.sh@941 -- # uname 00:05:52.175 10:52:50 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:52.175 10:52:50 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 625549 00:05:52.175 10:52:50 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:52.175 10:52:50 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:52.175 10:52:50 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 625549' 00:05:52.175 killing process with pid 625549 00:05:52.175 10:52:50 -- common/autotest_common.sh@955 -- # kill 625549 00:05:52.175 10:52:50 -- common/autotest_common.sh@960 -- # wait 625549 00:05:54.713 10:52:52 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:05:54.713 10:52:52 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:05:54.713 10:52:52 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:54.713 10:52:52 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:54.713 10:52:52 -- spdk/autotest.sh@160 -- # timing_enter lib 00:05:54.713 10:52:52 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:54.713 10:52:52 -- common/autotest_common.sh@10 -- # set +x 00:05:54.713 10:52:52 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:54.713 10:52:52 -- 
common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:54.713 10:52:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.713 10:52:52 -- common/autotest_common.sh@10 -- # set +x 00:05:54.713 ************************************ 00:05:54.713 START TEST env 00:05:54.713 ************************************ 00:05:54.713 10:52:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:54.713 * Looking for test storage... 00:05:54.713 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:54.713 10:52:52 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:54.713 10:52:52 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:54.713 10:52:52 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:54.713 10:52:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:54.713 10:52:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:54.713 10:52:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:54.713 10:52:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:54.713 10:52:53 -- scripts/common.sh@335 -- # IFS=.-: 00:05:54.713 10:52:53 -- scripts/common.sh@335 -- # read -ra ver1 00:05:54.713 10:52:53 -- scripts/common.sh@336 -- # IFS=.-: 00:05:54.713 10:52:53 -- scripts/common.sh@336 -- # read -ra ver2 00:05:54.713 10:52:53 -- scripts/common.sh@337 -- # local 'op=<' 00:05:54.713 10:52:53 -- scripts/common.sh@339 -- # ver1_l=2 00:05:54.713 10:52:53 -- scripts/common.sh@340 -- # ver2_l=1 00:05:54.713 10:52:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:54.713 10:52:53 -- scripts/common.sh@343 -- # case "$op" in 00:05:54.713 10:52:53 -- scripts/common.sh@344 -- # : 1 00:05:54.713 10:52:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:54.713 10:52:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:54.713 10:52:53 -- scripts/common.sh@364 -- # decimal 1 00:05:54.713 10:52:53 -- scripts/common.sh@352 -- # local d=1 00:05:54.713 10:52:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:54.713 10:52:53 -- scripts/common.sh@354 -- # echo 1 00:05:54.713 10:52:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:54.713 10:52:53 -- scripts/common.sh@365 -- # decimal 2 00:05:54.713 10:52:53 -- scripts/common.sh@352 -- # local d=2 00:05:54.713 10:52:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:54.713 10:52:53 -- scripts/common.sh@354 -- # echo 2 00:05:54.713 10:52:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:54.713 10:52:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:54.713 10:52:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:54.713 10:52:53 -- scripts/common.sh@367 -- # return 0 00:05:54.713 10:52:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:54.713 10:52:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:54.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:54.713 --rc genhtml_branch_coverage=1 00:05:54.713 --rc genhtml_function_coverage=1 00:05:54.713 --rc genhtml_legend=1 00:05:54.713 --rc geninfo_all_blocks=1 00:05:54.713 --rc geninfo_unexecuted_blocks=1 00:05:54.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:54.713 ' 00:05:54.713 10:52:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:54.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:54.713 --rc genhtml_branch_coverage=1 00:05:54.713 --rc genhtml_function_coverage=1 00:05:54.713 --rc genhtml_legend=1 00:05:54.713 --rc geninfo_all_blocks=1 00:05:54.713 --rc geninfo_unexecuted_blocks=1 00:05:54.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:54.713 ' 00:05:54.713 10:52:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:54.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:54.713 --rc genhtml_branch_coverage=1 00:05:54.713 --rc genhtml_function_coverage=1 00:05:54.713 --rc genhtml_legend=1 00:05:54.713 --rc geninfo_all_blocks=1 00:05:54.713 --rc geninfo_unexecuted_blocks=1 00:05:54.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:54.713 ' 00:05:54.713 10:52:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:54.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:54.713 --rc genhtml_branch_coverage=1 00:05:54.713 --rc genhtml_function_coverage=1 00:05:54.713 --rc genhtml_legend=1 00:05:54.713 --rc geninfo_all_blocks=1 00:05:54.713 --rc geninfo_unexecuted_blocks=1 00:05:54.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:54.713 ' 00:05:54.713 10:52:53 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:54.713 10:52:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:54.713 10:52:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.713 10:52:53 -- common/autotest_common.sh@10 -- # set +x 00:05:54.713 ************************************ 00:05:54.713 START TEST env_memory 00:05:54.713 ************************************ 00:05:54.713 10:52:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
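The lt 1.15 2 call traced above is the version gate that decides which lcov coverage flags to export: cmp_versions splits each version string on dots and compares field by field. A rough standalone equivalent of that comparison, assuming numeric fields only (suffixes such as -pre are not handled):

  # version_lt A B: succeed when version A sorts strictly before B.
  version_lt() {
      local IFS=.-:                  # split fields the way cmp_versions does
      local -a a=($1) b=($2)
      local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
      for (( i = 0; i < n; i++ )); do
          (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
          (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
      done
      return 1                       # equal versions are not less-than
  }
  version_lt 1.15 2 && echo older    # prints "older", as in the trace above

The memory_ut output that follows is unaffected by this gate; it only selects the LCOV_OPTS used for coverage reporting.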
00:05:54.713 00:05:54.713 00:05:54.713 CUnit - A unit testing framework for C - Version 2.1-3 00:05:54.713 http://cunit.sourceforge.net/ 00:05:54.713 00:05:54.713 00:05:54.713 Suite: memory 00:05:54.713 Test: alloc and free memory map ...[2024-12-16 10:52:53.123983] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:54.713 passed 00:05:54.713 Test: mem map translation ...[2024-12-16 10:52:53.136578] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:54.713 [2024-12-16 10:52:53.136595] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:54.713 [2024-12-16 10:52:53.136631] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:54.713 [2024-12-16 10:52:53.136640] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:54.713 passed 00:05:54.713 Test: mem map registration ...[2024-12-16 10:52:53.156750] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:54.713 [2024-12-16 10:52:53.156766] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:54.713 passed 00:05:54.713 Test: mem map adjacent registrations ...passed 00:05:54.713 00:05:54.713 Run Summary: Type Total Ran Passed Failed Inactive 00:05:54.713 suites 1 1 n/a 0 0 00:05:54.713 tests 4 4 4 0 0 00:05:54.713 asserts 152 152 152 0 n/a 00:05:54.713 00:05:54.713 Elapsed time = 0.082 seconds 00:05:54.713 00:05:54.713 real 0m0.095s 00:05:54.713 user 0m0.083s 00:05:54.713 sys 0m0.012s 00:05:54.713 10:52:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:54.714 10:52:53 -- common/autotest_common.sh@10 -- # set +x 00:05:54.714 ************************************ 00:05:54.714 END TEST env_memory 00:05:54.714 ************************************ 00:05:54.714 10:52:53 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:54.714 10:52:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:54.714 10:52:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.714 10:52:53 -- common/autotest_common.sh@10 -- # set +x 00:05:54.714 ************************************ 00:05:54.714 START TEST env_vtophys 00:05:54.714 ************************************ 00:05:54.714 10:52:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:54.714 EAL: lib.eal log level changed from notice to debug 00:05:54.714 EAL: Detected lcore 0 as core 0 on socket 0 00:05:54.714 EAL: Detected lcore 1 as core 1 on socket 0 00:05:54.714 EAL: Detected lcore 2 as core 2 on socket 0 00:05:54.714 EAL: Detected lcore 3 as core 3 on socket 0 00:05:54.714 EAL: Detected lcore 4 as core 4 on socket 0 00:05:54.714 EAL: Detected lcore 5 as core 5 on socket 0 00:05:54.714 EAL: Detected lcore 6 as 
core 6 on socket 0 00:05:54.714 EAL: Detected lcore 7 as core 8 on socket 0 00:05:54.714 EAL: Detected lcore 8 as core 9 on socket 0 00:05:54.714 EAL: Detected lcore 9 as core 10 on socket 0 00:05:54.714 EAL: Detected lcore 10 as core 11 on socket 0 00:05:54.714 EAL: Detected lcore 11 as core 12 on socket 0 00:05:54.714 EAL: Detected lcore 12 as core 13 on socket 0 00:05:54.714 EAL: Detected lcore 13 as core 14 on socket 0 00:05:54.714 EAL: Detected lcore 14 as core 16 on socket 0 00:05:54.714 EAL: Detected lcore 15 as core 17 on socket 0 00:05:54.714 EAL: Detected lcore 16 as core 18 on socket 0 00:05:54.714 EAL: Detected lcore 17 as core 19 on socket 0 00:05:54.714 EAL: Detected lcore 18 as core 20 on socket 0 00:05:54.714 EAL: Detected lcore 19 as core 21 on socket 0 00:05:54.714 EAL: Detected lcore 20 as core 22 on socket 0 00:05:54.714 EAL: Detected lcore 21 as core 24 on socket 0 00:05:54.714 EAL: Detected lcore 22 as core 25 on socket 0 00:05:54.714 EAL: Detected lcore 23 as core 26 on socket 0 00:05:54.714 EAL: Detected lcore 24 as core 27 on socket 0 00:05:54.714 EAL: Detected lcore 25 as core 28 on socket 0 00:05:54.714 EAL: Detected lcore 26 as core 29 on socket 0 00:05:54.714 EAL: Detected lcore 27 as core 30 on socket 0 00:05:54.714 EAL: Detected lcore 28 as core 0 on socket 1 00:05:54.714 EAL: Detected lcore 29 as core 1 on socket 1 00:05:54.714 EAL: Detected lcore 30 as core 2 on socket 1 00:05:54.714 EAL: Detected lcore 31 as core 3 on socket 1 00:05:54.714 EAL: Detected lcore 32 as core 4 on socket 1 00:05:54.714 EAL: Detected lcore 33 as core 5 on socket 1 00:05:54.714 EAL: Detected lcore 34 as core 6 on socket 1 00:05:54.714 EAL: Detected lcore 35 as core 8 on socket 1 00:05:54.714 EAL: Detected lcore 36 as core 9 on socket 1 00:05:54.714 EAL: Detected lcore 37 as core 10 on socket 1 00:05:54.714 EAL: Detected lcore 38 as core 11 on socket 1 00:05:54.714 EAL: Detected lcore 39 as core 12 on socket 1 00:05:54.714 EAL: Detected lcore 40 as core 13 on socket 1 00:05:54.714 EAL: Detected lcore 41 as core 14 on socket 1 00:05:54.714 EAL: Detected lcore 42 as core 16 on socket 1 00:05:54.714 EAL: Detected lcore 43 as core 17 on socket 1 00:05:54.714 EAL: Detected lcore 44 as core 18 on socket 1 00:05:54.714 EAL: Detected lcore 45 as core 19 on socket 1 00:05:54.714 EAL: Detected lcore 46 as core 20 on socket 1 00:05:54.714 EAL: Detected lcore 47 as core 21 on socket 1 00:05:54.714 EAL: Detected lcore 48 as core 22 on socket 1 00:05:54.714 EAL: Detected lcore 49 as core 24 on socket 1 00:05:54.714 EAL: Detected lcore 50 as core 25 on socket 1 00:05:54.714 EAL: Detected lcore 51 as core 26 on socket 1 00:05:54.714 EAL: Detected lcore 52 as core 27 on socket 1 00:05:54.714 EAL: Detected lcore 53 as core 28 on socket 1 00:05:54.714 EAL: Detected lcore 54 as core 29 on socket 1 00:05:54.714 EAL: Detected lcore 55 as core 30 on socket 1 00:05:54.714 EAL: Detected lcore 56 as core 0 on socket 0 00:05:54.714 EAL: Detected lcore 57 as core 1 on socket 0 00:05:54.714 EAL: Detected lcore 58 as core 2 on socket 0 00:05:54.714 EAL: Detected lcore 59 as core 3 on socket 0 00:05:54.714 EAL: Detected lcore 60 as core 4 on socket 0 00:05:54.714 EAL: Detected lcore 61 as core 5 on socket 0 00:05:54.714 EAL: Detected lcore 62 as core 6 on socket 0 00:05:54.714 EAL: Detected lcore 63 as core 8 on socket 0 00:05:54.714 EAL: Detected lcore 64 as core 9 on socket 0 00:05:54.714 EAL: Detected lcore 65 as core 10 on socket 0 00:05:54.714 EAL: Detected lcore 66 as core 11 on socket 0 00:05:54.714 EAL: 
Detected lcore 67 as core 12 on socket 0 00:05:54.714 EAL: Detected lcore 68 as core 13 on socket 0 00:05:54.714 EAL: Detected lcore 69 as core 14 on socket 0 00:05:54.714 EAL: Detected lcore 70 as core 16 on socket 0 00:05:54.714 EAL: Detected lcore 71 as core 17 on socket 0 00:05:54.714 EAL: Detected lcore 72 as core 18 on socket 0 00:05:54.714 EAL: Detected lcore 73 as core 19 on socket 0 00:05:54.714 EAL: Detected lcore 74 as core 20 on socket 0 00:05:54.714 EAL: Detected lcore 75 as core 21 on socket 0 00:05:54.714 EAL: Detected lcore 76 as core 22 on socket 0 00:05:54.714 EAL: Detected lcore 77 as core 24 on socket 0 00:05:54.714 EAL: Detected lcore 78 as core 25 on socket 0 00:05:54.714 EAL: Detected lcore 79 as core 26 on socket 0 00:05:54.714 EAL: Detected lcore 80 as core 27 on socket 0 00:05:54.714 EAL: Detected lcore 81 as core 28 on socket 0 00:05:54.714 EAL: Detected lcore 82 as core 29 on socket 0 00:05:54.714 EAL: Detected lcore 83 as core 30 on socket 0 00:05:54.714 EAL: Detected lcore 84 as core 0 on socket 1 00:05:54.714 EAL: Detected lcore 85 as core 1 on socket 1 00:05:54.714 EAL: Detected lcore 86 as core 2 on socket 1 00:05:54.714 EAL: Detected lcore 87 as core 3 on socket 1 00:05:54.714 EAL: Detected lcore 88 as core 4 on socket 1 00:05:54.714 EAL: Detected lcore 89 as core 5 on socket 1 00:05:54.714 EAL: Detected lcore 90 as core 6 on socket 1 00:05:54.714 EAL: Detected lcore 91 as core 8 on socket 1 00:05:54.714 EAL: Detected lcore 92 as core 9 on socket 1 00:05:54.714 EAL: Detected lcore 93 as core 10 on socket 1 00:05:54.714 EAL: Detected lcore 94 as core 11 on socket 1 00:05:54.714 EAL: Detected lcore 95 as core 12 on socket 1 00:05:54.714 EAL: Detected lcore 96 as core 13 on socket 1 00:05:54.714 EAL: Detected lcore 97 as core 14 on socket 1 00:05:54.714 EAL: Detected lcore 98 as core 16 on socket 1 00:05:54.714 EAL: Detected lcore 99 as core 17 on socket 1 00:05:54.714 EAL: Detected lcore 100 as core 18 on socket 1 00:05:54.714 EAL: Detected lcore 101 as core 19 on socket 1 00:05:54.714 EAL: Detected lcore 102 as core 20 on socket 1 00:05:54.714 EAL: Detected lcore 103 as core 21 on socket 1 00:05:54.714 EAL: Detected lcore 104 as core 22 on socket 1 00:05:54.714 EAL: Detected lcore 105 as core 24 on socket 1 00:05:54.714 EAL: Detected lcore 106 as core 25 on socket 1 00:05:54.714 EAL: Detected lcore 107 as core 26 on socket 1 00:05:54.714 EAL: Detected lcore 108 as core 27 on socket 1 00:05:54.714 EAL: Detected lcore 109 as core 28 on socket 1 00:05:54.714 EAL: Detected lcore 110 as core 29 on socket 1 00:05:54.714 EAL: Detected lcore 111 as core 30 on socket 1 00:05:54.714 EAL: Maximum logical cores by configuration: 128 00:05:54.714 EAL: Detected CPU lcores: 112 00:05:54.714 EAL: Detected NUMA nodes: 2 00:05:54.714 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:54.714 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:54.714 EAL: Checking presence of .so 'librte_eal.so' 00:05:54.714 EAL: Detected static linkage of DPDK 00:05:54.714 EAL: No shared files mode enabled, IPC will be disabled 00:05:54.714 EAL: Bus pci wants IOVA as 'DC' 00:05:54.714 EAL: Buses did not request a specific IOVA mode. 00:05:54.714 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:54.714 EAL: Selected IOVA mode 'VA' 00:05:54.714 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.714 EAL: Probing VFIO support... 
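The lcore inventory EAL prints above (112 lcores on 2 sockets) comes from the CPU topology the kernel exports through sysfs, essentially the same files EAL consults. A quick way to reproduce the core/socket pairing without DPDK:

  # Rebuild EAL's "lcore N as core C on socket S" map from sysfs.
  for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
      n=${cpu##*cpu}
      core=$(cat "$cpu/topology/core_id")
      sock=$(cat "$cpu/topology/physical_package_id")
      echo "lcore $n as core $core on socket $sock"
  done | sort -n -k2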
00:05:54.714 EAL: IOMMU type 1 (Type 1) is supported 00:05:54.714 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:54.714 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:54.714 EAL: VFIO support initialized 00:05:54.714 EAL: Ask a virtual area of 0x2e000 bytes 00:05:54.714 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:54.714 EAL: Setting up physically contiguous memory... 00:05:54.714 EAL: Setting maximum number of open files to 524288 00:05:54.714 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:54.714 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:54.714 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:54.714 EAL: Ask a virtual area of 0x61000 bytes 00:05:54.714 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:54.714 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:54.714 EAL: Ask a virtual area of 0x400000000 bytes 00:05:54.714 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:54.714 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:54.714 EAL: Ask a virtual area of 0x61000 bytes 00:05:54.714 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:54.714 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:54.714 EAL: Ask a virtual area of 0x400000000 bytes 00:05:54.714 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:54.714 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:54.714 EAL: Ask a virtual area of 0x61000 bytes 00:05:54.714 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:54.714 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:54.714 EAL: Ask a virtual area of 0x400000000 bytes 00:05:54.714 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:54.714 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:54.714 EAL: Ask a virtual area of 0x61000 bytes 00:05:54.714 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:54.714 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:54.714 EAL: Ask a virtual area of 0x400000000 bytes 00:05:54.714 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:54.714 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:54.714 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:54.714 EAL: Ask a virtual area of 0x61000 bytes 00:05:54.714 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:54.714 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:54.714 EAL: Ask a virtual area of 0x400000000 bytes 00:05:54.714 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:54.714 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:54.714 EAL: Ask a virtual area of 0x61000 bytes 00:05:54.714 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:54.714 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:54.714 EAL: Ask a virtual area of 0x400000000 bytes 00:05:54.714 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:54.714 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:54.714 EAL: Ask a virtual area of 0x61000 bytes 00:05:54.714 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:54.714 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:54.714 EAL: Ask a virtual area of 0x400000000 bytes 00:05:54.715 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:54.715 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:54.715 EAL: Ask a virtual area of 0x61000 bytes 00:05:54.715 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:54.715 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:54.715 EAL: Ask a virtual area of 0x400000000 bytes 00:05:54.715 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:54.715 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:54.715 EAL: Hugepages will be freed exactly as allocated. 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: TSC frequency is ~2500000 KHz 00:05:54.715 EAL: Main lcore 0 is ready (tid=7fe25d9cea00;cpuset=[0]) 00:05:54.715 EAL: Trying to obtain current memory policy. 00:05:54.715 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.715 EAL: Restoring previous memory policy: 0 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was expanded by 2MB 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Mem event callback 'spdk:(nil)' registered 00:05:54.715 00:05:54.715 00:05:54.715 CUnit - A unit testing framework for C - Version 2.1-3 00:05:54.715 http://cunit.sourceforge.net/ 00:05:54.715 00:05:54.715 00:05:54.715 Suite: components_suite 00:05:54.715 Test: vtophys_malloc_test ...passed 00:05:54.715 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:54.715 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.715 EAL: Restoring previous memory policy: 4 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was expanded by 4MB 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was shrunk by 4MB 00:05:54.715 EAL: Trying to obtain current memory policy. 00:05:54.715 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.715 EAL: Restoring previous memory policy: 4 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was expanded by 6MB 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was shrunk by 6MB 00:05:54.715 EAL: Trying to obtain current memory policy. 00:05:54.715 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.715 EAL: Restoring previous memory policy: 4 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was expanded by 10MB 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was shrunk by 10MB 00:05:54.715 EAL: Trying to obtain current memory policy. 
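Each "Heap on socket 0 was expanded by ..." line in this suite corresponds to EAL backing the malloc heap with more 2 MB hugepages, and each shrink returns them (hence "Hugepages will be freed exactly as allocated"). One way to watch that from outside the test is to poll the per-node hugepage counters while it runs, for example:

  # Poll free 2 MB hugepages per NUMA node; run alongside the vtophys
  # test and the node0 free count dips on every expansion. Ctrl-C to stop.
  while sleep 1; do
      for node in /sys/devices/system/node/node[0-9]*; do
          free=$(cat "$node/hugepages/hugepages-2048kB/free_hugepages")
          total=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
          printf '%s: %s/%s free\n' "${node##*/}" "$free" "$total"
      done
  done

The Hugepages table earlier in this section showed node0 with 2048 such pages and node1 with none, which is consistent with every expansion landing on socket 0.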
00:05:54.715 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.715 EAL: Restoring previous memory policy: 4 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was expanded by 18MB 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was shrunk by 18MB 00:05:54.715 EAL: Trying to obtain current memory policy. 00:05:54.715 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.715 EAL: Restoring previous memory policy: 4 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.715 EAL: request: mp_malloc_sync 00:05:54.715 EAL: No shared files mode enabled, IPC is disabled 00:05:54.715 EAL: Heap on socket 0 was expanded by 34MB 00:05:54.715 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.975 EAL: request: mp_malloc_sync 00:05:54.975 EAL: No shared files mode enabled, IPC is disabled 00:05:54.975 EAL: Heap on socket 0 was shrunk by 34MB 00:05:54.975 EAL: Trying to obtain current memory policy. 00:05:54.975 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.975 EAL: Restoring previous memory policy: 4 00:05:54.975 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.975 EAL: request: mp_malloc_sync 00:05:54.975 EAL: No shared files mode enabled, IPC is disabled 00:05:54.975 EAL: Heap on socket 0 was expanded by 66MB 00:05:54.975 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.975 EAL: request: mp_malloc_sync 00:05:54.975 EAL: No shared files mode enabled, IPC is disabled 00:05:54.975 EAL: Heap on socket 0 was shrunk by 66MB 00:05:54.975 EAL: Trying to obtain current memory policy. 00:05:54.975 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.975 EAL: Restoring previous memory policy: 4 00:05:54.975 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.975 EAL: request: mp_malloc_sync 00:05:54.975 EAL: No shared files mode enabled, IPC is disabled 00:05:54.975 EAL: Heap on socket 0 was expanded by 130MB 00:05:54.975 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.975 EAL: request: mp_malloc_sync 00:05:54.975 EAL: No shared files mode enabled, IPC is disabled 00:05:54.975 EAL: Heap on socket 0 was shrunk by 130MB 00:05:54.975 EAL: Trying to obtain current memory policy. 00:05:54.975 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.975 EAL: Restoring previous memory policy: 4 00:05:54.975 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.975 EAL: request: mp_malloc_sync 00:05:54.975 EAL: No shared files mode enabled, IPC is disabled 00:05:54.975 EAL: Heap on socket 0 was expanded by 258MB 00:05:54.975 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.975 EAL: request: mp_malloc_sync 00:05:54.975 EAL: No shared files mode enabled, IPC is disabled 00:05:54.975 EAL: Heap on socket 0 was shrunk by 258MB 00:05:54.975 EAL: Trying to obtain current memory policy. 
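The recurring "Setting policy MPOL_PREFERRED for socket 0" lines are EAL applying a preferred NUMA memory policy before each trial allocation. The same preference can be put on a whole process with numactl; a small illustration, where my_benchmark is a placeholder command and not part of this test:

  # Prefer node 0 for the process's page allocations, falling back to
  # other nodes only if node 0 runs out of memory.
  numactl --preferred=0 ./my_benchmark &
  numastat -p $!        # show the resulting per-node page placement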
00:05:54.975 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:55.234 EAL: Restoring previous memory policy: 4 00:05:55.234 EAL: Calling mem event callback 'spdk:(nil)' 00:05:55.234 EAL: request: mp_malloc_sync 00:05:55.234 EAL: No shared files mode enabled, IPC is disabled 00:05:55.234 EAL: Heap on socket 0 was expanded by 514MB 00:05:55.234 EAL: Calling mem event callback 'spdk:(nil)' 00:05:55.234 EAL: request: mp_malloc_sync 00:05:55.234 EAL: No shared files mode enabled, IPC is disabled 00:05:55.234 EAL: Heap on socket 0 was shrunk by 514MB 00:05:55.234 EAL: Trying to obtain current memory policy. 00:05:55.234 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:55.493 EAL: Restoring previous memory policy: 4 00:05:55.493 EAL: Calling mem event callback 'spdk:(nil)' 00:05:55.493 EAL: request: mp_malloc_sync 00:05:55.493 EAL: No shared files mode enabled, IPC is disabled 00:05:55.493 EAL: Heap on socket 0 was expanded by 1026MB 00:05:55.753 EAL: Calling mem event callback 'spdk:(nil)' 00:05:55.753 EAL: request: mp_malloc_sync 00:05:55.753 EAL: No shared files mode enabled, IPC is disabled 00:05:55.753 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:55.753 passed 00:05:55.753 00:05:55.753 Run Summary: Type Total Ran Passed Failed Inactive 00:05:55.753 suites 1 1 n/a 0 0 00:05:55.753 tests 2 2 2 0 0 00:05:55.753 asserts 497 497 497 0 n/a 00:05:55.753 00:05:55.753 Elapsed time = 0.960 seconds 00:05:55.753 EAL: Calling mem event callback 'spdk:(nil)' 00:05:55.753 EAL: request: mp_malloc_sync 00:05:55.753 EAL: No shared files mode enabled, IPC is disabled 00:05:55.753 EAL: Heap on socket 0 was shrunk by 2MB 00:05:55.753 EAL: No shared files mode enabled, IPC is disabled 00:05:55.753 EAL: No shared files mode enabled, IPC is disabled 00:05:55.753 EAL: No shared files mode enabled, IPC is disabled 00:05:55.753 00:05:55.753 real 0m1.073s 00:05:55.753 user 0m0.621s 00:05:55.753 sys 0m0.428s 00:05:55.753 10:52:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:55.753 10:52:54 -- common/autotest_common.sh@10 -- # set +x 00:05:55.753 ************************************ 00:05:55.753 END TEST env_vtophys 00:05:55.753 ************************************ 00:05:55.753 10:52:54 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:55.753 10:52:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:55.753 10:52:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:55.753 10:52:54 -- common/autotest_common.sh@10 -- # set +x 00:05:55.753 ************************************ 00:05:55.753 START TEST env_pci 00:05:55.753 ************************************ 00:05:55.753 10:52:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:55.753 00:05:55.753 00:05:55.753 CUnit - A unit testing framework for C - Version 2.1-3 00:05:55.753 http://cunit.sourceforge.net/ 00:05:55.753 00:05:55.753 00:05:55.753 Suite: pci 00:05:55.753 Test: pci_hook ...[2024-12-16 10:52:54.374901] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 627029 has claimed it 00:05:56.013 EAL: Cannot find device (10000:00:01.0) 00:05:56.013 EAL: Failed to attach device on primary process 00:05:56.013 passed 00:05:56.013 00:05:56.013 Run Summary: Type Total Ran Passed Failed Inactive 00:05:56.013 suites 1 1 n/a 0 0 00:05:56.013 tests 1 1 1 0 0 
00:05:56.013 asserts 25 25 25 0 n/a 00:05:56.013 00:05:56.013 Elapsed time = 0.037 seconds 00:05:56.013 00:05:56.013 real 0m0.057s 00:05:56.013 user 0m0.014s 00:05:56.013 sys 0m0.042s 00:05:56.013 10:52:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.013 10:52:54 -- common/autotest_common.sh@10 -- # set +x 00:05:56.013 ************************************ 00:05:56.013 END TEST env_pci 00:05:56.013 ************************************ 00:05:56.013 10:52:54 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:56.013 10:52:54 -- env/env.sh@15 -- # uname 00:05:56.013 10:52:54 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:56.013 10:52:54 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:56.013 10:52:54 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:56.013 10:52:54 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:56.013 10:52:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.013 10:52:54 -- common/autotest_common.sh@10 -- # set +x 00:05:56.013 ************************************ 00:05:56.013 START TEST env_dpdk_post_init 00:05:56.013 ************************************ 00:05:56.013 10:52:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:56.013 EAL: Detected CPU lcores: 112 00:05:56.013 EAL: Detected NUMA nodes: 2 00:05:56.013 EAL: Detected static linkage of DPDK 00:05:56.013 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:56.013 EAL: Selected IOVA mode 'VA' 00:05:56.013 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.013 EAL: VFIO support initialized 00:05:56.013 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:56.013 EAL: Using IOMMU type 1 (Type 1) 00:05:56.959 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:00.248 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:00.248 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:00.507 Starting DPDK initialization... 00:06:00.507 Starting SPDK post initialization... 00:06:00.507 SPDK NVMe probe 00:06:00.507 Attaching to 0000:d8:00.0 00:06:00.507 Attached to 0000:d8:00.0 00:06:00.507 Cleaning up... 
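The probe sequence above only works because setup.sh had already rebound 0000:d8:00.0 from the kernel nvme driver to vfio-pci; the PCI_ALLOWED=0000:d8:00.0 seen at the top of this section is the usual way to scope that rebinding to a single controller. A sketch of the round trip, using the paths that appear throughout this log:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # Hand just this controller to vfio-pci for userspace drivers...
  sudo PCI_ALLOWED=0000:d8:00.0 "$SPDK/scripts/setup.sh" config
  "$SPDK/scripts/setup.sh" status    # Driver column should now read vfio-pci
  # ...and give it back to the kernel nvme driver when done.
  sudo PCI_ALLOWED=0000:d8:00.0 "$SPDK/scripts/setup.sh" reset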
00:06:00.507 00:06:00.507 real 0m4.617s 00:06:00.507 user 0m3.527s 00:06:00.507 sys 0m0.338s 00:06:00.507 10:52:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:00.507 10:52:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.507 ************************************ 00:06:00.507 END TEST env_dpdk_post_init 00:06:00.507 ************************************ 00:06:00.767 10:52:59 -- env/env.sh@26 -- # uname 00:06:00.767 10:52:59 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:00.767 10:52:59 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:00.767 10:52:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:00.767 10:52:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.767 10:52:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.767 ************************************ 00:06:00.767 START TEST env_mem_callbacks 00:06:00.767 ************************************ 00:06:00.767 10:52:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:00.767 EAL: Detected CPU lcores: 112 00:06:00.767 EAL: Detected NUMA nodes: 2 00:06:00.767 EAL: Detected static linkage of DPDK 00:06:00.767 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:00.767 EAL: Selected IOVA mode 'VA' 00:06:00.767 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.767 EAL: VFIO support initialized 00:06:00.767 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:00.767 00:06:00.767 00:06:00.767 CUnit - A unit testing framework for C - Version 2.1-3 00:06:00.767 http://cunit.sourceforge.net/ 00:06:00.767 00:06:00.767 00:06:00.767 Suite: memory 00:06:00.767 Test: test ... 
00:06:00.767 register 0x200000200000 2097152 00:06:00.767 malloc 3145728 00:06:00.767 register 0x200000400000 4194304 00:06:00.767 buf 0x200000500000 len 3145728 PASSED 00:06:00.767 malloc 64 00:06:00.767 buf 0x2000004fff40 len 64 PASSED 00:06:00.767 malloc 4194304 00:06:00.767 register 0x200000800000 6291456 00:06:00.767 buf 0x200000a00000 len 4194304 PASSED 00:06:00.767 free 0x200000500000 3145728 00:06:00.767 free 0x2000004fff40 64 00:06:00.767 unregister 0x200000400000 4194304 PASSED 00:06:00.767 free 0x200000a00000 4194304 00:06:00.767 unregister 0x200000800000 6291456 PASSED 00:06:00.767 malloc 8388608 00:06:00.767 register 0x200000400000 10485760 00:06:00.767 buf 0x200000600000 len 8388608 PASSED 00:06:00.767 free 0x200000600000 8388608 00:06:00.767 unregister 0x200000400000 10485760 PASSED 00:06:00.767 passed 00:06:00.767 00:06:00.767 Run Summary: Type Total Ran Passed Failed Inactive 00:06:00.767 suites 1 1 n/a 0 0 00:06:00.767 tests 1 1 1 0 0 00:06:00.767 asserts 15 15 15 0 n/a 00:06:00.767 00:06:00.767 Elapsed time = 0.005 seconds 00:06:00.767 00:06:00.767 real 0m0.060s 00:06:00.767 user 0m0.018s 00:06:00.767 sys 0m0.042s 00:06:00.767 10:52:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:00.767 10:52:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.767 ************************************ 00:06:00.767 END TEST env_mem_callbacks 00:06:00.767 ************************************ 00:06:00.767 00:06:00.767 real 0m6.355s 00:06:00.767 user 0m4.470s 00:06:00.767 sys 0m1.164s 00:06:00.767 10:52:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:00.767 10:52:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.767 ************************************ 00:06:00.767 END TEST env 00:06:00.767 ************************************ 00:06:00.767 10:52:59 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:00.767 10:52:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:00.767 10:52:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.767 10:52:59 -- common/autotest_common.sh@10 -- # set +x 00:06:00.767 ************************************ 00:06:00.767 START TEST rpc 00:06:00.767 ************************************ 00:06:00.767 10:52:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:00.767 * Looking for test storage... 
00:06:00.767 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:00.767 10:52:59 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:00.767 10:52:59 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:00.767 10:52:59 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:01.027 10:52:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:01.027 10:52:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:01.027 10:52:59 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:01.027 10:52:59 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:01.027 10:52:59 -- scripts/common.sh@335 -- # IFS=.-: 00:06:01.027 10:52:59 -- scripts/common.sh@335 -- # read -ra ver1 00:06:01.027 10:52:59 -- scripts/common.sh@336 -- # IFS=.-: 00:06:01.027 10:52:59 -- scripts/common.sh@336 -- # read -ra ver2 00:06:01.027 10:52:59 -- scripts/common.sh@337 -- # local 'op=<' 00:06:01.027 10:52:59 -- scripts/common.sh@339 -- # ver1_l=2 00:06:01.027 10:52:59 -- scripts/common.sh@340 -- # ver2_l=1 00:06:01.027 10:52:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:01.027 10:52:59 -- scripts/common.sh@343 -- # case "$op" in 00:06:01.027 10:52:59 -- scripts/common.sh@344 -- # : 1 00:06:01.027 10:52:59 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:01.027 10:52:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:01.027 10:52:59 -- scripts/common.sh@364 -- # decimal 1 00:06:01.027 10:52:59 -- scripts/common.sh@352 -- # local d=1 00:06:01.027 10:52:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:01.027 10:52:59 -- scripts/common.sh@354 -- # echo 1 00:06:01.027 10:52:59 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:01.027 10:52:59 -- scripts/common.sh@365 -- # decimal 2 00:06:01.027 10:52:59 -- scripts/common.sh@352 -- # local d=2 00:06:01.027 10:52:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:01.027 10:52:59 -- scripts/common.sh@354 -- # echo 2 00:06:01.027 10:52:59 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:01.027 10:52:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:01.027 10:52:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:01.027 10:52:59 -- scripts/common.sh@367 -- # return 0 00:06:01.027 10:52:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:01.027 10:52:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:01.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:01.027 --rc genhtml_branch_coverage=1 00:06:01.027 --rc genhtml_function_coverage=1 00:06:01.027 --rc genhtml_legend=1 00:06:01.027 --rc geninfo_all_blocks=1 00:06:01.027 --rc geninfo_unexecuted_blocks=1 00:06:01.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:01.027 ' 00:06:01.027 10:52:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:01.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:01.027 --rc genhtml_branch_coverage=1 00:06:01.027 --rc genhtml_function_coverage=1 00:06:01.027 --rc genhtml_legend=1 00:06:01.027 --rc geninfo_all_blocks=1 00:06:01.027 --rc geninfo_unexecuted_blocks=1 00:06:01.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:01.027 ' 00:06:01.027 10:52:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:01.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:01.027 --rc genhtml_branch_coverage=1 00:06:01.027 
--rc genhtml_function_coverage=1 00:06:01.027 --rc genhtml_legend=1 00:06:01.027 --rc geninfo_all_blocks=1 00:06:01.027 --rc geninfo_unexecuted_blocks=1 00:06:01.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:01.027 ' 00:06:01.027 10:52:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:01.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:01.027 --rc genhtml_branch_coverage=1 00:06:01.027 --rc genhtml_function_coverage=1 00:06:01.027 --rc genhtml_legend=1 00:06:01.027 --rc geninfo_all_blocks=1 00:06:01.027 --rc geninfo_unexecuted_blocks=1 00:06:01.027 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:01.027 ' 00:06:01.027 10:52:59 -- rpc/rpc.sh@65 -- # spdk_pid=628059 00:06:01.027 10:52:59 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:01.027 10:52:59 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:01.027 10:52:59 -- rpc/rpc.sh@67 -- # waitforlisten 628059 00:06:01.027 10:52:59 -- common/autotest_common.sh@829 -- # '[' -z 628059 ']' 00:06:01.027 10:52:59 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:01.027 10:52:59 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:01.027 10:52:59 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:01.027 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:01.027 10:52:59 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:01.027 10:52:59 -- common/autotest_common.sh@10 -- # set +x 00:06:01.027 [2024-12-16 10:52:59.489580] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:01.027 [2024-12-16 10:52:59.489656] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid628059 ] 00:06:01.027 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.027 [2024-12-16 10:52:59.556749] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.027 [2024-12-16 10:52:59.592462] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:01.027 [2024-12-16 10:52:59.592581] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:01.027 [2024-12-16 10:52:59.592593] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 628059' to capture a snapshot of events at runtime. 00:06:01.027 [2024-12-16 10:52:59.592602] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid628059 for offline analysis/debug. 
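The two app_setup_trace notices above give both ways to inspect the tracepoints enabled by the -e bdev flag: attach to the live target, or keep the shared-memory history for later. Roughly, using this run's pid and assuming spdk_trace's usual -f option for reading a saved file:

  # Live capture while spdk_tgt is still up (command from the notice above):
  spdk_trace -s spdk_tgt -p 628059
  # After the target exits, copy the history out of /dev/shm and replay it:
  cp /dev/shm/spdk_tgt_trace.pid628059 /tmp/
  spdk_trace -f /tmp/spdk_tgt_trace.pid628059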
00:06:01.027 [2024-12-16 10:52:59.592630] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.966 10:53:00 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:01.966 10:53:00 -- common/autotest_common.sh@862 -- # return 0 00:06:01.966 10:53:00 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:01.966 10:53:00 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:01.966 10:53:00 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:01.966 10:53:00 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:01.966 10:53:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:01.966 10:53:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:01.966 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.966 ************************************ 00:06:01.966 START TEST rpc_integrity 00:06:01.966 ************************************ 00:06:01.966 10:53:00 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:06:01.966 10:53:00 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:01.966 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.966 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.966 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.966 10:53:00 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:01.966 10:53:00 -- rpc/rpc.sh@13 -- # jq length 00:06:01.966 10:53:00 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:01.966 10:53:00 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:01.966 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.966 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.966 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.966 10:53:00 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:01.966 10:53:00 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:01.966 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.966 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.966 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.966 10:53:00 -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:01.966 { 00:06:01.966 "name": "Malloc0", 00:06:01.966 "aliases": [ 00:06:01.966 "0000d09c-792f-4b07-9039-9bf94d687096" 00:06:01.966 ], 00:06:01.966 "product_name": "Malloc disk", 00:06:01.966 "block_size": 512, 00:06:01.966 "num_blocks": 16384, 00:06:01.966 "uuid": "0000d09c-792f-4b07-9039-9bf94d687096", 00:06:01.966 "assigned_rate_limits": { 00:06:01.966 "rw_ios_per_sec": 0, 00:06:01.966 "rw_mbytes_per_sec": 0, 00:06:01.966 "r_mbytes_per_sec": 0, 00:06:01.966 "w_mbytes_per_sec": 0 00:06:01.966 }, 00:06:01.966 "claimed": false, 00:06:01.966 "zoned": false, 00:06:01.966 "supported_io_types": { 00:06:01.966 "read": true, 00:06:01.966 "write": true, 00:06:01.966 "unmap": true, 00:06:01.966 "write_zeroes": true, 00:06:01.966 "flush": true, 00:06:01.966 "reset": true, 00:06:01.966 "compare": false, 00:06:01.966 "compare_and_write": false, 
00:06:01.966 "abort": true, 00:06:01.966 "nvme_admin": false, 00:06:01.966 "nvme_io": false 00:06:01.966 }, 00:06:01.966 "memory_domains": [ 00:06:01.966 { 00:06:01.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:01.967 "dma_device_type": 2 00:06:01.967 } 00:06:01.967 ], 00:06:01.967 "driver_specific": {} 00:06:01.967 } 00:06:01.967 ]' 00:06:01.967 10:53:00 -- rpc/rpc.sh@17 -- # jq length 00:06:01.967 10:53:00 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:01.967 10:53:00 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:01.967 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.967 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.967 [2024-12-16 10:53:00.469490] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:01.967 [2024-12-16 10:53:00.469526] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:01.967 [2024-12-16 10:53:00.469547] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x43802d0 00:06:01.967 [2024-12-16 10:53:00.469558] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:01.967 [2024-12-16 10:53:00.470383] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:01.967 [2024-12-16 10:53:00.470406] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:01.967 Passthru0 00:06:01.967 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.967 10:53:00 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:01.967 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.967 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.967 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.967 10:53:00 -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:01.967 { 00:06:01.967 "name": "Malloc0", 00:06:01.967 "aliases": [ 00:06:01.967 "0000d09c-792f-4b07-9039-9bf94d687096" 00:06:01.967 ], 00:06:01.967 "product_name": "Malloc disk", 00:06:01.967 "block_size": 512, 00:06:01.967 "num_blocks": 16384, 00:06:01.967 "uuid": "0000d09c-792f-4b07-9039-9bf94d687096", 00:06:01.967 "assigned_rate_limits": { 00:06:01.967 "rw_ios_per_sec": 0, 00:06:01.967 "rw_mbytes_per_sec": 0, 00:06:01.967 "r_mbytes_per_sec": 0, 00:06:01.967 "w_mbytes_per_sec": 0 00:06:01.967 }, 00:06:01.967 "claimed": true, 00:06:01.967 "claim_type": "exclusive_write", 00:06:01.967 "zoned": false, 00:06:01.967 "supported_io_types": { 00:06:01.967 "read": true, 00:06:01.967 "write": true, 00:06:01.967 "unmap": true, 00:06:01.967 "write_zeroes": true, 00:06:01.967 "flush": true, 00:06:01.967 "reset": true, 00:06:01.967 "compare": false, 00:06:01.967 "compare_and_write": false, 00:06:01.967 "abort": true, 00:06:01.967 "nvme_admin": false, 00:06:01.967 "nvme_io": false 00:06:01.967 }, 00:06:01.967 "memory_domains": [ 00:06:01.967 { 00:06:01.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:01.967 "dma_device_type": 2 00:06:01.967 } 00:06:01.967 ], 00:06:01.967 "driver_specific": {} 00:06:01.967 }, 00:06:01.967 { 00:06:01.967 "name": "Passthru0", 00:06:01.967 "aliases": [ 00:06:01.967 "338ea582-8603-5cc8-a56f-cc57c6b2a435" 00:06:01.967 ], 00:06:01.967 "product_name": "passthru", 00:06:01.967 "block_size": 512, 00:06:01.967 "num_blocks": 16384, 00:06:01.967 "uuid": "338ea582-8603-5cc8-a56f-cc57c6b2a435", 00:06:01.967 "assigned_rate_limits": { 00:06:01.967 "rw_ios_per_sec": 0, 00:06:01.967 "rw_mbytes_per_sec": 0, 00:06:01.967 "r_mbytes_per_sec": 0, 00:06:01.967 
"w_mbytes_per_sec": 0 00:06:01.967 }, 00:06:01.967 "claimed": false, 00:06:01.967 "zoned": false, 00:06:01.967 "supported_io_types": { 00:06:01.967 "read": true, 00:06:01.967 "write": true, 00:06:01.967 "unmap": true, 00:06:01.967 "write_zeroes": true, 00:06:01.967 "flush": true, 00:06:01.967 "reset": true, 00:06:01.967 "compare": false, 00:06:01.967 "compare_and_write": false, 00:06:01.967 "abort": true, 00:06:01.967 "nvme_admin": false, 00:06:01.967 "nvme_io": false 00:06:01.967 }, 00:06:01.967 "memory_domains": [ 00:06:01.967 { 00:06:01.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:01.967 "dma_device_type": 2 00:06:01.967 } 00:06:01.967 ], 00:06:01.967 "driver_specific": { 00:06:01.967 "passthru": { 00:06:01.967 "name": "Passthru0", 00:06:01.967 "base_bdev_name": "Malloc0" 00:06:01.967 } 00:06:01.967 } 00:06:01.967 } 00:06:01.967 ]' 00:06:01.967 10:53:00 -- rpc/rpc.sh@21 -- # jq length 00:06:01.967 10:53:00 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:01.967 10:53:00 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:01.967 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.967 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.967 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.967 10:53:00 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:01.967 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.967 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.967 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.967 10:53:00 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:01.967 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.967 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:01.967 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.967 10:53:00 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:01.967 10:53:00 -- rpc/rpc.sh@26 -- # jq length 00:06:02.226 10:53:00 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:02.226 00:06:02.226 real 0m0.274s 00:06:02.226 user 0m0.167s 00:06:02.226 sys 0m0.046s 00:06:02.227 10:53:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.227 ************************************ 00:06:02.227 END TEST rpc_integrity 00:06:02.227 ************************************ 00:06:02.227 10:53:00 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:02.227 10:53:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:02.227 10:53:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.227 ************************************ 00:06:02.227 START TEST rpc_plugins 00:06:02.227 ************************************ 00:06:02.227 10:53:00 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:06:02.227 10:53:00 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:02.227 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.227 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.227 10:53:00 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:02.227 10:53:00 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:02.227 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.227 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.227 10:53:00 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:06:02.227 { 00:06:02.227 "name": "Malloc1", 00:06:02.227 "aliases": [ 00:06:02.227 "7b3f5e83-995e-4289-83fb-2c481ef8664c" 00:06:02.227 ], 00:06:02.227 "product_name": "Malloc disk", 00:06:02.227 "block_size": 4096, 00:06:02.227 "num_blocks": 256, 00:06:02.227 "uuid": "7b3f5e83-995e-4289-83fb-2c481ef8664c", 00:06:02.227 "assigned_rate_limits": { 00:06:02.227 "rw_ios_per_sec": 0, 00:06:02.227 "rw_mbytes_per_sec": 0, 00:06:02.227 "r_mbytes_per_sec": 0, 00:06:02.227 "w_mbytes_per_sec": 0 00:06:02.227 }, 00:06:02.227 "claimed": false, 00:06:02.227 "zoned": false, 00:06:02.227 "supported_io_types": { 00:06:02.227 "read": true, 00:06:02.227 "write": true, 00:06:02.227 "unmap": true, 00:06:02.227 "write_zeroes": true, 00:06:02.227 "flush": true, 00:06:02.227 "reset": true, 00:06:02.227 "compare": false, 00:06:02.227 "compare_and_write": false, 00:06:02.227 "abort": true, 00:06:02.227 "nvme_admin": false, 00:06:02.227 "nvme_io": false 00:06:02.227 }, 00:06:02.227 "memory_domains": [ 00:06:02.227 { 00:06:02.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:02.227 "dma_device_type": 2 00:06:02.227 } 00:06:02.227 ], 00:06:02.227 "driver_specific": {} 00:06:02.227 } 00:06:02.227 ]' 00:06:02.227 10:53:00 -- rpc/rpc.sh@32 -- # jq length 00:06:02.227 10:53:00 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:02.227 10:53:00 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:02.227 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.227 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.227 10:53:00 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:02.227 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.227 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.227 10:53:00 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:02.227 10:53:00 -- rpc/rpc.sh@36 -- # jq length 00:06:02.227 10:53:00 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:02.227 00:06:02.227 real 0m0.143s 00:06:02.227 user 0m0.085s 00:06:02.227 sys 0m0.025s 00:06:02.227 10:53:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.227 ************************************ 00:06:02.227 END TEST rpc_plugins 00:06:02.227 ************************************ 00:06:02.227 10:53:00 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:02.227 10:53:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:02.227 10:53:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.227 ************************************ 00:06:02.227 START TEST rpc_trace_cmd_test 00:06:02.227 ************************************ 00:06:02.227 10:53:00 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:06:02.227 10:53:00 -- rpc/rpc.sh@40 -- # local info 00:06:02.227 10:53:00 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:02.227 10:53:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.227 10:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:02.486 10:53:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.486 10:53:00 -- rpc/rpc.sh@42 -- # info='{ 00:06:02.486 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid628059", 00:06:02.486 "tpoint_group_mask": "0x8", 00:06:02.486 "iscsi_conn": { 00:06:02.486 "mask": "0x2", 
00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "scsi": { 00:06:02.486 "mask": "0x4", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "bdev": { 00:06:02.486 "mask": "0x8", 00:06:02.486 "tpoint_mask": "0xffffffffffffffff" 00:06:02.486 }, 00:06:02.486 "nvmf_rdma": { 00:06:02.486 "mask": "0x10", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "nvmf_tcp": { 00:06:02.486 "mask": "0x20", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "ftl": { 00:06:02.486 "mask": "0x40", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "blobfs": { 00:06:02.486 "mask": "0x80", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "dsa": { 00:06:02.486 "mask": "0x200", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "thread": { 00:06:02.486 "mask": "0x400", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "nvme_pcie": { 00:06:02.486 "mask": "0x800", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "iaa": { 00:06:02.486 "mask": "0x1000", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "nvme_tcp": { 00:06:02.486 "mask": "0x2000", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 }, 00:06:02.486 "bdev_nvme": { 00:06:02.486 "mask": "0x4000", 00:06:02.486 "tpoint_mask": "0x0" 00:06:02.486 } 00:06:02.486 }' 00:06:02.486 10:53:00 -- rpc/rpc.sh@43 -- # jq length 00:06:02.486 10:53:00 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:06:02.486 10:53:00 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:02.486 10:53:00 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:02.486 10:53:00 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:02.486 10:53:00 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:02.486 10:53:00 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:02.486 10:53:01 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:02.486 10:53:01 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:02.486 10:53:01 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:02.486 00:06:02.486 real 0m0.212s 00:06:02.486 user 0m0.166s 00:06:02.486 sys 0m0.037s 00:06:02.486 10:53:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:02.486 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.486 ************************************ 00:06:02.486 END TEST rpc_trace_cmd_test 00:06:02.486 ************************************ 00:06:02.486 10:53:01 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:02.486 10:53:01 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:02.486 10:53:01 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:02.486 10:53:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:02.486 10:53:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:02.486 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.486 ************************************ 00:06:02.486 START TEST rpc_daemon_integrity 00:06:02.486 ************************************ 00:06:02.486 10:53:01 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:06:02.486 10:53:01 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:02.486 10:53:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.486 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.746 10:53:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.746 10:53:01 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:02.746 10:53:01 -- rpc/rpc.sh@13 -- # jq length 00:06:02.746 10:53:01 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:02.746 10:53:01 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:02.746 
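rpc_trace_cmd_test above asserts the shape of the trace_get_info payload rather than exact values: more than two keys, a shm path for offline decoding, and a non-zero bdev group mask (spdk_tgt was started with -e bdev, group 0x8). Roughly the same checks can be driven by hand through the Python rpc client instead of the test's rpc_cmd wrapper; the script path and socket below are the defaults assumed here:

    # Sketch: re-run the trace assertions against a live spdk_tgt.
    info=$(scripts/rpc.py -s /var/tmp/spdk.sock trace_get_info)
    [ "$(jq length <<< "$info")" -gt 2 ]                     # tpoint groups beyond shm path + mask
    [ "$(jq 'has("tpoint_shm_path")' <<< "$info")" = true ]  # shm file available for spdk_trace decode
    [ "$(jq -r .bdev.tpoint_mask <<< "$info")" != 0x0 ]      # bdev tracepoints armed by '-e bdev'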
10:53:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.746 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.746 10:53:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.746 10:53:01 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:02.746 10:53:01 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:02.746 10:53:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.746 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.746 10:53:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.746 10:53:01 -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:02.746 { 00:06:02.746 "name": "Malloc2", 00:06:02.746 "aliases": [ 00:06:02.746 "eb9fe734-0348-4b0e-a4ca-5e699dba473c" 00:06:02.746 ], 00:06:02.746 "product_name": "Malloc disk", 00:06:02.746 "block_size": 512, 00:06:02.746 "num_blocks": 16384, 00:06:02.746 "uuid": "eb9fe734-0348-4b0e-a4ca-5e699dba473c", 00:06:02.746 "assigned_rate_limits": { 00:06:02.746 "rw_ios_per_sec": 0, 00:06:02.746 "rw_mbytes_per_sec": 0, 00:06:02.746 "r_mbytes_per_sec": 0, 00:06:02.746 "w_mbytes_per_sec": 0 00:06:02.746 }, 00:06:02.746 "claimed": false, 00:06:02.746 "zoned": false, 00:06:02.746 "supported_io_types": { 00:06:02.746 "read": true, 00:06:02.746 "write": true, 00:06:02.746 "unmap": true, 00:06:02.746 "write_zeroes": true, 00:06:02.746 "flush": true, 00:06:02.746 "reset": true, 00:06:02.746 "compare": false, 00:06:02.746 "compare_and_write": false, 00:06:02.746 "abort": true, 00:06:02.746 "nvme_admin": false, 00:06:02.746 "nvme_io": false 00:06:02.746 }, 00:06:02.746 "memory_domains": [ 00:06:02.746 { 00:06:02.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:02.746 "dma_device_type": 2 00:06:02.746 } 00:06:02.746 ], 00:06:02.746 "driver_specific": {} 00:06:02.746 } 00:06:02.746 ]' 00:06:02.746 10:53:01 -- rpc/rpc.sh@17 -- # jq length 00:06:02.746 10:53:01 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:02.746 10:53:01 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:02.746 10:53:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.746 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.746 [2024-12-16 10:53:01.247525] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:02.746 [2024-12-16 10:53:01.247554] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:02.746 [2024-12-16 10:53:01.247570] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x437eec0 00:06:02.746 [2024-12-16 10:53:01.247579] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:02.746 [2024-12-16 10:53:01.248257] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:02.746 [2024-12-16 10:53:01.248280] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:02.746 Passthru0 00:06:02.746 10:53:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.746 10:53:01 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:02.746 10:53:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.746 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.746 10:53:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.746 10:53:01 -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:02.746 { 00:06:02.746 "name": "Malloc2", 00:06:02.746 "aliases": [ 00:06:02.746 "eb9fe734-0348-4b0e-a4ca-5e699dba473c" 00:06:02.746 ], 00:06:02.746 "product_name": "Malloc disk", 00:06:02.746 "block_size": 512, 00:06:02.746 "num_blocks": 16384, 
00:06:02.746 "uuid": "eb9fe734-0348-4b0e-a4ca-5e699dba473c", 00:06:02.746 "assigned_rate_limits": { 00:06:02.746 "rw_ios_per_sec": 0, 00:06:02.746 "rw_mbytes_per_sec": 0, 00:06:02.746 "r_mbytes_per_sec": 0, 00:06:02.746 "w_mbytes_per_sec": 0 00:06:02.746 }, 00:06:02.746 "claimed": true, 00:06:02.746 "claim_type": "exclusive_write", 00:06:02.746 "zoned": false, 00:06:02.746 "supported_io_types": { 00:06:02.746 "read": true, 00:06:02.746 "write": true, 00:06:02.746 "unmap": true, 00:06:02.746 "write_zeroes": true, 00:06:02.746 "flush": true, 00:06:02.746 "reset": true, 00:06:02.746 "compare": false, 00:06:02.746 "compare_and_write": false, 00:06:02.746 "abort": true, 00:06:02.746 "nvme_admin": false, 00:06:02.746 "nvme_io": false 00:06:02.746 }, 00:06:02.746 "memory_domains": [ 00:06:02.746 { 00:06:02.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:02.746 "dma_device_type": 2 00:06:02.746 } 00:06:02.746 ], 00:06:02.746 "driver_specific": {} 00:06:02.746 }, 00:06:02.746 { 00:06:02.746 "name": "Passthru0", 00:06:02.746 "aliases": [ 00:06:02.746 "3136494b-63c1-5bf2-aab5-771a0086ac7b" 00:06:02.746 ], 00:06:02.746 "product_name": "passthru", 00:06:02.746 "block_size": 512, 00:06:02.747 "num_blocks": 16384, 00:06:02.747 "uuid": "3136494b-63c1-5bf2-aab5-771a0086ac7b", 00:06:02.747 "assigned_rate_limits": { 00:06:02.747 "rw_ios_per_sec": 0, 00:06:02.747 "rw_mbytes_per_sec": 0, 00:06:02.747 "r_mbytes_per_sec": 0, 00:06:02.747 "w_mbytes_per_sec": 0 00:06:02.747 }, 00:06:02.747 "claimed": false, 00:06:02.747 "zoned": false, 00:06:02.747 "supported_io_types": { 00:06:02.747 "read": true, 00:06:02.747 "write": true, 00:06:02.747 "unmap": true, 00:06:02.747 "write_zeroes": true, 00:06:02.747 "flush": true, 00:06:02.747 "reset": true, 00:06:02.747 "compare": false, 00:06:02.747 "compare_and_write": false, 00:06:02.747 "abort": true, 00:06:02.747 "nvme_admin": false, 00:06:02.747 "nvme_io": false 00:06:02.747 }, 00:06:02.747 "memory_domains": [ 00:06:02.747 { 00:06:02.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:02.747 "dma_device_type": 2 00:06:02.747 } 00:06:02.747 ], 00:06:02.747 "driver_specific": { 00:06:02.747 "passthru": { 00:06:02.747 "name": "Passthru0", 00:06:02.747 "base_bdev_name": "Malloc2" 00:06:02.747 } 00:06:02.747 } 00:06:02.747 } 00:06:02.747 ]' 00:06:02.747 10:53:01 -- rpc/rpc.sh@21 -- # jq length 00:06:02.747 10:53:01 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:02.747 10:53:01 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:02.747 10:53:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.747 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.747 10:53:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.747 10:53:01 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:02.747 10:53:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.747 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.747 10:53:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.747 10:53:01 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:02.747 10:53:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:02.747 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:02.747 10:53:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:02.747 10:53:01 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:02.747 10:53:01 -- rpc/rpc.sh@26 -- # jq length 00:06:03.006 10:53:01 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:03.006 00:06:03.006 real 0m0.295s 00:06:03.006 user 0m0.180s 00:06:03.006 sys 0m0.046s 00:06:03.006 
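rpc_daemon_integrity repeats the rpc_integrity sequence seen earlier: create a malloc bdev, wrap it in a passthru (which claims the base as exclusive_write, visible in the JSON above), confirm both bdevs are listed, then tear down in reverse order and confirm the bdev table is empty again. The same round trip as bare rpc.py calls, with paths assumed (rpc_cmd in the test resolves to the same RPCs):

    # Sketch: malloc + passthru integrity round trip.
    malloc=$(scripts/rpc.py bdev_malloc_create 8 512)              # 8 MB bdev, 512 B blocks; prints its name
    scripts/rpc.py bdev_passthru_create -b "$malloc" -p Passthru0  # claims the base bdev
    [ "$(scripts/rpc.py bdev_get_bdevs | jq length)" -eq 2 ]       # base + passthru both listed
    scripts/rpc.py bdev_passthru_delete Passthru0                  # release the claim first
    scripts/rpc.py bdev_malloc_delete "$malloc"
    [ "$(scripts/rpc.py bdev_get_bdevs | jq length)" -eq 0 ]       # clean bdev table again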
10:53:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:03.006 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:03.006 ************************************ 00:06:03.006 END TEST rpc_daemon_integrity 00:06:03.006 ************************************ 00:06:03.006 10:53:01 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:03.006 10:53:01 -- rpc/rpc.sh@84 -- # killprocess 628059 00:06:03.006 10:53:01 -- common/autotest_common.sh@936 -- # '[' -z 628059 ']' 00:06:03.006 10:53:01 -- common/autotest_common.sh@940 -- # kill -0 628059 00:06:03.006 10:53:01 -- common/autotest_common.sh@941 -- # uname 00:06:03.006 10:53:01 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:03.006 10:53:01 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 628059 00:06:03.006 10:53:01 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:03.006 10:53:01 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:03.006 10:53:01 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 628059' 00:06:03.006 killing process with pid 628059 00:06:03.006 10:53:01 -- common/autotest_common.sh@955 -- # kill 628059 00:06:03.006 10:53:01 -- common/autotest_common.sh@960 -- # wait 628059 00:06:03.265 00:06:03.265 real 0m2.504s 00:06:03.265 user 0m3.141s 00:06:03.265 sys 0m0.750s 00:06:03.265 10:53:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:03.265 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:03.265 ************************************ 00:06:03.265 END TEST rpc 00:06:03.265 ************************************ 00:06:03.265 10:53:01 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:03.265 10:53:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:03.265 10:53:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:03.265 10:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:03.265 ************************************ 00:06:03.265 START TEST rpc_client 00:06:03.265 ************************************ 00:06:03.265 10:53:01 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:03.525 * Looking for test storage... 
00:06:03.525 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:03.525 10:53:01 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:03.525 10:53:01 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:03.525 10:53:01 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:03.525 10:53:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:03.525 10:53:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:03.525 10:53:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:03.526 10:53:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:03.526 10:53:02 -- scripts/common.sh@335 -- # IFS=.-: 00:06:03.526 10:53:02 -- scripts/common.sh@335 -- # read -ra ver1 00:06:03.526 10:53:02 -- scripts/common.sh@336 -- # IFS=.-: 00:06:03.526 10:53:02 -- scripts/common.sh@336 -- # read -ra ver2 00:06:03.526 10:53:02 -- scripts/common.sh@337 -- # local 'op=<' 00:06:03.526 10:53:02 -- scripts/common.sh@339 -- # ver1_l=2 00:06:03.526 10:53:02 -- scripts/common.sh@340 -- # ver2_l=1 00:06:03.526 10:53:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:03.526 10:53:02 -- scripts/common.sh@343 -- # case "$op" in 00:06:03.526 10:53:02 -- scripts/common.sh@344 -- # : 1 00:06:03.526 10:53:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:03.526 10:53:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:03.526 10:53:02 -- scripts/common.sh@364 -- # decimal 1 00:06:03.526 10:53:02 -- scripts/common.sh@352 -- # local d=1 00:06:03.526 10:53:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:03.526 10:53:02 -- scripts/common.sh@354 -- # echo 1 00:06:03.526 10:53:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:03.526 10:53:02 -- scripts/common.sh@365 -- # decimal 2 00:06:03.526 10:53:02 -- scripts/common.sh@352 -- # local d=2 00:06:03.526 10:53:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:03.526 10:53:02 -- scripts/common.sh@354 -- # echo 2 00:06:03.526 10:53:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:03.526 10:53:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:03.526 10:53:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:03.526 10:53:02 -- scripts/common.sh@367 -- # return 0 00:06:03.526 10:53:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:03.526 10:53:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:03.526 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.526 --rc genhtml_branch_coverage=1 00:06:03.526 --rc genhtml_function_coverage=1 00:06:03.526 --rc genhtml_legend=1 00:06:03.526 --rc geninfo_all_blocks=1 00:06:03.526 --rc geninfo_unexecuted_blocks=1 00:06:03.526 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.526 ' 00:06:03.526 10:53:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:03.526 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.526 --rc genhtml_branch_coverage=1 00:06:03.526 --rc genhtml_function_coverage=1 00:06:03.526 --rc genhtml_legend=1 00:06:03.526 --rc geninfo_all_blocks=1 00:06:03.526 --rc geninfo_unexecuted_blocks=1 00:06:03.526 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.526 ' 00:06:03.526 10:53:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:03.526 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.526 --rc genhtml_branch_coverage=1 
00:06:03.526 --rc genhtml_function_coverage=1 00:06:03.526 --rc genhtml_legend=1 00:06:03.526 --rc geninfo_all_blocks=1 00:06:03.526 --rc geninfo_unexecuted_blocks=1 00:06:03.526 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.526 ' 00:06:03.526 10:53:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:03.526 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.526 --rc genhtml_branch_coverage=1 00:06:03.526 --rc genhtml_function_coverage=1 00:06:03.526 --rc genhtml_legend=1 00:06:03.526 --rc geninfo_all_blocks=1 00:06:03.526 --rc geninfo_unexecuted_blocks=1 00:06:03.526 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.526 ' 00:06:03.526 10:53:02 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:03.526 OK 00:06:03.526 10:53:02 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:03.526 00:06:03.526 real 0m0.209s 00:06:03.526 user 0m0.126s 00:06:03.526 sys 0m0.100s 00:06:03.526 10:53:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:03.526 10:53:02 -- common/autotest_common.sh@10 -- # set +x 00:06:03.526 ************************************ 00:06:03.526 END TEST rpc_client 00:06:03.526 ************************************ 00:06:03.526 10:53:02 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:03.526 10:53:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:03.526 10:53:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:03.526 10:53:02 -- common/autotest_common.sh@10 -- # set +x 00:06:03.526 ************************************ 00:06:03.526 START TEST json_config 00:06:03.526 ************************************ 00:06:03.526 10:53:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:03.786 10:53:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:03.786 10:53:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:03.786 10:53:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:03.786 10:53:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:03.786 10:53:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:03.786 10:53:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:03.786 10:53:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:03.786 10:53:02 -- scripts/common.sh@335 -- # IFS=.-: 00:06:03.786 10:53:02 -- scripts/common.sh@335 -- # read -ra ver1 00:06:03.786 10:53:02 -- scripts/common.sh@336 -- # IFS=.-: 00:06:03.786 10:53:02 -- scripts/common.sh@336 -- # read -ra ver2 00:06:03.786 10:53:02 -- scripts/common.sh@337 -- # local 'op=<' 00:06:03.786 10:53:02 -- scripts/common.sh@339 -- # ver1_l=2 00:06:03.786 10:53:02 -- scripts/common.sh@340 -- # ver2_l=1 00:06:03.786 10:53:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:03.786 10:53:02 -- scripts/common.sh@343 -- # case "$op" in 00:06:03.786 10:53:02 -- scripts/common.sh@344 -- # : 1 00:06:03.786 10:53:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:03.786 10:53:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
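Each suite above ends the same way: the trap is cleared and killprocess brings down the spdk_tgt launched at the top. As traced in the rpc teardown, the helper checks that the pid is still alive and really is an SPDK reactor before signalling it. The following is a condensed sketch, not the verbatim common/autotest_common.sh body, and 'wait' only succeeds because spdk_tgt is a child of the test shell:

    # Sketch of the killprocess flow traced in the rpc teardown above.
    killprocess_sketch() {
        local pid=$1
        [ -n "$pid" ] || return 1
        kill -0 "$pid" || return 1                    # still running?
        if [ "$(uname)" = Linux ]; then
            local name
            name=$(ps --no-headers -o comm= "$pid")   # e.g. reactor_0 for spdk_tgt
            [ "$name" != sudo ] || return 1           # condensed: don't SIGTERM a sudo wrapper
        fi
        echo "killing process with pid $pid"
        kill "$pid" && wait "$pid"                    # reap; spdk_tgt is our child
    }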
ver1_l : ver2_l) )) 00:06:03.786 10:53:02 -- scripts/common.sh@364 -- # decimal 1 00:06:03.786 10:53:02 -- scripts/common.sh@352 -- # local d=1 00:06:03.786 10:53:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:03.786 10:53:02 -- scripts/common.sh@354 -- # echo 1 00:06:03.786 10:53:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:03.786 10:53:02 -- scripts/common.sh@365 -- # decimal 2 00:06:03.786 10:53:02 -- scripts/common.sh@352 -- # local d=2 00:06:03.786 10:53:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:03.786 10:53:02 -- scripts/common.sh@354 -- # echo 2 00:06:03.786 10:53:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:03.786 10:53:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:03.786 10:53:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:03.786 10:53:02 -- scripts/common.sh@367 -- # return 0 00:06:03.786 10:53:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:03.786 10:53:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:03.786 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.786 --rc genhtml_branch_coverage=1 00:06:03.786 --rc genhtml_function_coverage=1 00:06:03.786 --rc genhtml_legend=1 00:06:03.786 --rc geninfo_all_blocks=1 00:06:03.786 --rc geninfo_unexecuted_blocks=1 00:06:03.786 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.786 ' 00:06:03.786 10:53:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:03.786 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.786 --rc genhtml_branch_coverage=1 00:06:03.786 --rc genhtml_function_coverage=1 00:06:03.786 --rc genhtml_legend=1 00:06:03.786 --rc geninfo_all_blocks=1 00:06:03.786 --rc geninfo_unexecuted_blocks=1 00:06:03.786 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.786 ' 00:06:03.786 10:53:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:03.786 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.786 --rc genhtml_branch_coverage=1 00:06:03.786 --rc genhtml_function_coverage=1 00:06:03.786 --rc genhtml_legend=1 00:06:03.786 --rc geninfo_all_blocks=1 00:06:03.786 --rc geninfo_unexecuted_blocks=1 00:06:03.786 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.786 ' 00:06:03.786 10:53:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:03.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.787 --rc genhtml_branch_coverage=1 00:06:03.787 --rc genhtml_function_coverage=1 00:06:03.787 --rc genhtml_legend=1 00:06:03.787 --rc geninfo_all_blocks=1 00:06:03.787 --rc geninfo_unexecuted_blocks=1 00:06:03.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.787 ' 00:06:03.787 10:53:02 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:03.787 10:53:02 -- nvmf/common.sh@7 -- # uname -s 00:06:03.787 10:53:02 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:03.787 10:53:02 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:03.787 10:53:02 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:03.787 10:53:02 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:03.787 10:53:02 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:03.787 10:53:02 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:03.787 10:53:02 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:03.787 10:53:02 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:03.787 10:53:02 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:03.787 10:53:02 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:03.787 10:53:02 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:03.787 10:53:02 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:03.787 10:53:02 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:03.787 10:53:02 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:03.787 10:53:02 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:03.787 10:53:02 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:03.787 10:53:02 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:03.787 10:53:02 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:03.787 10:53:02 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:03.787 10:53:02 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:03.787 10:53:02 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:03.787 10:53:02 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:03.787 10:53:02 -- paths/export.sh@5 -- # export PATH 00:06:03.787 10:53:02 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:03.787 10:53:02 -- nvmf/common.sh@46 -- # : 0 00:06:03.787 10:53:02 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:06:03.787 10:53:02 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:06:03.787 10:53:02 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:06:03.787 10:53:02 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:03.787 10:53:02 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:03.787 10:53:02 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:06:03.787 10:53:02 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:06:03.787 
10:53:02 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:06:03.787 10:53:02 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:06:03.787 10:53:02 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:06:03.787 10:53:02 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:06:03.787 10:53:02 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:03.787 10:53:02 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:03.787 WARNING: No tests are enabled so not running JSON configuration tests 00:06:03.787 10:53:02 -- json_config/json_config.sh@27 -- # exit 0 00:06:03.787 00:06:03.787 real 0m0.188s 00:06:03.787 user 0m0.115s 00:06:03.787 sys 0m0.081s 00:06:03.787 10:53:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:03.787 10:53:02 -- common/autotest_common.sh@10 -- # set +x 00:06:03.787 ************************************ 00:06:03.787 END TEST json_config 00:06:03.787 ************************************ 00:06:03.787 10:53:02 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:03.787 10:53:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:03.787 10:53:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:03.787 10:53:02 -- common/autotest_common.sh@10 -- # set +x 00:06:03.787 ************************************ 00:06:03.787 START TEST json_config_extra_key 00:06:03.787 ************************************ 00:06:03.787 10:53:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:03.787 10:53:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:03.787 10:53:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:03.787 10:53:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:04.047 10:53:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:04.047 10:53:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:04.047 10:53:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:04.047 10:53:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:04.047 10:53:02 -- scripts/common.sh@335 -- # IFS=.-: 00:06:04.047 10:53:02 -- scripts/common.sh@335 -- # read -ra ver1 00:06:04.047 10:53:02 -- scripts/common.sh@336 -- # IFS=.-: 00:06:04.047 10:53:02 -- scripts/common.sh@336 -- # read -ra ver2 00:06:04.047 10:53:02 -- scripts/common.sh@337 -- # local 'op=<' 00:06:04.047 10:53:02 -- scripts/common.sh@339 -- # ver1_l=2 00:06:04.047 10:53:02 -- scripts/common.sh@340 -- # ver2_l=1 00:06:04.047 10:53:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:04.047 10:53:02 -- scripts/common.sh@343 -- # case "$op" in 00:06:04.047 10:53:02 -- scripts/common.sh@344 -- # : 1 00:06:04.047 10:53:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:04.047 10:53:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:04.047 10:53:02 -- scripts/common.sh@364 -- # decimal 1 00:06:04.047 10:53:02 -- scripts/common.sh@352 -- # local d=1 00:06:04.047 10:53:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:04.047 10:53:02 -- scripts/common.sh@354 -- # echo 1 00:06:04.047 10:53:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:04.047 10:53:02 -- scripts/common.sh@365 -- # decimal 2 00:06:04.047 10:53:02 -- scripts/common.sh@352 -- # local d=2 00:06:04.047 10:53:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:04.047 10:53:02 -- scripts/common.sh@354 -- # echo 2 00:06:04.047 10:53:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:04.047 10:53:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:04.047 10:53:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:04.047 10:53:02 -- scripts/common.sh@367 -- # return 0 00:06:04.047 10:53:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:04.047 10:53:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:04.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.047 --rc genhtml_branch_coverage=1 00:06:04.047 --rc genhtml_function_coverage=1 00:06:04.047 --rc genhtml_legend=1 00:06:04.047 --rc geninfo_all_blocks=1 00:06:04.047 --rc geninfo_unexecuted_blocks=1 00:06:04.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:04.047 ' 00:06:04.047 10:53:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:04.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.047 --rc genhtml_branch_coverage=1 00:06:04.047 --rc genhtml_function_coverage=1 00:06:04.047 --rc genhtml_legend=1 00:06:04.047 --rc geninfo_all_blocks=1 00:06:04.047 --rc geninfo_unexecuted_blocks=1 00:06:04.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:04.047 ' 00:06:04.047 10:53:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:04.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.047 --rc genhtml_branch_coverage=1 00:06:04.047 --rc genhtml_function_coverage=1 00:06:04.047 --rc genhtml_legend=1 00:06:04.047 --rc geninfo_all_blocks=1 00:06:04.047 --rc geninfo_unexecuted_blocks=1 00:06:04.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:04.047 ' 00:06:04.047 10:53:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:04.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.047 --rc genhtml_branch_coverage=1 00:06:04.047 --rc genhtml_function_coverage=1 00:06:04.047 --rc genhtml_legend=1 00:06:04.047 --rc geninfo_all_blocks=1 00:06:04.047 --rc geninfo_unexecuted_blocks=1 00:06:04.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:04.047 ' 00:06:04.047 10:53:02 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:04.047 10:53:02 -- nvmf/common.sh@7 -- # uname -s 00:06:04.047 10:53:02 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:04.047 10:53:02 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:04.047 10:53:02 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:04.047 10:53:02 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:04.047 10:53:02 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:04.047 10:53:02 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:04.048 10:53:02 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:04.048 10:53:02 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:04.048 10:53:02 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:04.048 10:53:02 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:04.048 10:53:02 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:04.048 10:53:02 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:04.048 10:53:02 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:04.048 10:53:02 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:04.048 10:53:02 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:04.048 10:53:02 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:04.048 10:53:02 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:04.048 10:53:02 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:04.048 10:53:02 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:04.048 10:53:02 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:04.048 10:53:02 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:04.048 10:53:02 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:04.048 10:53:02 -- paths/export.sh@5 -- # export PATH 00:06:04.048 10:53:02 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:04.048 10:53:02 -- nvmf/common.sh@46 -- # : 0 00:06:04.048 10:53:02 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:06:04.048 10:53:02 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:06:04.048 10:53:02 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:06:04.048 10:53:02 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:04.048 10:53:02 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:04.048 10:53:02 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:06:04.048 10:53:02 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:06:04.048 
10:53:02 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:06:04.048 INFO: launching applications... 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@25 -- # shift 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=628855 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:06:04.048 Waiting for target to run... 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 628855 /var/tmp/spdk_tgt.sock 00:06:04.048 10:53:02 -- common/autotest_common.sh@829 -- # '[' -z 628855 ']' 00:06:04.048 10:53:02 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:04.048 10:53:02 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:04.048 10:53:02 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:04.048 10:53:02 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:04.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:04.048 10:53:02 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:04.048 10:53:02 -- common/autotest_common.sh@10 -- # set +x 00:06:04.048 [2024-12-16 10:53:02.547462] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:04.048 [2024-12-16 10:53:02.547552] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid628855 ] 00:06:04.048 EAL: No free 2048 kB hugepages reported on node 1 00:06:04.307 [2024-12-16 10:53:02.828888] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.307 [2024-12-16 10:53:02.848271] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:04.307 [2024-12-16 10:53:02.848363] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.876 10:53:03 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:04.876 10:53:03 -- common/autotest_common.sh@862 -- # return 0 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:06:04.876 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:06:04.876 INFO: shutting down applications... 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 628855 ]] 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 628855 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@50 -- # kill -0 628855 00:06:04.876 10:53:03 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:06:05.445 10:53:03 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:06:05.445 10:53:03 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:06:05.445 10:53:03 -- json_config/json_config_extra_key.sh@50 -- # kill -0 628855 00:06:05.445 10:53:03 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:06:05.445 10:53:03 -- json_config/json_config_extra_key.sh@52 -- # break 00:06:05.445 10:53:03 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:06:05.445 10:53:03 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:06:05.445 SPDK target shutdown done 00:06:05.445 10:53:03 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:06:05.445 Success 00:06:05.445 00:06:05.445 real 0m1.549s 00:06:05.445 user 0m1.276s 00:06:05.445 sys 0m0.418s 00:06:05.445 10:53:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:05.445 10:53:03 -- common/autotest_common.sh@10 -- # set +x 00:06:05.445 ************************************ 00:06:05.445 END TEST json_config_extra_key 00:06:05.445 ************************************ 00:06:05.445 10:53:03 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:05.445 10:53:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:05.445 10:53:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:05.445 10:53:03 -- common/autotest_common.sh@10 -- # set +x 00:06:05.445 ************************************ 00:06:05.445 START TEST alias_rpc 00:06:05.445 ************************************ 00:06:05.445 10:53:03 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:05.445 * Looking for test storage... 00:06:05.445 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:05.445 10:53:04 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:05.445 10:53:04 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:05.445 10:53:04 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:05.705 10:53:04 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:05.705 10:53:04 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:05.705 10:53:04 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:05.705 10:53:04 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:05.705 10:53:04 -- scripts/common.sh@335 -- # IFS=.-: 00:06:05.705 10:53:04 -- scripts/common.sh@335 -- # read -ra ver1 00:06:05.705 10:53:04 -- scripts/common.sh@336 -- # IFS=.-: 00:06:05.705 10:53:04 -- scripts/common.sh@336 -- # read -ra ver2 00:06:05.705 10:53:04 -- scripts/common.sh@337 -- # local 'op=<' 00:06:05.705 10:53:04 -- scripts/common.sh@339 -- # ver1_l=2 00:06:05.705 10:53:04 -- scripts/common.sh@340 -- # ver2_l=1 00:06:05.705 10:53:04 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:05.705 10:53:04 -- scripts/common.sh@343 -- # case "$op" in 00:06:05.705 10:53:04 -- scripts/common.sh@344 -- # : 1 00:06:05.705 10:53:04 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:05.705 10:53:04 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:05.705 10:53:04 -- scripts/common.sh@364 -- # decimal 1 00:06:05.705 10:53:04 -- scripts/common.sh@352 -- # local d=1 00:06:05.705 10:53:04 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:05.705 10:53:04 -- scripts/common.sh@354 -- # echo 1 00:06:05.705 10:53:04 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:05.705 10:53:04 -- scripts/common.sh@365 -- # decimal 2 00:06:05.705 10:53:04 -- scripts/common.sh@352 -- # local d=2 00:06:05.705 10:53:04 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:05.705 10:53:04 -- scripts/common.sh@354 -- # echo 2 00:06:05.705 10:53:04 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:05.705 10:53:04 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:05.705 10:53:04 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:05.705 10:53:04 -- scripts/common.sh@367 -- # return 0 00:06:05.705 10:53:04 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:05.705 10:53:04 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:05.705 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.705 --rc genhtml_branch_coverage=1 00:06:05.705 --rc genhtml_function_coverage=1 00:06:05.705 --rc genhtml_legend=1 00:06:05.705 --rc geninfo_all_blocks=1 00:06:05.705 --rc geninfo_unexecuted_blocks=1 00:06:05.705 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:05.705 ' 00:06:05.705 10:53:04 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:05.705 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.705 --rc genhtml_branch_coverage=1 00:06:05.705 --rc genhtml_function_coverage=1 00:06:05.705 --rc genhtml_legend=1 00:06:05.705 --rc geninfo_all_blocks=1 00:06:05.705 --rc geninfo_unexecuted_blocks=1 00:06:05.705 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:05.705 ' 00:06:05.705 
10:53:04 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:05.705 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.705 --rc genhtml_branch_coverage=1 00:06:05.705 --rc genhtml_function_coverage=1 00:06:05.705 --rc genhtml_legend=1 00:06:05.705 --rc geninfo_all_blocks=1 00:06:05.705 --rc geninfo_unexecuted_blocks=1 00:06:05.705 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:05.705 ' 00:06:05.705 10:53:04 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:05.705 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.705 --rc genhtml_branch_coverage=1 00:06:05.705 --rc genhtml_function_coverage=1 00:06:05.705 --rc genhtml_legend=1 00:06:05.705 --rc geninfo_all_blocks=1 00:06:05.705 --rc geninfo_unexecuted_blocks=1 00:06:05.705 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:05.705 ' 00:06:05.705 10:53:04 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:05.705 10:53:04 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=629183 00:06:05.705 10:53:04 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:05.705 10:53:04 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 629183 00:06:05.705 10:53:04 -- common/autotest_common.sh@829 -- # '[' -z 629183 ']' 00:06:05.705 10:53:04 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:05.705 10:53:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:05.705 10:53:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:05.705 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:05.705 10:53:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:05.705 10:53:04 -- common/autotest_common.sh@10 -- # set +x 00:06:05.705 [2024-12-16 10:53:04.141684] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
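The lcov gate traced above (lt 1.15 2, i.e. cmp_versions 1.15 '<' 2) is a per-field numeric compare over dot/dash/colon-separated versions; a condensed sketch of the scripts/common.sh logic, assuming purely numeric fields (the traced decimal helper additionally validates each field):
cmp_versions() {
  local IFS=.-: op=$2 v
  local -a ver1 ver2
  read -ra ver1 <<< "$1"   # 1.15 -> (1 15)
  read -ra ver2 <<< "$3"   # 2    -> (2)
  for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
    if (( ${ver1[v]:-0} != ${ver2[v]:-0} )); then
      # first differing field decides; missing fields count as 0
      if (( ${ver1[v]:-0} > ${ver2[v]:-0} )); then [[ $op == *'>'* ]]; else [[ $op == *'<'* ]]; fi
      return
    fi
  done
  [[ $op == *'='* ]]       # all fields equal: only ==, <= and >= succeed
}
Here cmp_versions 1.15 '<' 2 succeeds, which is why the lcov 1.x option set (the LCOV_OPTS block above) is selected.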
00:06:05.705 [2024-12-16 10:53:04.141771] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629183 ] 00:06:05.705 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.705 [2024-12-16 10:53:04.209887] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.705 [2024-12-16 10:53:04.245419] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:05.705 [2024-12-16 10:53:04.245530] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.649 10:53:04 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:06.649 10:53:04 -- common/autotest_common.sh@862 -- # return 0 00:06:06.649 10:53:04 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:06.649 10:53:05 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 629183 00:06:06.649 10:53:05 -- common/autotest_common.sh@936 -- # '[' -z 629183 ']' 00:06:06.649 10:53:05 -- common/autotest_common.sh@940 -- # kill -0 629183 00:06:06.649 10:53:05 -- common/autotest_common.sh@941 -- # uname 00:06:06.649 10:53:05 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:06.649 10:53:05 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 629183 00:06:06.649 10:53:05 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:06.649 10:53:05 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:06.649 10:53:05 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 629183' 00:06:06.649 killing process with pid 629183 00:06:06.649 10:53:05 -- common/autotest_common.sh@955 -- # kill 629183 00:06:06.649 10:53:05 -- common/autotest_common.sh@960 -- # wait 629183 00:06:07.219 00:06:07.219 real 0m1.611s 00:06:07.219 user 0m1.724s 00:06:07.219 sys 0m0.476s 00:06:07.219 10:53:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:07.219 10:53:05 -- common/autotest_common.sh@10 -- # set +x 00:06:07.219 ************************************ 00:06:07.219 END TEST alias_rpc 00:06:07.219 ************************************ 00:06:07.219 10:53:05 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:06:07.219 10:53:05 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:07.219 10:53:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:07.219 10:53:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.219 10:53:05 -- common/autotest_common.sh@10 -- # set +x 00:06:07.219 ************************************ 00:06:07.219 START TEST spdkcli_tcp 00:06:07.219 ************************************ 00:06:07.219 10:53:05 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:07.219 * Looking for test storage... 
00:06:07.219 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:07.219 10:53:05 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:07.219 10:53:05 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:07.219 10:53:05 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:07.219 10:53:05 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:07.219 10:53:05 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:07.219 10:53:05 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:07.219 10:53:05 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:07.219 10:53:05 -- scripts/common.sh@335 -- # IFS=.-: 00:06:07.219 10:53:05 -- scripts/common.sh@335 -- # read -ra ver1 00:06:07.219 10:53:05 -- scripts/common.sh@336 -- # IFS=.-: 00:06:07.219 10:53:05 -- scripts/common.sh@336 -- # read -ra ver2 00:06:07.219 10:53:05 -- scripts/common.sh@337 -- # local 'op=<' 00:06:07.219 10:53:05 -- scripts/common.sh@339 -- # ver1_l=2 00:06:07.219 10:53:05 -- scripts/common.sh@340 -- # ver2_l=1 00:06:07.219 10:53:05 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:07.219 10:53:05 -- scripts/common.sh@343 -- # case "$op" in 00:06:07.219 10:53:05 -- scripts/common.sh@344 -- # : 1 00:06:07.219 10:53:05 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:07.219 10:53:05 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:07.219 10:53:05 -- scripts/common.sh@364 -- # decimal 1 00:06:07.219 10:53:05 -- scripts/common.sh@352 -- # local d=1 00:06:07.219 10:53:05 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:07.219 10:53:05 -- scripts/common.sh@354 -- # echo 1 00:06:07.219 10:53:05 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:07.219 10:53:05 -- scripts/common.sh@365 -- # decimal 2 00:06:07.219 10:53:05 -- scripts/common.sh@352 -- # local d=2 00:06:07.219 10:53:05 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:07.219 10:53:05 -- scripts/common.sh@354 -- # echo 2 00:06:07.219 10:53:05 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:07.219 10:53:05 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:07.219 10:53:05 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:07.219 10:53:05 -- scripts/common.sh@367 -- # return 0 00:06:07.220 10:53:05 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:07.220 10:53:05 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:07.220 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.220 --rc genhtml_branch_coverage=1 00:06:07.220 --rc genhtml_function_coverage=1 00:06:07.220 --rc genhtml_legend=1 00:06:07.220 --rc geninfo_all_blocks=1 00:06:07.220 --rc geninfo_unexecuted_blocks=1 00:06:07.220 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.220 ' 00:06:07.220 10:53:05 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:07.220 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.220 --rc genhtml_branch_coverage=1 00:06:07.220 --rc genhtml_function_coverage=1 00:06:07.220 --rc genhtml_legend=1 00:06:07.220 --rc geninfo_all_blocks=1 00:06:07.220 --rc geninfo_unexecuted_blocks=1 00:06:07.220 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.220 ' 00:06:07.220 10:53:05 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:07.220 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.220 --rc genhtml_branch_coverage=1 
00:06:07.220 --rc genhtml_function_coverage=1 00:06:07.220 --rc genhtml_legend=1 00:06:07.220 --rc geninfo_all_blocks=1 00:06:07.220 --rc geninfo_unexecuted_blocks=1 00:06:07.220 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.220 ' 00:06:07.220 10:53:05 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:07.220 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.220 --rc genhtml_branch_coverage=1 00:06:07.220 --rc genhtml_function_coverage=1 00:06:07.220 --rc genhtml_legend=1 00:06:07.220 --rc geninfo_all_blocks=1 00:06:07.220 --rc geninfo_unexecuted_blocks=1 00:06:07.220 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.220 ' 00:06:07.220 10:53:05 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:07.220 10:53:05 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:07.220 10:53:05 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:07.220 10:53:05 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:07.220 10:53:05 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:07.220 10:53:05 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:07.220 10:53:05 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:07.220 10:53:05 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:07.220 10:53:05 -- common/autotest_common.sh@10 -- # set +x 00:06:07.220 10:53:05 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=629515 00:06:07.220 10:53:05 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:07.220 10:53:05 -- spdkcli/tcp.sh@27 -- # waitforlisten 629515 00:06:07.220 10:53:05 -- common/autotest_common.sh@829 -- # '[' -z 629515 ']' 00:06:07.220 10:53:05 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:07.220 10:53:05 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:07.220 10:53:05 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:07.220 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:07.220 10:53:05 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:07.220 10:53:05 -- common/autotest_common.sh@10 -- # set +x 00:06:07.220 [2024-12-16 10:53:05.805916] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
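tcp.sh arms an error trap before launching the target (tcp.sh@21 above) and only disarms it after the run succeeds (tcp.sh@37 further down); a sketch of that pattern, with a hypothetical err_cleanup body since the function itself is never traced in this log:
err_cleanup() {
  # hypothetical teardown: kill whatever the test had started
  [[ -n ${socat_pid:-} ]] && kill "$socat_pid" 2>/dev/null
  [[ -n ${spdk_tgt_pid:-} ]] && kill -SIGINT "$spdk_tgt_pid" 2>/dev/null
}
trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT   # any failure or signal tears everything down
# ... RPC exercises run here ...
trap - SIGINT SIGTERM EXIT                       # happy path: disarm before normal exit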
00:06:07.220 [2024-12-16 10:53:05.805980] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629515 ] 00:06:07.220 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.479 [2024-12-16 10:53:05.871974] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:07.479 [2024-12-16 10:53:05.910006] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:07.479 [2024-12-16 10:53:05.910155] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:07.479 [2024-12-16 10:53:05.910157] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.048 10:53:06 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:08.048 10:53:06 -- common/autotest_common.sh@862 -- # return 0 00:06:08.048 10:53:06 -- spdkcli/tcp.sh@31 -- # socat_pid=629716 00:06:08.048 10:53:06 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:08.048 10:53:06 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:08.308 [ 00:06:08.308 "spdk_get_version", 00:06:08.308 "rpc_get_methods", 00:06:08.308 "trace_get_info", 00:06:08.308 "trace_get_tpoint_group_mask", 00:06:08.308 "trace_disable_tpoint_group", 00:06:08.308 "trace_enable_tpoint_group", 00:06:08.308 "trace_clear_tpoint_mask", 00:06:08.308 "trace_set_tpoint_mask", 00:06:08.308 "vfu_tgt_set_base_path", 00:06:08.308 "framework_get_pci_devices", 00:06:08.308 "framework_get_config", 00:06:08.308 "framework_get_subsystems", 00:06:08.308 "iobuf_get_stats", 00:06:08.308 "iobuf_set_options", 00:06:08.308 "sock_set_default_impl", 00:06:08.308 "sock_impl_set_options", 00:06:08.308 "sock_impl_get_options", 00:06:08.308 "vmd_rescan", 00:06:08.308 "vmd_remove_device", 00:06:08.308 "vmd_enable", 00:06:08.308 "accel_get_stats", 00:06:08.308 "accel_set_options", 00:06:08.308 "accel_set_driver", 00:06:08.308 "accel_crypto_key_destroy", 00:06:08.308 "accel_crypto_keys_get", 00:06:08.308 "accel_crypto_key_create", 00:06:08.308 "accel_assign_opc", 00:06:08.308 "accel_get_module_info", 00:06:08.308 "accel_get_opc_assignments", 00:06:08.308 "notify_get_notifications", 00:06:08.308 "notify_get_types", 00:06:08.308 "bdev_get_histogram", 00:06:08.308 "bdev_enable_histogram", 00:06:08.308 "bdev_set_qos_limit", 00:06:08.308 "bdev_set_qd_sampling_period", 00:06:08.308 "bdev_get_bdevs", 00:06:08.308 "bdev_reset_iostat", 00:06:08.308 "bdev_get_iostat", 00:06:08.308 "bdev_examine", 00:06:08.308 "bdev_wait_for_examine", 00:06:08.308 "bdev_set_options", 00:06:08.308 "scsi_get_devices", 00:06:08.308 "thread_set_cpumask", 00:06:08.308 "framework_get_scheduler", 00:06:08.308 "framework_set_scheduler", 00:06:08.308 "framework_get_reactors", 00:06:08.308 "thread_get_io_channels", 00:06:08.308 "thread_get_pollers", 00:06:08.308 "thread_get_stats", 00:06:08.308 "framework_monitor_context_switch", 00:06:08.308 "spdk_kill_instance", 00:06:08.308 "log_enable_timestamps", 00:06:08.308 "log_get_flags", 00:06:08.308 "log_clear_flag", 00:06:08.308 "log_set_flag", 00:06:08.308 "log_get_level", 00:06:08.308 "log_set_level", 00:06:08.308 "log_get_print_level", 00:06:08.308 "log_set_print_level", 00:06:08.308 "framework_enable_cpumask_locks", 00:06:08.308 "framework_disable_cpumask_locks", 00:06:08.308 "framework_wait_init", 00:06:08.308 
"framework_start_init", 00:06:08.308 "virtio_blk_create_transport", 00:06:08.308 "virtio_blk_get_transports", 00:06:08.308 "vhost_controller_set_coalescing", 00:06:08.308 "vhost_get_controllers", 00:06:08.308 "vhost_delete_controller", 00:06:08.308 "vhost_create_blk_controller", 00:06:08.308 "vhost_scsi_controller_remove_target", 00:06:08.308 "vhost_scsi_controller_add_target", 00:06:08.308 "vhost_start_scsi_controller", 00:06:08.308 "vhost_create_scsi_controller", 00:06:08.308 "ublk_recover_disk", 00:06:08.308 "ublk_get_disks", 00:06:08.308 "ublk_stop_disk", 00:06:08.308 "ublk_start_disk", 00:06:08.308 "ublk_destroy_target", 00:06:08.308 "ublk_create_target", 00:06:08.308 "nbd_get_disks", 00:06:08.308 "nbd_stop_disk", 00:06:08.308 "nbd_start_disk", 00:06:08.308 "env_dpdk_get_mem_stats", 00:06:08.308 "nvmf_subsystem_get_listeners", 00:06:08.308 "nvmf_subsystem_get_qpairs", 00:06:08.308 "nvmf_subsystem_get_controllers", 00:06:08.308 "nvmf_get_stats", 00:06:08.308 "nvmf_get_transports", 00:06:08.308 "nvmf_create_transport", 00:06:08.308 "nvmf_get_targets", 00:06:08.308 "nvmf_delete_target", 00:06:08.308 "nvmf_create_target", 00:06:08.308 "nvmf_subsystem_allow_any_host", 00:06:08.308 "nvmf_subsystem_remove_host", 00:06:08.308 "nvmf_subsystem_add_host", 00:06:08.308 "nvmf_subsystem_remove_ns", 00:06:08.308 "nvmf_subsystem_add_ns", 00:06:08.308 "nvmf_subsystem_listener_set_ana_state", 00:06:08.308 "nvmf_discovery_get_referrals", 00:06:08.308 "nvmf_discovery_remove_referral", 00:06:08.308 "nvmf_discovery_add_referral", 00:06:08.308 "nvmf_subsystem_remove_listener", 00:06:08.308 "nvmf_subsystem_add_listener", 00:06:08.308 "nvmf_delete_subsystem", 00:06:08.308 "nvmf_create_subsystem", 00:06:08.308 "nvmf_get_subsystems", 00:06:08.308 "nvmf_set_crdt", 00:06:08.308 "nvmf_set_config", 00:06:08.308 "nvmf_set_max_subsystems", 00:06:08.308 "iscsi_set_options", 00:06:08.308 "iscsi_get_auth_groups", 00:06:08.308 "iscsi_auth_group_remove_secret", 00:06:08.308 "iscsi_auth_group_add_secret", 00:06:08.308 "iscsi_delete_auth_group", 00:06:08.308 "iscsi_create_auth_group", 00:06:08.308 "iscsi_set_discovery_auth", 00:06:08.308 "iscsi_get_options", 00:06:08.308 "iscsi_target_node_request_logout", 00:06:08.308 "iscsi_target_node_set_redirect", 00:06:08.308 "iscsi_target_node_set_auth", 00:06:08.308 "iscsi_target_node_add_lun", 00:06:08.308 "iscsi_get_connections", 00:06:08.308 "iscsi_portal_group_set_auth", 00:06:08.308 "iscsi_start_portal_group", 00:06:08.308 "iscsi_delete_portal_group", 00:06:08.308 "iscsi_create_portal_group", 00:06:08.308 "iscsi_get_portal_groups", 00:06:08.308 "iscsi_delete_target_node", 00:06:08.308 "iscsi_target_node_remove_pg_ig_maps", 00:06:08.308 "iscsi_target_node_add_pg_ig_maps", 00:06:08.308 "iscsi_create_target_node", 00:06:08.308 "iscsi_get_target_nodes", 00:06:08.308 "iscsi_delete_initiator_group", 00:06:08.308 "iscsi_initiator_group_remove_initiators", 00:06:08.308 "iscsi_initiator_group_add_initiators", 00:06:08.308 "iscsi_create_initiator_group", 00:06:08.308 "iscsi_get_initiator_groups", 00:06:08.308 "vfu_virtio_create_scsi_endpoint", 00:06:08.308 "vfu_virtio_scsi_remove_target", 00:06:08.308 "vfu_virtio_scsi_add_target", 00:06:08.308 "vfu_virtio_create_blk_endpoint", 00:06:08.308 "vfu_virtio_delete_endpoint", 00:06:08.308 "iaa_scan_accel_module", 00:06:08.308 "dsa_scan_accel_module", 00:06:08.308 "ioat_scan_accel_module", 00:06:08.308 "accel_error_inject_error", 00:06:08.308 "bdev_iscsi_delete", 00:06:08.308 "bdev_iscsi_create", 00:06:08.308 "bdev_iscsi_set_options", 
00:06:08.308 "bdev_virtio_attach_controller", 00:06:08.308 "bdev_virtio_scsi_get_devices", 00:06:08.308 "bdev_virtio_detach_controller", 00:06:08.308 "bdev_virtio_blk_set_hotplug", 00:06:08.308 "bdev_ftl_set_property", 00:06:08.308 "bdev_ftl_get_properties", 00:06:08.308 "bdev_ftl_get_stats", 00:06:08.308 "bdev_ftl_unmap", 00:06:08.308 "bdev_ftl_unload", 00:06:08.308 "bdev_ftl_delete", 00:06:08.308 "bdev_ftl_load", 00:06:08.308 "bdev_ftl_create", 00:06:08.308 "bdev_aio_delete", 00:06:08.308 "bdev_aio_rescan", 00:06:08.308 "bdev_aio_create", 00:06:08.308 "blobfs_create", 00:06:08.308 "blobfs_detect", 00:06:08.308 "blobfs_set_cache_size", 00:06:08.308 "bdev_zone_block_delete", 00:06:08.308 "bdev_zone_block_create", 00:06:08.308 "bdev_delay_delete", 00:06:08.308 "bdev_delay_create", 00:06:08.308 "bdev_delay_update_latency", 00:06:08.308 "bdev_split_delete", 00:06:08.308 "bdev_split_create", 00:06:08.308 "bdev_error_inject_error", 00:06:08.308 "bdev_error_delete", 00:06:08.308 "bdev_error_create", 00:06:08.308 "bdev_raid_set_options", 00:06:08.308 "bdev_raid_remove_base_bdev", 00:06:08.308 "bdev_raid_add_base_bdev", 00:06:08.308 "bdev_raid_delete", 00:06:08.308 "bdev_raid_create", 00:06:08.308 "bdev_raid_get_bdevs", 00:06:08.308 "bdev_lvol_grow_lvstore", 00:06:08.308 "bdev_lvol_get_lvols", 00:06:08.308 "bdev_lvol_get_lvstores", 00:06:08.308 "bdev_lvol_delete", 00:06:08.308 "bdev_lvol_set_read_only", 00:06:08.308 "bdev_lvol_resize", 00:06:08.308 "bdev_lvol_decouple_parent", 00:06:08.308 "bdev_lvol_inflate", 00:06:08.308 "bdev_lvol_rename", 00:06:08.308 "bdev_lvol_clone_bdev", 00:06:08.308 "bdev_lvol_clone", 00:06:08.308 "bdev_lvol_snapshot", 00:06:08.309 "bdev_lvol_create", 00:06:08.309 "bdev_lvol_delete_lvstore", 00:06:08.309 "bdev_lvol_rename_lvstore", 00:06:08.309 "bdev_lvol_create_lvstore", 00:06:08.309 "bdev_passthru_delete", 00:06:08.309 "bdev_passthru_create", 00:06:08.309 "bdev_nvme_cuse_unregister", 00:06:08.309 "bdev_nvme_cuse_register", 00:06:08.309 "bdev_opal_new_user", 00:06:08.309 "bdev_opal_set_lock_state", 00:06:08.309 "bdev_opal_delete", 00:06:08.309 "bdev_opal_get_info", 00:06:08.309 "bdev_opal_create", 00:06:08.309 "bdev_nvme_opal_revert", 00:06:08.309 "bdev_nvme_opal_init", 00:06:08.309 "bdev_nvme_send_cmd", 00:06:08.309 "bdev_nvme_get_path_iostat", 00:06:08.309 "bdev_nvme_get_mdns_discovery_info", 00:06:08.309 "bdev_nvme_stop_mdns_discovery", 00:06:08.309 "bdev_nvme_start_mdns_discovery", 00:06:08.309 "bdev_nvme_set_multipath_policy", 00:06:08.309 "bdev_nvme_set_preferred_path", 00:06:08.309 "bdev_nvme_get_io_paths", 00:06:08.309 "bdev_nvme_remove_error_injection", 00:06:08.309 "bdev_nvme_add_error_injection", 00:06:08.309 "bdev_nvme_get_discovery_info", 00:06:08.309 "bdev_nvme_stop_discovery", 00:06:08.309 "bdev_nvme_start_discovery", 00:06:08.309 "bdev_nvme_get_controller_health_info", 00:06:08.309 "bdev_nvme_disable_controller", 00:06:08.309 "bdev_nvme_enable_controller", 00:06:08.309 "bdev_nvme_reset_controller", 00:06:08.309 "bdev_nvme_get_transport_statistics", 00:06:08.309 "bdev_nvme_apply_firmware", 00:06:08.309 "bdev_nvme_detach_controller", 00:06:08.309 "bdev_nvme_get_controllers", 00:06:08.309 "bdev_nvme_attach_controller", 00:06:08.309 "bdev_nvme_set_hotplug", 00:06:08.309 "bdev_nvme_set_options", 00:06:08.309 "bdev_null_resize", 00:06:08.309 "bdev_null_delete", 00:06:08.309 "bdev_null_create", 00:06:08.309 "bdev_malloc_delete", 00:06:08.309 "bdev_malloc_create" 00:06:08.309 ] 00:06:08.309 10:53:06 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:06:08.309 10:53:06 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:08.309 10:53:06 -- common/autotest_common.sh@10 -- # set +x 00:06:08.309 10:53:06 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:08.309 10:53:06 -- spdkcli/tcp.sh@38 -- # killprocess 629515 00:06:08.309 10:53:06 -- common/autotest_common.sh@936 -- # '[' -z 629515 ']' 00:06:08.309 10:53:06 -- common/autotest_common.sh@940 -- # kill -0 629515 00:06:08.309 10:53:06 -- common/autotest_common.sh@941 -- # uname 00:06:08.309 10:53:06 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:08.309 10:53:06 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 629515 00:06:08.309 10:53:06 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:08.309 10:53:06 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:08.309 10:53:06 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 629515' 00:06:08.309 killing process with pid 629515 00:06:08.309 10:53:06 -- common/autotest_common.sh@955 -- # kill 629515 00:06:08.309 10:53:06 -- common/autotest_common.sh@960 -- # wait 629515 00:06:08.879 00:06:08.879 real 0m1.624s 00:06:08.879 user 0m2.999s 00:06:08.879 sys 0m0.498s 00:06:08.879 10:53:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:08.879 10:53:07 -- common/autotest_common.sh@10 -- # set +x 00:06:08.879 ************************************ 00:06:08.879 END TEST spdkcli_tcp 00:06:08.879 ************************************ 00:06:08.879 10:53:07 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:08.879 10:53:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:08.879 10:53:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.879 10:53:07 -- common/autotest_common.sh@10 -- # set +x 00:06:08.879 ************************************ 00:06:08.879 START TEST dpdk_mem_utility 00:06:08.879 ************************************ 00:06:08.879 10:53:07 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:08.879 * Looking for test storage... 
00:06:08.879 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:08.879 10:53:07 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:08.879 10:53:07 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:08.879 10:53:07 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:08.879 10:53:07 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:08.879 10:53:07 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:08.879 10:53:07 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:08.879 10:53:07 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:08.879 10:53:07 -- scripts/common.sh@335 -- # IFS=.-: 00:06:08.879 10:53:07 -- scripts/common.sh@335 -- # read -ra ver1 00:06:08.879 10:53:07 -- scripts/common.sh@336 -- # IFS=.-: 00:06:08.879 10:53:07 -- scripts/common.sh@336 -- # read -ra ver2 00:06:08.879 10:53:07 -- scripts/common.sh@337 -- # local 'op=<' 00:06:08.879 10:53:07 -- scripts/common.sh@339 -- # ver1_l=2 00:06:08.879 10:53:07 -- scripts/common.sh@340 -- # ver2_l=1 00:06:08.879 10:53:07 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:08.879 10:53:07 -- scripts/common.sh@343 -- # case "$op" in 00:06:08.879 10:53:07 -- scripts/common.sh@344 -- # : 1 00:06:08.879 10:53:07 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:08.879 10:53:07 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:08.879 10:53:07 -- scripts/common.sh@364 -- # decimal 1 00:06:08.879 10:53:07 -- scripts/common.sh@352 -- # local d=1 00:06:08.879 10:53:07 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:08.879 10:53:07 -- scripts/common.sh@354 -- # echo 1 00:06:08.879 10:53:07 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:08.879 10:53:07 -- scripts/common.sh@365 -- # decimal 2 00:06:08.879 10:53:07 -- scripts/common.sh@352 -- # local d=2 00:06:08.879 10:53:07 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:08.879 10:53:07 -- scripts/common.sh@354 -- # echo 2 00:06:08.879 10:53:07 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:08.879 10:53:07 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:08.879 10:53:07 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:08.879 10:53:07 -- scripts/common.sh@367 -- # return 0 00:06:08.879 10:53:07 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:08.879 10:53:07 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:08.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.879 --rc genhtml_branch_coverage=1 00:06:08.879 --rc genhtml_function_coverage=1 00:06:08.879 --rc genhtml_legend=1 00:06:08.879 --rc geninfo_all_blocks=1 00:06:08.879 --rc geninfo_unexecuted_blocks=1 00:06:08.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:08.879 ' 00:06:08.879 10:53:07 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:08.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.879 --rc genhtml_branch_coverage=1 00:06:08.879 --rc genhtml_function_coverage=1 00:06:08.879 --rc genhtml_legend=1 00:06:08.879 --rc geninfo_all_blocks=1 00:06:08.879 --rc geninfo_unexecuted_blocks=1 00:06:08.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:08.879 ' 00:06:08.879 10:53:07 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:08.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.879 --rc 
genhtml_branch_coverage=1 00:06:08.879 --rc genhtml_function_coverage=1 00:06:08.879 --rc genhtml_legend=1 00:06:08.879 --rc geninfo_all_blocks=1 00:06:08.879 --rc geninfo_unexecuted_blocks=1 00:06:08.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:08.879 ' 00:06:08.879 10:53:07 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:08.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.879 --rc genhtml_branch_coverage=1 00:06:08.879 --rc genhtml_function_coverage=1 00:06:08.879 --rc genhtml_legend=1 00:06:08.879 --rc geninfo_all_blocks=1 00:06:08.879 --rc geninfo_unexecuted_blocks=1 00:06:08.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:08.879 ' 00:06:08.879 10:53:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:08.879 10:53:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=629856 00:06:08.879 10:53:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 629856 00:06:08.879 10:53:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:08.879 10:53:07 -- common/autotest_common.sh@829 -- # '[' -z 629856 ']' 00:06:08.879 10:53:07 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:08.879 10:53:07 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:08.879 10:53:07 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:08.879 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:08.879 10:53:07 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:08.879 10:53:07 -- common/autotest_common.sh@10 -- # set +x 00:06:08.879 [2024-12-16 10:53:07.464011] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
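The dpdk_mem_utility run traced below boils down to three steps; a condensed sketch (rpc_cmd is the suite's rpc.py wrapper, inferred here only from its trace tag):
rpc_cmd env_dpdk_get_mem_stats       # target snapshots its DPDK heap state
# -> { "filename": "/tmp/spdk_mem_dump.txt" }
scripts/dpdk_mem_info.py             # summarize heaps, mempools and memzones from the dump
scripts/dpdk_mem_info.py -m 0        # per-element detail for heap id 0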
00:06:08.879 [2024-12-16 10:53:07.464079] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629856 ] 00:06:08.879 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.139 [2024-12-16 10:53:07.531263] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.139 [2024-12-16 10:53:07.567060] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:09.139 [2024-12-16 10:53:07.567178] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.707 10:53:08 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:09.707 10:53:08 -- common/autotest_common.sh@862 -- # return 0 00:06:09.707 10:53:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:09.707 10:53:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:09.707 10:53:08 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:09.707 10:53:08 -- common/autotest_common.sh@10 -- # set +x 00:06:09.707 { 00:06:09.707 "filename": "/tmp/spdk_mem_dump.txt" 00:06:09.707 } 00:06:09.707 10:53:08 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:09.707 10:53:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:09.967 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:09.967 1 heaps totaling size 814.000000 MiB 00:06:09.967 size: 814.000000 MiB heap id: 0 00:06:09.967 end heaps---------- 00:06:09.967 8 mempools totaling size 598.116089 MiB 00:06:09.967 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:09.967 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:09.967 size: 84.521057 MiB name: bdev_io_629856 00:06:09.967 size: 51.011292 MiB name: evtpool_629856 00:06:09.967 size: 50.003479 MiB name: msgpool_629856 00:06:09.967 size: 21.763794 MiB name: PDU_Pool 00:06:09.967 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:09.967 size: 0.026123 MiB name: Session_Pool 00:06:09.967 end mempools------- 00:06:09.967 6 memzones totaling size 4.142822 MiB 00:06:09.967 size: 1.000366 MiB name: RG_ring_0_629856 00:06:09.967 size: 1.000366 MiB name: RG_ring_1_629856 00:06:09.967 size: 1.000366 MiB name: RG_ring_4_629856 00:06:09.967 size: 1.000366 MiB name: RG_ring_5_629856 00:06:09.967 size: 0.125366 MiB name: RG_ring_2_629856 00:06:09.967 size: 0.015991 MiB name: RG_ring_3_629856 00:06:09.967 end memzones------- 00:06:09.967 10:53:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:09.967 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:06:09.967 list of free elements. 
size: 12.519348 MiB 00:06:09.967 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:09.967 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:09.967 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:09.967 element at address: 0x200003e00000 with size: 0.996277 MiB 00:06:09.967 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:09.967 element at address: 0x200013800000 with size: 0.978699 MiB 00:06:09.967 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:09.967 element at address: 0x200019200000 with size: 0.936584 MiB 00:06:09.967 element at address: 0x200000200000 with size: 0.841614 MiB 00:06:09.967 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:06:09.967 element at address: 0x20000b200000 with size: 0.490723 MiB 00:06:09.967 element at address: 0x200000800000 with size: 0.487793 MiB 00:06:09.967 element at address: 0x200019400000 with size: 0.485657 MiB 00:06:09.967 element at address: 0x200027e00000 with size: 0.410034 MiB 00:06:09.967 element at address: 0x200003a00000 with size: 0.355530 MiB 00:06:09.967 list of standard malloc elements. size: 199.218079 MiB 00:06:09.967 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:09.967 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:09.967 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:09.967 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:09.967 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:09.967 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:09.967 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:09.967 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:09.967 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:09.967 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:06:09.967 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:06:09.967 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:06:09.967 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:06:09.967 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:06:09.967 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:09.967 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:09.967 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:09.967 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:09.967 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:09.967 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:06:09.967 element at address: 0x200003adb300 with size: 0.000183 MiB 00:06:09.967 element at address: 0x200003adb500 with size: 0.000183 MiB 00:06:09.967 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:06:09.967 element at address: 0x200003affa80 with size: 0.000183 MiB 00:06:09.967 element at address: 0x200003affb40 with size: 0.000183 MiB 00:06:09.967 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:06:09.967 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:09.967 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:06:09.967 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:06:09.967 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:09.968 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:06:09.968 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:06:09.968 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:06:09.968 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:06:09.968 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:09.968 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:09.968 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:06:09.968 element at address: 0x200027e69040 with size: 0.000183 MiB 00:06:09.968 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:06:09.968 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:09.968 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:09.968 list of memzone associated elements. size: 602.262573 MiB 00:06:09.968 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:09.968 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:09.968 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:09.968 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:09.968 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:09.968 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_629856_0 00:06:09.968 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:09.968 associated memzone info: size: 48.002930 MiB name: MP_evtpool_629856_0 00:06:09.968 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:09.968 associated memzone info: size: 48.002930 MiB name: MP_msgpool_629856_0 00:06:09.968 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:09.968 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:09.968 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:09.968 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:09.968 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:09.968 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_629856 00:06:09.968 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:09.968 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_629856 00:06:09.968 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:09.968 associated memzone info: size: 1.007996 MiB name: MP_evtpool_629856 00:06:09.968 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:09.968 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:09.968 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:09.968 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:09.968 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:09.968 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:09.968 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:09.968 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:09.968 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:09.968 associated memzone info: size: 1.000366 MiB name: RG_ring_0_629856 00:06:09.968 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:09.968 associated memzone info: size: 1.000366 MiB name: RG_ring_1_629856 00:06:09.968 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:09.968 associated memzone info: size: 1.000366 MiB name: RG_ring_4_629856 00:06:09.968 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:09.968 associated memzone info: size: 1.000366 MiB name: RG_ring_5_629856 00:06:09.968 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:06:09.968 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_629856 00:06:09.968 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:06:09.968 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:09.968 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:09.968 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:09.968 element at address: 0x20001947c540 with size: 0.250488 MiB 00:06:09.968 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:09.968 element at address: 0x200003adf880 with size: 0.125488 MiB 00:06:09.968 associated memzone info: size: 0.125366 MiB name: RG_ring_2_629856 00:06:09.968 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:09.968 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:09.968 element at address: 0x200027e69100 with size: 0.023743 MiB 00:06:09.968 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:09.968 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:06:09.968 associated memzone info: size: 0.015991 MiB name: RG_ring_3_629856 00:06:09.968 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:06:09.968 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:09.968 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:06:09.968 associated memzone info: size: 0.000183 MiB name: MP_msgpool_629856 00:06:09.968 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:06:09.968 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_629856 00:06:09.968 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:06:09.968 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:09.968 10:53:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:09.968 10:53:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 629856 00:06:09.968 10:53:08 -- common/autotest_common.sh@936 -- # '[' -z 629856 ']' 00:06:09.968 10:53:08 -- common/autotest_common.sh@940 -- # kill -0 629856 00:06:09.968 10:53:08 -- common/autotest_common.sh@941 -- # uname 00:06:09.968 10:53:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:09.968 10:53:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 629856 00:06:09.968 10:53:08 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:09.968 10:53:08 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:09.968 10:53:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 629856' 00:06:09.968 killing process with pid 629856 00:06:09.968 10:53:08 -- common/autotest_common.sh@955 -- # kill 629856 00:06:09.968 10:53:08 -- common/autotest_common.sh@960 -- # wait 629856 00:06:10.228 00:06:10.228 real 0m1.522s 00:06:10.228 user 0m1.569s 00:06:10.228 sys 0m0.471s 00:06:10.228 10:53:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:10.228 10:53:08 -- common/autotest_common.sh@10 -- # set +x 00:06:10.228 ************************************ 00:06:10.228 END TEST dpdk_mem_utility 00:06:10.228 ************************************ 00:06:10.228 10:53:08 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:10.228 10:53:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:10.228 10:53:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:10.228 10:53:08 -- common/autotest_common.sh@10 -- # set +x 00:06:10.228 
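Every target above is torn down through the same killprocess helper (pids 629183, 629515 and 629856 in this log); a condensed sketch reconstructed from the repeated autotest_common.sh trace, with the sudo branch an assumption since only the plain branch runs here:
killprocess() {
  local pid=$1 process_name
  [ -n "$pid" ] || return 1
  kill -0 "$pid" || return 1                       # must still be alive
  process_name=$(ps --no-headers -o comm= "$pid")  # 'reactor_0' for spdk_tgt in this log
  if [ "$process_name" = sudo ]; then
    sudo kill "$pid"                               # assumed branch for sudo-wrapped targets
  else
    kill "$pid"
  fi
  echo "killing process with pid $pid"
  wait "$pid" || true                              # reap so sockets and hugepages free up
}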
************************************ 00:06:10.228 START TEST event 00:06:10.228 ************************************ 00:06:10.228 10:53:08 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:10.488 * Looking for test storage... 00:06:10.488 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:10.488 10:53:08 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:10.488 10:53:08 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:10.488 10:53:08 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:10.488 10:53:08 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:10.488 10:53:08 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:10.488 10:53:08 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:10.488 10:53:08 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:10.488 10:53:08 -- scripts/common.sh@335 -- # IFS=.-: 00:06:10.488 10:53:08 -- scripts/common.sh@335 -- # read -ra ver1 00:06:10.488 10:53:08 -- scripts/common.sh@336 -- # IFS=.-: 00:06:10.488 10:53:08 -- scripts/common.sh@336 -- # read -ra ver2 00:06:10.488 10:53:08 -- scripts/common.sh@337 -- # local 'op=<' 00:06:10.488 10:53:08 -- scripts/common.sh@339 -- # ver1_l=2 00:06:10.488 10:53:08 -- scripts/common.sh@340 -- # ver2_l=1 00:06:10.488 10:53:08 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:10.488 10:53:08 -- scripts/common.sh@343 -- # case "$op" in 00:06:10.488 10:53:08 -- scripts/common.sh@344 -- # : 1 00:06:10.488 10:53:08 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:10.488 10:53:08 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:10.488 10:53:08 -- scripts/common.sh@364 -- # decimal 1 00:06:10.488 10:53:08 -- scripts/common.sh@352 -- # local d=1 00:06:10.488 10:53:08 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:10.488 10:53:08 -- scripts/common.sh@354 -- # echo 1 00:06:10.488 10:53:08 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:10.488 10:53:08 -- scripts/common.sh@365 -- # decimal 2 00:06:10.488 10:53:08 -- scripts/common.sh@352 -- # local d=2 00:06:10.488 10:53:09 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:10.488 10:53:09 -- scripts/common.sh@354 -- # echo 2 00:06:10.488 10:53:09 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:10.488 10:53:09 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:10.488 10:53:09 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:10.488 10:53:09 -- scripts/common.sh@367 -- # return 0 00:06:10.488 10:53:09 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:10.488 10:53:09 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:10.488 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:10.488 --rc genhtml_branch_coverage=1 00:06:10.488 --rc genhtml_function_coverage=1 00:06:10.488 --rc genhtml_legend=1 00:06:10.488 --rc geninfo_all_blocks=1 00:06:10.488 --rc geninfo_unexecuted_blocks=1 00:06:10.488 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:10.488 ' 00:06:10.488 10:53:09 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:10.488 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:10.488 --rc genhtml_branch_coverage=1 00:06:10.488 --rc genhtml_function_coverage=1 00:06:10.488 --rc genhtml_legend=1 00:06:10.488 --rc geninfo_all_blocks=1 00:06:10.488 --rc geninfo_unexecuted_blocks=1 00:06:10.488 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:10.488 ' 00:06:10.488 10:53:09 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:10.488 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:10.488 --rc genhtml_branch_coverage=1 00:06:10.488 --rc genhtml_function_coverage=1 00:06:10.488 --rc genhtml_legend=1 00:06:10.488 --rc geninfo_all_blocks=1 00:06:10.488 --rc geninfo_unexecuted_blocks=1 00:06:10.488 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:10.488 ' 00:06:10.488 10:53:09 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:10.488 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:10.488 --rc genhtml_branch_coverage=1 00:06:10.488 --rc genhtml_function_coverage=1 00:06:10.488 --rc genhtml_legend=1 00:06:10.488 --rc geninfo_all_blocks=1 00:06:10.488 --rc geninfo_unexecuted_blocks=1 00:06:10.488 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:10.488 ' 00:06:10.488 10:53:09 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:10.488 10:53:09 -- bdev/nbd_common.sh@6 -- # set -e 00:06:10.488 10:53:09 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:10.488 10:53:09 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:10.488 10:53:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:10.488 10:53:09 -- common/autotest_common.sh@10 -- # set +x 00:06:10.488 ************************************ 00:06:10.488 START TEST event_perf 00:06:10.488 ************************************ 00:06:10.488 10:53:09 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:10.488 Running I/O for 1 seconds...[2024-12-16 10:53:09.027081] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:10.488 [2024-12-16 10:53:09.027165] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630196 ] 00:06:10.488 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.488 [2024-12-16 10:53:09.096152] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:10.748 [2024-12-16 10:53:09.135291] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:10.748 [2024-12-16 10:53:09.135387] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:10.748 [2024-12-16 10:53:09.135471] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:10.748 [2024-12-16 10:53:09.135473] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.686 Running I/O for 1 seconds... 00:06:11.686 lcore 0: 202487 00:06:11.686 lcore 1: 202486 00:06:11.686 lcore 2: 202487 00:06:11.686 lcore 3: 202487 00:06:11.686 done. 
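event_perf above was launched with -m 0xF and prints four lcore counters; the coremask is one bit per core, which this throwaway snippet decodes (illustrative only, not part of the suite):
mask=0xF
for ((core = 0; core < 8; core++)); do
  (( (mask >> core) & 1 )) && echo "lcore $core enabled"   # 0xF -> cores 0-3
done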
00:06:11.686 00:06:11.686 real 0m1.182s 00:06:11.686 user 0m4.093s 00:06:11.686 sys 0m0.086s 00:06:11.686 10:53:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:11.686 10:53:10 -- common/autotest_common.sh@10 -- # set +x 00:06:11.686 ************************************ 00:06:11.686 END TEST event_perf 00:06:11.686 ************************************ 00:06:11.686 10:53:10 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:11.686 10:53:10 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:11.686 10:53:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:11.686 10:53:10 -- common/autotest_common.sh@10 -- # set +x 00:06:11.686 ************************************ 00:06:11.686 START TEST event_reactor 00:06:11.686 ************************************ 00:06:11.686 10:53:10 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:11.686 [2024-12-16 10:53:10.255499] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:11.686 [2024-12-16 10:53:10.255586] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630483 ] 00:06:11.686 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.946 [2024-12-16 10:53:10.324822] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.946 [2024-12-16 10:53:10.359885] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.883 test_start 00:06:12.883 oneshot 00:06:12.883 tick 100 00:06:12.883 tick 100 00:06:12.883 tick 250 00:06:12.883 tick 100 00:06:12.883 tick 100 00:06:12.883 tick 250 00:06:12.883 tick 500 00:06:12.883 tick 100 00:06:12.883 tick 100 00:06:12.883 tick 100 00:06:12.883 tick 250 00:06:12.883 tick 100 00:06:12.883 tick 100 00:06:12.883 test_end 00:06:12.883 00:06:12.883 real 0m1.177s 00:06:12.883 user 0m1.089s 00:06:12.883 sys 0m0.083s 00:06:12.883 10:53:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:12.883 10:53:11 -- common/autotest_common.sh@10 -- # set +x 00:06:12.883 ************************************ 00:06:12.883 END TEST event_reactor 00:06:12.883 ************************************ 00:06:12.883 10:53:11 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:12.883 10:53:11 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:12.884 10:53:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.884 10:53:11 -- common/autotest_common.sh@10 -- # set +x 00:06:12.884 ************************************ 00:06:12.884 START TEST event_reactor_perf 00:06:12.884 ************************************ 00:06:12.884 10:53:11 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:12.884 [2024-12-16 10:53:11.466838] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
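Each START TEST/END TEST banner pair in this log comes from the run_test wrapper; a minimal sketch assuming the shape visible in the trace (the real common/autotest_common.sh helper also guards its arguments, as the '[' 4 -le 1 ']' lines show, and records per-test timing):
run_test() {
  local test_name=$1; shift
  [ $# -le 0 ] && return 1   # traced as '[' N -le 1 ']' before unpacking (assumed polarity)
  echo '************************************'
  echo "START TEST $test_name"
  echo '************************************'
  time "$@"                  # assumed source of the real/user/sys lines after each test
  echo '************************************'
  echo "END TEST $test_name"
  echo '************************************'
}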
00:06:12.884 [2024-12-16 10:53:11.466900] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630765 ] 00:06:12.884 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.143 [2024-12-16 10:53:11.530940] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.143 [2024-12-16 10:53:11.565208] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.081 test_start 00:06:14.081 test_end 00:06:14.081 Performance: 965623 events per second 00:06:14.081 00:06:14.081 real 0m1.163s 00:06:14.081 user 0m1.079s 00:06:14.081 sys 0m0.081s 00:06:14.081 10:53:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:14.081 10:53:12 -- common/autotest_common.sh@10 -- # set +x 00:06:14.081 ************************************ 00:06:14.081 END TEST event_reactor_perf 00:06:14.081 ************************************ 00:06:14.081 10:53:12 -- event/event.sh@49 -- # uname -s 00:06:14.081 10:53:12 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:14.081 10:53:12 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:14.081 10:53:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:14.081 10:53:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.081 10:53:12 -- common/autotest_common.sh@10 -- # set +x 00:06:14.081 ************************************ 00:06:14.081 START TEST event_scheduler 00:06:14.081 ************************************ 00:06:14.081 10:53:12 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:14.341 * Looking for test storage... 00:06:14.341 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:14.341 10:53:12 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:14.341 10:53:12 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:14.341 10:53:12 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:14.341 10:53:12 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:14.341 10:53:12 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:14.341 10:53:12 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:14.341 10:53:12 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:14.341 10:53:12 -- scripts/common.sh@335 -- # IFS=.-: 00:06:14.341 10:53:12 -- scripts/common.sh@335 -- # read -ra ver1 00:06:14.341 10:53:12 -- scripts/common.sh@336 -- # IFS=.-: 00:06:14.341 10:53:12 -- scripts/common.sh@336 -- # read -ra ver2 00:06:14.341 10:53:12 -- scripts/common.sh@337 -- # local 'op=<' 00:06:14.341 10:53:12 -- scripts/common.sh@339 -- # ver1_l=2 00:06:14.341 10:53:12 -- scripts/common.sh@340 -- # ver2_l=1 00:06:14.341 10:53:12 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:14.341 10:53:12 -- scripts/common.sh@343 -- # case "$op" in 00:06:14.341 10:53:12 -- scripts/common.sh@344 -- # : 1 00:06:14.341 10:53:12 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:14.341 10:53:12 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:14.341 10:53:12 -- scripts/common.sh@364 -- # decimal 1 00:06:14.341 10:53:12 -- scripts/common.sh@352 -- # local d=1 00:06:14.341 10:53:12 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:14.341 10:53:12 -- scripts/common.sh@354 -- # echo 1 00:06:14.341 10:53:12 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:14.341 10:53:12 -- scripts/common.sh@365 -- # decimal 2 00:06:14.341 10:53:12 -- scripts/common.sh@352 -- # local d=2 00:06:14.341 10:53:12 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:14.341 10:53:12 -- scripts/common.sh@354 -- # echo 2 00:06:14.341 10:53:12 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:14.341 10:53:12 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:14.341 10:53:12 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:14.341 10:53:12 -- scripts/common.sh@367 -- # return 0 00:06:14.341 10:53:12 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:14.341 10:53:12 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:14.341 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.341 --rc genhtml_branch_coverage=1 00:06:14.341 --rc genhtml_function_coverage=1 00:06:14.341 --rc genhtml_legend=1 00:06:14.341 --rc geninfo_all_blocks=1 00:06:14.341 --rc geninfo_unexecuted_blocks=1 00:06:14.341 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.341 ' 00:06:14.341 10:53:12 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:14.341 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.341 --rc genhtml_branch_coverage=1 00:06:14.341 --rc genhtml_function_coverage=1 00:06:14.341 --rc genhtml_legend=1 00:06:14.341 --rc geninfo_all_blocks=1 00:06:14.341 --rc geninfo_unexecuted_blocks=1 00:06:14.341 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.341 ' 00:06:14.341 10:53:12 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:14.341 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.341 --rc genhtml_branch_coverage=1 00:06:14.341 --rc genhtml_function_coverage=1 00:06:14.341 --rc genhtml_legend=1 00:06:14.341 --rc geninfo_all_blocks=1 00:06:14.341 --rc geninfo_unexecuted_blocks=1 00:06:14.341 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.341 ' 00:06:14.341 10:53:12 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:14.341 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.341 --rc genhtml_branch_coverage=1 00:06:14.341 --rc genhtml_function_coverage=1 00:06:14.341 --rc genhtml_legend=1 00:06:14.341 --rc geninfo_all_blocks=1 00:06:14.341 --rc geninfo_unexecuted_blocks=1 00:06:14.341 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.341 ' 00:06:14.341 10:53:12 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:14.341 10:53:12 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:14.341 10:53:12 -- scheduler/scheduler.sh@35 -- # scheduler_pid=631085 00:06:14.341 10:53:12 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:14.341 10:53:12 -- scheduler/scheduler.sh@37 -- # waitforlisten 631085 00:06:14.341 10:53:12 -- common/autotest_common.sh@829 -- # '[' -z 631085 ']' 00:06:14.341 10:53:12 -- common/autotest_common.sh@833 -- 
# local rpc_addr=/var/tmp/spdk.sock 00:06:14.341 10:53:12 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.341 10:53:12 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.341 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.341 10:53:12 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.341 10:53:12 -- common/autotest_common.sh@10 -- # set +x 00:06:14.341 [2024-12-16 10:53:12.876378] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:14.341 [2024-12-16 10:53:12.876430] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631085 ] 00:06:14.341 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.341 [2024-12-16 10:53:12.933564] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:14.601 [2024-12-16 10:53:12.971086] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.601 [2024-12-16 10:53:12.971172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:14.601 [2024-12-16 10:53:12.971256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:14.601 [2024-12-16 10:53:12.971257] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:14.601 10:53:13 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:14.601 10:53:13 -- common/autotest_common.sh@862 -- # return 0 00:06:14.601 10:53:13 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:14.601 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.601 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.601 POWER: Env isn't set yet! 00:06:14.601 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:14.601 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:14.601 POWER: Cannot set governor of lcore 0 to userspace 00:06:14.601 POWER: Attempting to initialise PSTAT power management... 
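(The POWER lines above and below record DPDK's power library probing its drivers in order: the ACPI cpufreq attempt fails because the governor cannot be switched to 'userspace', so it falls back to PSTAT and pins each lcore's scaling governor to 'performance' for the duration of the scheduler test. A minimal sketch, not part of the captured output, for inspecting the same sysfs state by hand; the core list 0-3 matches the 0xF mask this run passes to the scheduler app:

  for cpu in 0 1 2 3; do
    # scaling_governor is the sysfs file named in the failed-write message above
    gov=/sys/devices/system/cpu/cpu${cpu}/cpufreq/scaling_governor
    [ -r "$gov" ] && echo "lcore ${cpu}: $(cat "$gov")"
  done

The shutdown messages later in the log show the same governors being restored to 'powersave'.)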
00:06:14.601 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:14.601 POWER: Initialized successfully for lcore 0 power management 00:06:14.601 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:14.601 POWER: Initialized successfully for lcore 1 power management 00:06:14.601 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:14.601 POWER: Initialized successfully for lcore 2 power management 00:06:14.601 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:14.601 POWER: Initialized successfully for lcore 3 power management 00:06:14.601 [2024-12-16 10:53:13.100794] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:14.601 [2024-12-16 10:53:13.100810] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:14.601 [2024-12-16 10:53:13.100820] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:14.601 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.601 10:53:13 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:14.601 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.601 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.601 [2024-12-16 10:53:13.162869] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:06:14.601 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.601 10:53:13 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:14.601 10:53:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:14.601 10:53:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.601 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.601 ************************************ 00:06:14.601 START TEST scheduler_create_thread 00:06:14.601 ************************************ 00:06:14.601 10:53:13 -- common/autotest_common.sh@1114 -- # scheduler_create_thread 00:06:14.601 10:53:13 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:14.601 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.601 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.601 2 00:06:14.602 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.602 10:53:13 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:14.602 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.602 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.602 3 00:06:14.602 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.602 10:53:13 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:14.602 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.602 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.602 4 00:06:14.602 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.602 10:53:13 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:14.602 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.602 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.861 5 00:06:14.861 
10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.861 10:53:13 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:14.861 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.861 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.861 6 00:06:14.861 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.861 10:53:13 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:14.861 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.861 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.861 7 00:06:14.861 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.861 10:53:13 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:14.861 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.861 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.861 8 00:06:14.861 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.861 10:53:13 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:14.861 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.861 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.861 9 00:06:14.861 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.861 10:53:13 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:14.861 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.861 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.861 10 00:06:14.861 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.861 10:53:13 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:14.861 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.861 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:14.861 10:53:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:14.861 10:53:13 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:14.861 10:53:13 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:14.861 10:53:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:14.861 10:53:13 -- common/autotest_common.sh@10 -- # set +x 00:06:15.799 10:53:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.799 10:53:14 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:15.799 10:53:14 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.799 10:53:14 -- common/autotest_common.sh@10 -- # set +x 00:06:17.177 10:53:15 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:17.177 10:53:15 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:17.177 10:53:15 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:17.177 10:53:15 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:17.177 10:53:15 -- common/autotest_common.sh@10 -- # set +x 00:06:18.116 10:53:16 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:18.116 00:06:18.116 real 0m3.382s 00:06:18.116 user 0m0.024s 00:06:18.116 sys 0m0.008s 00:06:18.116 10:53:16 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.116 10:53:16 -- common/autotest_common.sh@10 -- # set +x 00:06:18.116 ************************************ 00:06:18.116 END TEST scheduler_create_thread 00:06:18.116 ************************************ 00:06:18.116 10:53:16 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:18.116 10:53:16 -- scheduler/scheduler.sh@46 -- # killprocess 631085 00:06:18.116 10:53:16 -- common/autotest_common.sh@936 -- # '[' -z 631085 ']' 00:06:18.116 10:53:16 -- common/autotest_common.sh@940 -- # kill -0 631085 00:06:18.116 10:53:16 -- common/autotest_common.sh@941 -- # uname 00:06:18.116 10:53:16 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:18.116 10:53:16 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 631085 00:06:18.116 10:53:16 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:18.116 10:53:16 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:18.116 10:53:16 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 631085' 00:06:18.116 killing process with pid 631085 00:06:18.116 10:53:16 -- common/autotest_common.sh@955 -- # kill 631085 00:06:18.116 10:53:16 -- common/autotest_common.sh@960 -- # wait 631085 00:06:18.375 [2024-12-16 10:53:16.934585] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:18.634 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:18.634 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:18.634 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:18.634 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:18.634 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:18.634 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:18.634 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:18.634 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:18.634 00:06:18.634 real 0m4.471s 00:06:18.634 user 0m7.942s 00:06:18.634 sys 0m0.390s 00:06:18.634 10:53:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.634 10:53:17 -- common/autotest_common.sh@10 -- # set +x 00:06:18.634 ************************************ 00:06:18.634 END TEST event_scheduler 00:06:18.634 ************************************ 00:06:18.634 10:53:17 -- event/event.sh@51 -- # modprobe -n nbd 00:06:18.634 10:53:17 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:18.634 10:53:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:18.634 10:53:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.634 10:53:17 -- common/autotest_common.sh@10 -- # set +x 00:06:18.634 ************************************ 00:06:18.634 START TEST app_repeat 00:06:18.634 ************************************ 00:06:18.634 10:53:17 -- common/autotest_common.sh@1114 -- # app_repeat_test 00:06:18.634 10:53:17 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:18.634 10:53:17 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:18.634 10:53:17 -- event/event.sh@13 -- # local nbd_list 00:06:18.634 10:53:17 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:18.634 10:53:17 -- 
event/event.sh@14 -- # local bdev_list 00:06:18.634 10:53:17 -- event/event.sh@15 -- # local repeat_times=4 00:06:18.634 10:53:17 -- event/event.sh@17 -- # modprobe nbd 00:06:18.634 10:53:17 -- event/event.sh@19 -- # repeat_pid=631811 00:06:18.634 10:53:17 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:18.634 10:53:17 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 631811' 00:06:18.634 Process app_repeat pid: 631811 00:06:18.634 10:53:17 -- event/event.sh@23 -- # for i in {0..2} 00:06:18.634 10:53:17 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:18.634 spdk_app_start Round 0 00:06:18.634 10:53:17 -- event/event.sh@25 -- # waitforlisten 631811 /var/tmp/spdk-nbd.sock 00:06:18.634 10:53:17 -- common/autotest_common.sh@829 -- # '[' -z 631811 ']' 00:06:18.634 10:53:17 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:18.634 10:53:17 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:18.634 10:53:17 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:18.634 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:18.634 10:53:17 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:18.634 10:53:17 -- common/autotest_common.sh@10 -- # set +x 00:06:18.634 10:53:17 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:18.634 [2024-12-16 10:53:17.223227] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:18.634 [2024-12-16 10:53:17.223328] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631811 ] 00:06:18.894 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.894 [2024-12-16 10:53:17.292393] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:18.894 [2024-12-16 10:53:17.330184] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:18.894 [2024-12-16 10:53:17.330187] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.462 10:53:18 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:19.462 10:53:18 -- common/autotest_common.sh@862 -- # return 0 00:06:19.462 10:53:18 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:19.721 Malloc0 00:06:19.721 10:53:18 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:19.981 Malloc1 00:06:19.981 10:53:18 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:19.981 10:53:18 -- 
bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@12 -- # local i 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:19.981 10:53:18 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:20.240 /dev/nbd0 00:06:20.240 10:53:18 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:20.240 10:53:18 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:20.240 10:53:18 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:20.240 10:53:18 -- common/autotest_common.sh@867 -- # local i 00:06:20.240 10:53:18 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:20.240 10:53:18 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:20.240 10:53:18 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:20.240 10:53:18 -- common/autotest_common.sh@871 -- # break 00:06:20.240 10:53:18 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:20.240 10:53:18 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:20.240 10:53:18 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:20.240 1+0 records in 00:06:20.240 1+0 records out 00:06:20.240 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217929 s, 18.8 MB/s 00:06:20.240 10:53:18 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:20.240 10:53:18 -- common/autotest_common.sh@884 -- # size=4096 00:06:20.240 10:53:18 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:20.240 10:53:18 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:20.240 10:53:18 -- common/autotest_common.sh@887 -- # return 0 00:06:20.240 10:53:18 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:20.240 10:53:18 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:20.240 10:53:18 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:20.240 /dev/nbd1 00:06:20.240 10:53:18 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:20.240 10:53:18 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:20.240 10:53:18 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:20.240 10:53:18 -- common/autotest_common.sh@867 -- # local i 00:06:20.240 10:53:18 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:20.240 10:53:18 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:20.240 10:53:18 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:20.240 10:53:18 -- common/autotest_common.sh@871 -- # break 00:06:20.240 10:53:18 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:20.240 10:53:18 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:20.240 10:53:18 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 
00:06:20.240 1+0 records in 00:06:20.240 1+0 records out 00:06:20.240 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248156 s, 16.5 MB/s 00:06:20.240 10:53:18 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:20.500 10:53:18 -- common/autotest_common.sh@884 -- # size=4096 00:06:20.500 10:53:18 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:20.500 10:53:18 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:20.500 10:53:18 -- common/autotest_common.sh@887 -- # return 0 00:06:20.500 10:53:18 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:20.500 10:53:18 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:20.500 10:53:18 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:20.500 10:53:18 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:20.500 10:53:18 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:20.500 { 00:06:20.500 "nbd_device": "/dev/nbd0", 00:06:20.500 "bdev_name": "Malloc0" 00:06:20.500 }, 00:06:20.500 { 00:06:20.500 "nbd_device": "/dev/nbd1", 00:06:20.500 "bdev_name": "Malloc1" 00:06:20.500 } 00:06:20.500 ]' 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:20.500 { 00:06:20.500 "nbd_device": "/dev/nbd0", 00:06:20.500 "bdev_name": "Malloc0" 00:06:20.500 }, 00:06:20.500 { 00:06:20.500 "nbd_device": "/dev/nbd1", 00:06:20.500 "bdev_name": "Malloc1" 00:06:20.500 } 00:06:20.500 ]' 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:20.500 /dev/nbd1' 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:20.500 /dev/nbd1' 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@65 -- # count=2 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@95 -- # count=2 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:20.500 256+0 records in 00:06:20.500 256+0 records out 00:06:20.500 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110768 s, 94.7 MB/s 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:20.500 10:53:19 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:20.760 256+0 records in 00:06:20.760 256+0 records out 00:06:20.760 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197628 s, 53.1 MB/s 00:06:20.760 10:53:19 -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:20.760 256+0 records in 00:06:20.760 256+0 records out 00:06:20.760 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209675 s, 50.0 MB/s 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@51 -- # local i 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:20.760 10:53:19 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@41 -- # break 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@45 -- # return 0 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@41 -- # break 00:06:21.020 10:53:19 -- 
bdev/nbd_common.sh@45 -- # return 0 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.020 10:53:19 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@65 -- # true 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@65 -- # count=0 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@104 -- # count=0 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:21.279 10:53:19 -- bdev/nbd_common.sh@109 -- # return 0 00:06:21.279 10:53:19 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:21.538 10:53:19 -- event/event.sh@35 -- # sleep 3 00:06:21.798 [2024-12-16 10:53:20.173038] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:21.798 [2024-12-16 10:53:20.206091] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:21.798 [2024-12-16 10:53:20.206092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.798 [2024-12-16 10:53:20.246868] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:21.798 [2024-12-16 10:53:20.246915] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:25.088 10:53:23 -- event/event.sh@23 -- # for i in {0..2} 00:06:25.088 10:53:23 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:25.088 spdk_app_start Round 1 00:06:25.088 10:53:23 -- event/event.sh@25 -- # waitforlisten 631811 /var/tmp/spdk-nbd.sock 00:06:25.088 10:53:23 -- common/autotest_common.sh@829 -- # '[' -z 631811 ']' 00:06:25.088 10:53:23 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:25.088 10:53:23 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.088 10:53:23 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:25.088 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:25.088 10:53:23 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.088 10:53:23 -- common/autotest_common.sh@10 -- # set +x 00:06:25.088 10:53:23 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:25.088 10:53:23 -- common/autotest_common.sh@862 -- # return 0 00:06:25.088 10:53:23 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:25.088 Malloc0 00:06:25.088 10:53:23 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:25.088 Malloc1 00:06:25.088 10:53:23 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@12 -- # local i 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:25.088 10:53:23 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:25.348 /dev/nbd0 00:06:25.348 10:53:23 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:25.348 10:53:23 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:25.348 10:53:23 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:25.348 10:53:23 -- common/autotest_common.sh@867 -- # local i 00:06:25.348 10:53:23 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:25.348 10:53:23 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:25.348 10:53:23 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:25.348 10:53:23 -- common/autotest_common.sh@871 -- # break 00:06:25.348 10:53:23 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:25.348 10:53:23 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:25.348 10:53:23 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:25.348 1+0 records in 00:06:25.348 1+0 records out 00:06:25.348 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225892 s, 18.1 MB/s 00:06:25.348 10:53:23 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:25.348 10:53:23 -- common/autotest_common.sh@884 -- # size=4096 00:06:25.348 10:53:23 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:25.348 10:53:23 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:25.348 10:53:23 -- common/autotest_common.sh@887 -- # return 0 00:06:25.348 10:53:23 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:25.348 10:53:23 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:25.348 10:53:23 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:25.348 /dev/nbd1 00:06:25.348 10:53:23 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:25.348 10:53:23 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:25.348 10:53:23 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:25.348 10:53:23 -- common/autotest_common.sh@867 -- # local i 00:06:25.348 10:53:23 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:25.348 10:53:23 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:25.348 10:53:23 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:25.607 10:53:23 -- common/autotest_common.sh@871 -- # break 00:06:25.607 10:53:23 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:25.607 10:53:23 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:25.607 10:53:23 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:25.607 1+0 records in 00:06:25.607 1+0 records out 00:06:25.607 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225114 s, 18.2 MB/s 00:06:25.607 10:53:23 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:25.607 10:53:23 -- common/autotest_common.sh@884 -- # size=4096 00:06:25.607 10:53:23 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:25.607 10:53:23 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:25.607 10:53:23 -- common/autotest_common.sh@887 -- # return 0 00:06:25.607 10:53:23 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:25.607 10:53:23 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:25.607 10:53:23 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:25.607 10:53:23 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.607 10:53:23 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:25.607 { 00:06:25.607 "nbd_device": "/dev/nbd0", 00:06:25.607 "bdev_name": "Malloc0" 00:06:25.607 }, 00:06:25.607 { 00:06:25.607 "nbd_device": "/dev/nbd1", 00:06:25.607 "bdev_name": "Malloc1" 00:06:25.607 } 00:06:25.607 ]' 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:25.607 { 00:06:25.607 "nbd_device": "/dev/nbd0", 00:06:25.607 "bdev_name": "Malloc0" 00:06:25.607 }, 00:06:25.607 { 00:06:25.607 "nbd_device": "/dev/nbd1", 00:06:25.607 "bdev_name": "Malloc1" 00:06:25.607 } 00:06:25.607 ]' 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:25.607 /dev/nbd1' 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:25.607 /dev/nbd1' 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@65 -- # count=2 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:25.607 10:53:24 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:25.607 256+0 records in 00:06:25.607 256+0 records out 00:06:25.607 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106447 s, 98.5 MB/s 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:25.607 10:53:24 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:25.867 256+0 records in 00:06:25.867 256+0 records out 00:06:25.867 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198611 s, 52.8 MB/s 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:25.867 256+0 records in 00:06:25.867 256+0 records out 00:06:25.867 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0208449 s, 50.3 MB/s 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@51 -- # local i 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:25.867 10:53:24 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@41 -- # break 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@45 -- # return 0 00:06:26.127 10:53:24 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@41 -- # break 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@45 -- # return 0 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:26.128 10:53:24 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@65 -- # true 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@65 -- # count=0 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@104 -- # count=0 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:26.387 10:53:24 -- bdev/nbd_common.sh@109 -- # return 0 00:06:26.387 10:53:24 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:26.653 10:53:25 -- event/event.sh@35 -- # sleep 3 00:06:26.917 [2024-12-16 10:53:25.304844] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:26.917 [2024-12-16 10:53:25.336866] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:26.917 [2024-12-16 10:53:25.336869] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.917 [2024-12-16 10:53:25.377737] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:26.917 [2024-12-16 10:53:25.377778] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
00:06:30.209 10:53:28 -- event/event.sh@23 -- # for i in {0..2} 00:06:30.209 10:53:28 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:30.209 spdk_app_start Round 2 00:06:30.209 10:53:28 -- event/event.sh@25 -- # waitforlisten 631811 /var/tmp/spdk-nbd.sock 00:06:30.209 10:53:28 -- common/autotest_common.sh@829 -- # '[' -z 631811 ']' 00:06:30.209 10:53:28 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:30.209 10:53:28 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:30.209 10:53:28 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:30.209 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:30.209 10:53:28 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:30.209 10:53:28 -- common/autotest_common.sh@10 -- # set +x 00:06:30.209 10:53:28 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:30.209 10:53:28 -- common/autotest_common.sh@862 -- # return 0 00:06:30.209 10:53:28 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:30.209 Malloc0 00:06:30.209 10:53:28 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:30.209 Malloc1 00:06:30.209 10:53:28 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@12 -- # local i 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.209 10:53:28 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:30.469 /dev/nbd0 00:06:30.469 10:53:28 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:30.469 10:53:28 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:30.469 10:53:28 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:30.469 10:53:28 -- common/autotest_common.sh@867 -- # local i 00:06:30.469 10:53:28 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:30.469 10:53:28 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:30.469 10:53:28 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:30.469 10:53:28 -- common/autotest_common.sh@871 -- # break 00:06:30.469 10:53:28 -- common/autotest_common.sh@882 -- # (( i 
= 1 )) 00:06:30.469 10:53:28 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:30.469 10:53:28 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:30.469 1+0 records in 00:06:30.469 1+0 records out 00:06:30.469 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021926 s, 18.7 MB/s 00:06:30.469 10:53:28 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:30.469 10:53:28 -- common/autotest_common.sh@884 -- # size=4096 00:06:30.469 10:53:28 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:30.469 10:53:28 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:30.469 10:53:28 -- common/autotest_common.sh@887 -- # return 0 00:06:30.469 10:53:28 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:30.469 10:53:28 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.469 10:53:28 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:30.469 /dev/nbd1 00:06:30.469 10:53:29 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:30.469 10:53:29 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:30.469 10:53:29 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:30.469 10:53:29 -- common/autotest_common.sh@867 -- # local i 00:06:30.728 10:53:29 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:30.728 10:53:29 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:30.728 10:53:29 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:30.728 10:53:29 -- common/autotest_common.sh@871 -- # break 00:06:30.728 10:53:29 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:30.728 10:53:29 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:30.728 10:53:29 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:30.728 1+0 records in 00:06:30.728 1+0 records out 00:06:30.728 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227187 s, 18.0 MB/s 00:06:30.728 10:53:29 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:30.728 10:53:29 -- common/autotest_common.sh@884 -- # size=4096 00:06:30.728 10:53:29 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:30.728 10:53:29 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:30.728 10:53:29 -- common/autotest_common.sh@887 -- # return 0 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:30.728 { 00:06:30.728 "nbd_device": "/dev/nbd0", 00:06:30.728 "bdev_name": "Malloc0" 00:06:30.728 }, 00:06:30.728 { 00:06:30.728 "nbd_device": "/dev/nbd1", 00:06:30.728 "bdev_name": "Malloc1" 00:06:30.728 } 00:06:30.728 ]' 00:06:30.728 10:53:29 -- 
bdev/nbd_common.sh@64 -- # echo '[ 00:06:30.728 { 00:06:30.728 "nbd_device": "/dev/nbd0", 00:06:30.728 "bdev_name": "Malloc0" 00:06:30.728 }, 00:06:30.728 { 00:06:30.728 "nbd_device": "/dev/nbd1", 00:06:30.728 "bdev_name": "Malloc1" 00:06:30.728 } 00:06:30.728 ]' 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:30.728 /dev/nbd1' 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:30.728 /dev/nbd1' 00:06:30.728 10:53:29 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@65 -- # count=2 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@95 -- # count=2 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:30.988 256+0 records in 00:06:30.988 256+0 records out 00:06:30.988 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106947 s, 98.0 MB/s 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:30.988 256+0 records in 00:06:30.988 256+0 records out 00:06:30.988 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200106 s, 52.4 MB/s 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:30.988 256+0 records in 00:06:30.988 256+0 records out 00:06:30.988 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211325 s, 49.6 MB/s 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 
/dev/nbd1 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@51 -- # local i 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:30.988 10:53:29 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@41 -- # break 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@45 -- # return 0 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@41 -- # break 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@45 -- # return 0 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.248 10:53:29 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@65 -- # true 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@65 -- # count=0 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@104 -- # count=0 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:31.507 10:53:30 -- bdev/nbd_common.sh@109 -- # return 0 00:06:31.507 10:53:30 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:31.767 10:53:30 -- event/event.sh@35 -- # sleep 3 00:06:32.026 [2024-12-16 10:53:30.425401] app.c: 
798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:32.026 [2024-12-16 10:53:30.458324] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:32.026 [2024-12-16 10:53:30.458327] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.026 [2024-12-16 10:53:30.499551] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:32.026 [2024-12-16 10:53:30.499593] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:35.319 10:53:33 -- event/event.sh@38 -- # waitforlisten 631811 /var/tmp/spdk-nbd.sock 00:06:35.319 10:53:33 -- common/autotest_common.sh@829 -- # '[' -z 631811 ']' 00:06:35.319 10:53:33 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:35.319 10:53:33 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:35.319 10:53:33 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:35.319 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:35.319 10:53:33 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:35.319 10:53:33 -- common/autotest_common.sh@10 -- # set +x 00:06:35.319 10:53:33 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.319 10:53:33 -- common/autotest_common.sh@862 -- # return 0 00:06:35.319 10:53:33 -- event/event.sh@39 -- # killprocess 631811 00:06:35.319 10:53:33 -- common/autotest_common.sh@936 -- # '[' -z 631811 ']' 00:06:35.319 10:53:33 -- common/autotest_common.sh@940 -- # kill -0 631811 00:06:35.319 10:53:33 -- common/autotest_common.sh@941 -- # uname 00:06:35.319 10:53:33 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:35.319 10:53:33 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 631811 00:06:35.319 10:53:33 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:35.319 10:53:33 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:35.319 10:53:33 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 631811' 00:06:35.319 killing process with pid 631811 00:06:35.319 10:53:33 -- common/autotest_common.sh@955 -- # kill 631811 00:06:35.319 10:53:33 -- common/autotest_common.sh@960 -- # wait 631811 00:06:35.319 spdk_app_start is called in Round 0. 00:06:35.319 Shutdown signal received, stop current app iteration 00:06:35.319 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:35.319 spdk_app_start is called in Round 1. 00:06:35.319 Shutdown signal received, stop current app iteration 00:06:35.319 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:35.319 spdk_app_start is called in Round 2. 00:06:35.319 Shutdown signal received, stop current app iteration 00:06:35.319 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:35.319 spdk_app_start is called in Round 3. 
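[Annotation] The dd/cmp sequence above is the harness helper nbd_dd_data_verify: a write pass copies a 1 MiB /dev/urandom pattern onto every exported NBD device with O_DIRECT, then a verify pass byte-compares each device against the same temp file before deleting it. A sketch reconstructed from the xtrace — function and variable names match the trace, but the temp-file path and error handling here are simplified assumptions:

  # write/verify helper as traced in bdev/nbd_common.sh -- reconstruction, not the exact source
  nbd_dd_data_verify() {
    local nbd_list=($1) operation=$2
    local tmp_file=/tmp/nbdrandtest   # the run above keeps this under the spdk test tree
    if [ "$operation" = write ]; then
      dd if=/dev/urandom of=$tmp_file bs=4096 count=256        # 1 MiB random pattern
      for i in "${nbd_list[@]}"; do
        dd if=$tmp_file of=$i bs=4096 count=256 oflag=direct   # push the pattern to each device
      done
    elif [ "$operation" = verify ]; then
      for i in "${nbd_list[@]}"; do
        cmp -b -n 1M $tmp_file $i                              # fail on the first differing byte
      done
      rm $tmp_file
    fi
  }

Invoked as in the trace: nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write, then the same list with verify.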
00:06:35.319 Shutdown signal received, stop current app iteration 00:06:35.319 10:53:33 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:35.319 10:53:33 -- event/event.sh@42 -- # return 0 00:06:35.319 00:06:35.319 real 0m16.449s 00:06:35.319 user 0m35.232s 00:06:35.319 sys 0m3.069s 00:06:35.319 10:53:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.319 10:53:33 -- common/autotest_common.sh@10 -- # set +x 00:06:35.319 ************************************ 00:06:35.319 END TEST app_repeat 00:06:35.319 ************************************ 00:06:35.319 10:53:33 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:35.319 10:53:33 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:35.319 10:53:33 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:35.319 10:53:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.319 10:53:33 -- common/autotest_common.sh@10 -- # set +x 00:06:35.319 ************************************ 00:06:35.319 START TEST cpu_locks 00:06:35.319 ************************************ 00:06:35.319 10:53:33 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:35.319 * Looking for test storage... 00:06:35.319 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:35.319 10:53:33 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:35.319 10:53:33 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:35.319 10:53:33 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:35.319 10:53:33 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:35.319 10:53:33 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:35.319 10:53:33 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:35.319 10:53:33 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:35.319 10:53:33 -- scripts/common.sh@335 -- # IFS=.-: 00:06:35.319 10:53:33 -- scripts/common.sh@335 -- # read -ra ver1 00:06:35.319 10:53:33 -- scripts/common.sh@336 -- # IFS=.-: 00:06:35.319 10:53:33 -- scripts/common.sh@336 -- # read -ra ver2 00:06:35.319 10:53:33 -- scripts/common.sh@337 -- # local 'op=<' 00:06:35.319 10:53:33 -- scripts/common.sh@339 -- # ver1_l=2 00:06:35.319 10:53:33 -- scripts/common.sh@340 -- # ver2_l=1 00:06:35.319 10:53:33 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:35.319 10:53:33 -- scripts/common.sh@343 -- # case "$op" in 00:06:35.319 10:53:33 -- scripts/common.sh@344 -- # : 1 00:06:35.319 10:53:33 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:35.319 10:53:33 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:35.319 10:53:33 -- scripts/common.sh@364 -- # decimal 1 00:06:35.319 10:53:33 -- scripts/common.sh@352 -- # local d=1 00:06:35.319 10:53:33 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:35.319 10:53:33 -- scripts/common.sh@354 -- # echo 1 00:06:35.319 10:53:33 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:35.319 10:53:33 -- scripts/common.sh@365 -- # decimal 2 00:06:35.319 10:53:33 -- scripts/common.sh@352 -- # local d=2 00:06:35.319 10:53:33 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:35.319 10:53:33 -- scripts/common.sh@354 -- # echo 2 00:06:35.319 10:53:33 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:35.319 10:53:33 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:35.319 10:53:33 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:35.319 10:53:33 -- scripts/common.sh@367 -- # return 0 00:06:35.319 10:53:33 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:35.319 10:53:33 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:35.319 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.319 --rc genhtml_branch_coverage=1 00:06:35.319 --rc genhtml_function_coverage=1 00:06:35.319 --rc genhtml_legend=1 00:06:35.319 --rc geninfo_all_blocks=1 00:06:35.319 --rc geninfo_unexecuted_blocks=1 00:06:35.319 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.319 ' 00:06:35.319 10:53:33 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:35.319 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.319 --rc genhtml_branch_coverage=1 00:06:35.319 --rc genhtml_function_coverage=1 00:06:35.319 --rc genhtml_legend=1 00:06:35.319 --rc geninfo_all_blocks=1 00:06:35.319 --rc geninfo_unexecuted_blocks=1 00:06:35.319 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.319 ' 00:06:35.319 10:53:33 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:35.319 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.319 --rc genhtml_branch_coverage=1 00:06:35.319 --rc genhtml_function_coverage=1 00:06:35.319 --rc genhtml_legend=1 00:06:35.319 --rc geninfo_all_blocks=1 00:06:35.319 --rc geninfo_unexecuted_blocks=1 00:06:35.319 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.319 ' 00:06:35.319 10:53:33 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:35.319 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.319 --rc genhtml_branch_coverage=1 00:06:35.319 --rc genhtml_function_coverage=1 00:06:35.319 --rc genhtml_legend=1 00:06:35.319 --rc geninfo_all_blocks=1 00:06:35.319 --rc geninfo_unexecuted_blocks=1 00:06:35.320 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.320 ' 00:06:35.320 10:53:33 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:35.320 10:53:33 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:35.320 10:53:33 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:35.320 10:53:33 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:35.320 10:53:33 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:35.320 10:53:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.320 10:53:33 -- common/autotest_common.sh@10 -- # set +x 00:06:35.320 ************************************ 00:06:35.320 START TEST default_locks 
00:06:35.320 ************************************ 00:06:35.320 10:53:33 -- common/autotest_common.sh@1114 -- # default_locks 00:06:35.320 10:53:33 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=634976 00:06:35.320 10:53:33 -- event/cpu_locks.sh@47 -- # waitforlisten 634976 00:06:35.320 10:53:33 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:35.320 10:53:33 -- common/autotest_common.sh@829 -- # '[' -z 634976 ']' 00:06:35.320 10:53:33 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.320 10:53:33 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:35.320 10:53:33 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.320 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:35.320 10:53:33 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:35.320 10:53:33 -- common/autotest_common.sh@10 -- # set +x 00:06:35.320 [2024-12-16 10:53:33.918934] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:35.320 [2024-12-16 10:53:33.919016] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid634976 ] 00:06:35.579 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.579 [2024-12-16 10:53:33.988815] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.579 [2024-12-16 10:53:34.024477] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:35.579 [2024-12-16 10:53:34.024606] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.147 10:53:34 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:36.147 10:53:34 -- common/autotest_common.sh@862 -- # return 0 00:06:36.147 10:53:34 -- event/cpu_locks.sh@49 -- # locks_exist 634976 00:06:36.147 10:53:34 -- event/cpu_locks.sh@22 -- # lslocks -p 634976 00:06:36.147 10:53:34 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:37.085 lslocks: write error 00:06:37.085 10:53:35 -- event/cpu_locks.sh@50 -- # killprocess 634976 00:06:37.085 10:53:35 -- common/autotest_common.sh@936 -- # '[' -z 634976 ']' 00:06:37.085 10:53:35 -- common/autotest_common.sh@940 -- # kill -0 634976 00:06:37.085 10:53:35 -- common/autotest_common.sh@941 -- # uname 00:06:37.085 10:53:35 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:37.085 10:53:35 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 634976 00:06:37.085 10:53:35 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:37.085 10:53:35 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:37.085 10:53:35 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 634976' 00:06:37.085 killing process with pid 634976 00:06:37.085 10:53:35 -- common/autotest_common.sh@955 -- # kill 634976 00:06:37.085 10:53:35 -- common/autotest_common.sh@960 -- # wait 634976 00:06:37.344 10:53:35 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 634976 00:06:37.344 10:53:35 -- common/autotest_common.sh@650 -- # local es=0 00:06:37.344 10:53:35 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 634976 00:06:37.344 10:53:35 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:37.344 10:53:35 -- common/autotest_common.sh@642 -- 
# case "$(type -t "$arg")" in 00:06:37.344 10:53:35 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:37.344 10:53:35 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:37.344 10:53:35 -- common/autotest_common.sh@653 -- # waitforlisten 634976 00:06:37.344 10:53:35 -- common/autotest_common.sh@829 -- # '[' -z 634976 ']' 00:06:37.344 10:53:35 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:37.344 10:53:35 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:37.344 10:53:35 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:37.344 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:37.344 10:53:35 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:37.344 10:53:35 -- common/autotest_common.sh@10 -- # set +x 00:06:37.344 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (634976) - No such process 00:06:37.344 ERROR: process (pid: 634976) is no longer running 00:06:37.344 10:53:35 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:37.344 10:53:35 -- common/autotest_common.sh@862 -- # return 1 00:06:37.344 10:53:35 -- common/autotest_common.sh@653 -- # es=1 00:06:37.345 10:53:35 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:37.345 10:53:35 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:37.345 10:53:35 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:37.345 10:53:35 -- event/cpu_locks.sh@54 -- # no_locks 00:06:37.345 10:53:35 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:37.345 10:53:35 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:37.345 10:53:35 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:37.345 00:06:37.345 real 0m1.838s 00:06:37.345 user 0m1.946s 00:06:37.345 sys 0m0.664s 00:06:37.345 10:53:35 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:37.345 10:53:35 -- common/autotest_common.sh@10 -- # set +x 00:06:37.345 ************************************ 00:06:37.345 END TEST default_locks 00:06:37.345 ************************************ 00:06:37.345 10:53:35 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:37.345 10:53:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:37.345 10:53:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.345 10:53:35 -- common/autotest_common.sh@10 -- # set +x 00:06:37.345 ************************************ 00:06:37.345 START TEST default_locks_via_rpc 00:06:37.345 ************************************ 00:06:37.345 10:53:35 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:06:37.345 10:53:35 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=635446 00:06:37.345 10:53:35 -- event/cpu_locks.sh@63 -- # waitforlisten 635446 00:06:37.345 10:53:35 -- common/autotest_common.sh@829 -- # '[' -z 635446 ']' 00:06:37.345 10:53:35 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:37.345 10:53:35 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:37.345 10:53:35 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:37.345 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:37.345 10:53:35 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:37.345 10:53:35 -- common/autotest_common.sh@10 -- # set +x 00:06:37.345 10:53:35 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:37.345 [2024-12-16 10:53:35.801739] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:37.345 [2024-12-16 10:53:35.801835] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid635446 ] 00:06:37.345 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.345 [2024-12-16 10:53:35.868506] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.345 [2024-12-16 10:53:35.905660] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:37.345 [2024-12-16 10:53:35.905769] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.283 10:53:36 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:38.283 10:53:36 -- common/autotest_common.sh@862 -- # return 0 00:06:38.283 10:53:36 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:38.283 10:53:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:38.283 10:53:36 -- common/autotest_common.sh@10 -- # set +x 00:06:38.283 10:53:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:38.283 10:53:36 -- event/cpu_locks.sh@67 -- # no_locks 00:06:38.283 10:53:36 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:38.283 10:53:36 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:38.283 10:53:36 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:38.283 10:53:36 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:38.283 10:53:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:38.283 10:53:36 -- common/autotest_common.sh@10 -- # set +x 00:06:38.283 10:53:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:38.283 10:53:36 -- event/cpu_locks.sh@71 -- # locks_exist 635446 00:06:38.283 10:53:36 -- event/cpu_locks.sh@22 -- # lslocks -p 635446 00:06:38.283 10:53:36 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:38.283 10:53:36 -- event/cpu_locks.sh@73 -- # killprocess 635446 00:06:38.283 10:53:36 -- common/autotest_common.sh@936 -- # '[' -z 635446 ']' 00:06:38.283 10:53:36 -- common/autotest_common.sh@940 -- # kill -0 635446 00:06:38.283 10:53:36 -- common/autotest_common.sh@941 -- # uname 00:06:38.283 10:53:36 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:38.283 10:53:36 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 635446 00:06:38.542 10:53:36 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:38.542 10:53:36 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:38.542 10:53:36 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 635446' 00:06:38.542 killing process with pid 635446 00:06:38.542 10:53:36 -- common/autotest_common.sh@955 -- # kill 635446 00:06:38.542 10:53:36 -- common/autotest_common.sh@960 -- # wait 635446 00:06:38.802 00:06:38.802 real 0m1.441s 00:06:38.802 user 0m1.530s 00:06:38.802 sys 0m0.469s 00:06:38.802 10:53:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.802 10:53:37 -- common/autotest_common.sh@10 -- # set +x 00:06:38.802 ************************************ 00:06:38.802 END TEST 
default_locks_via_rpc 00:06:38.802 ************************************ 00:06:38.802 10:53:37 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:38.802 10:53:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:38.802 10:53:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.802 10:53:37 -- common/autotest_common.sh@10 -- # set +x 00:06:38.802 ************************************ 00:06:38.802 START TEST non_locking_app_on_locked_coremask 00:06:38.802 ************************************ 00:06:38.802 10:53:37 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:06:38.802 10:53:37 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=635745 00:06:38.802 10:53:37 -- event/cpu_locks.sh@81 -- # waitforlisten 635745 /var/tmp/spdk.sock 00:06:38.802 10:53:37 -- common/autotest_common.sh@829 -- # '[' -z 635745 ']' 00:06:38.802 10:53:37 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:38.802 10:53:37 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:38.802 10:53:37 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:38.802 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:38.802 10:53:37 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:38.802 10:53:37 -- common/autotest_common.sh@10 -- # set +x 00:06:38.802 10:53:37 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:38.802 [2024-12-16 10:53:37.288832] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:38.802 [2024-12-16 10:53:37.288919] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid635745 ] 00:06:38.802 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.802 [2024-12-16 10:53:37.355195] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.802 [2024-12-16 10:53:37.392707] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:38.802 [2024-12-16 10:53:37.392831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.740 10:53:38 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:39.740 10:53:38 -- common/autotest_common.sh@862 -- # return 0 00:06:39.740 10:53:38 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=635762 00:06:39.740 10:53:38 -- event/cpu_locks.sh@85 -- # waitforlisten 635762 /var/tmp/spdk2.sock 00:06:39.740 10:53:38 -- common/autotest_common.sh@829 -- # '[' -z 635762 ']' 00:06:39.740 10:53:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:39.740 10:53:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:39.740 10:53:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:39.740 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
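[Annotation] default_locks_via_rpc, which just finished above, drives the same core-lock machinery over JSON-RPC instead of process flags: framework_disable_cpumask_locks releases a running target's core lock files and framework_enable_cpumask_locks re-claims them, with locks_exist checking the filesystem after each call. Outside the rpc_cmd wrapper the calls look like this (method names taken from the trace; socket path as used in this run):

  # toggle core-mask locking on a live target
  scripts/rpc.py -s /var/tmp/spdk.sock framework_disable_cpumask_locks
  scripts/rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks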
00:06:39.740 10:53:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:39.740 10:53:38 -- common/autotest_common.sh@10 -- # set +x 00:06:39.740 10:53:38 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:39.740 [2024-12-16 10:53:38.139720] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:39.740 [2024-12-16 10:53:38.139781] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid635762 ] 00:06:39.740 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.740 [2024-12-16 10:53:38.226077] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:39.740 [2024-12-16 10:53:38.226100] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.740 [2024-12-16 10:53:38.297957] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:39.740 [2024-12-16 10:53:38.298067] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.675 10:53:38 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:40.675 10:53:38 -- common/autotest_common.sh@862 -- # return 0 00:06:40.675 10:53:38 -- event/cpu_locks.sh@87 -- # locks_exist 635745 00:06:40.675 10:53:38 -- event/cpu_locks.sh@22 -- # lslocks -p 635745 00:06:40.675 10:53:38 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:41.244 lslocks: write error 00:06:41.244 10:53:39 -- event/cpu_locks.sh@89 -- # killprocess 635745 00:06:41.244 10:53:39 -- common/autotest_common.sh@936 -- # '[' -z 635745 ']' 00:06:41.244 10:53:39 -- common/autotest_common.sh@940 -- # kill -0 635745 00:06:41.244 10:53:39 -- common/autotest_common.sh@941 -- # uname 00:06:41.244 10:53:39 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:41.244 10:53:39 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 635745 00:06:41.503 10:53:39 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:41.503 10:53:39 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:41.503 10:53:39 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 635745' 00:06:41.503 killing process with pid 635745 00:06:41.503 10:53:39 -- common/autotest_common.sh@955 -- # kill 635745 00:06:41.503 10:53:39 -- common/autotest_common.sh@960 -- # wait 635745 00:06:42.073 10:53:40 -- event/cpu_locks.sh@90 -- # killprocess 635762 00:06:42.073 10:53:40 -- common/autotest_common.sh@936 -- # '[' -z 635762 ']' 00:06:42.073 10:53:40 -- common/autotest_common.sh@940 -- # kill -0 635762 00:06:42.073 10:53:40 -- common/autotest_common.sh@941 -- # uname 00:06:42.073 10:53:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:42.073 10:53:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 635762 00:06:42.073 10:53:40 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:42.073 10:53:40 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:42.073 10:53:40 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 635762' 00:06:42.073 killing process with pid 635762 00:06:42.073 10:53:40 -- common/autotest_common.sh@955 -- # kill 635762 00:06:42.073 10:53:40 -- common/autotest_common.sh@960 -- # wait 635762 00:06:42.332 00:06:42.332 real 0m3.577s 00:06:42.332 user 0m3.802s 00:06:42.332 sys 
0m1.197s 00:06:42.332 10:53:40 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:42.332 10:53:40 -- common/autotest_common.sh@10 -- # set +x 00:06:42.332 ************************************ 00:06:42.332 END TEST non_locking_app_on_locked_coremask 00:06:42.332 ************************************ 00:06:42.332 10:53:40 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:42.332 10:53:40 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:42.332 10:53:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:42.332 10:53:40 -- common/autotest_common.sh@10 -- # set +x 00:06:42.332 ************************************ 00:06:42.332 START TEST locking_app_on_unlocked_coremask 00:06:42.332 ************************************ 00:06:42.332 10:53:40 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:06:42.332 10:53:40 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=636339 00:06:42.332 10:53:40 -- event/cpu_locks.sh@99 -- # waitforlisten 636339 /var/tmp/spdk.sock 00:06:42.332 10:53:40 -- common/autotest_common.sh@829 -- # '[' -z 636339 ']' 00:06:42.332 10:53:40 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:42.332 10:53:40 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:42.332 10:53:40 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:42.332 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:42.332 10:53:40 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:42.332 10:53:40 -- common/autotest_common.sh@10 -- # set +x 00:06:42.332 10:53:40 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:42.332 [2024-12-16 10:53:40.911592] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:42.332 [2024-12-16 10:53:40.911664] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid636339 ] 00:06:42.332 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.591 [2024-12-16 10:53:40.977484] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:42.591 [2024-12-16 10:53:40.977508] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.591 [2024-12-16 10:53:41.014373] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:42.591 [2024-12-16 10:53:41.014477] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.160 10:53:41 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:43.160 10:53:41 -- common/autotest_common.sh@862 -- # return 0 00:06:43.160 10:53:41 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=636544 00:06:43.160 10:53:41 -- event/cpu_locks.sh@103 -- # waitforlisten 636544 /var/tmp/spdk2.sock 00:06:43.160 10:53:41 -- common/autotest_common.sh@829 -- # '[' -z 636544 ']' 00:06:43.160 10:53:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:43.160 10:53:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:43.160 10:53:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
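[Annotation] The two tests around this point each run a pair of spdk_tgt instances on the same core 0. That only works because one side passes --disable-cpumask-locks, and the second instance always gets its own RPC socket via -r so the harness can address both targets independently. The launch pattern as traced (paths shortened; backgrounding assumed):

  # one locking and one non-locking target sharing core 0
  build/bin/spdk_tgt -m 0x1 &                                              # claims the core 0 lock file
  build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &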
00:06:43.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:43.160 10:53:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:43.160 10:53:41 -- common/autotest_common.sh@10 -- # set +x 00:06:43.160 10:53:41 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:43.160 [2024-12-16 10:53:41.762261] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:43.160 [2024-12-16 10:53:41.762332] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid636544 ] 00:06:43.420 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.420 [2024-12-16 10:53:41.853151] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.420 [2024-12-16 10:53:41.930559] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:43.420 [2024-12-16 10:53:41.930669] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.989 10:53:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:43.989 10:53:42 -- common/autotest_common.sh@862 -- # return 0 00:06:43.989 10:53:42 -- event/cpu_locks.sh@105 -- # locks_exist 636544 00:06:43.989 10:53:42 -- event/cpu_locks.sh@22 -- # lslocks -p 636544 00:06:43.989 10:53:42 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:44.558 lslocks: write error 00:06:44.558 10:53:43 -- event/cpu_locks.sh@107 -- # killprocess 636339 00:06:44.558 10:53:43 -- common/autotest_common.sh@936 -- # '[' -z 636339 ']' 00:06:44.559 10:53:43 -- common/autotest_common.sh@940 -- # kill -0 636339 00:06:44.559 10:53:43 -- common/autotest_common.sh@941 -- # uname 00:06:44.559 10:53:43 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:44.817 10:53:43 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 636339 00:06:44.817 10:53:43 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:44.817 10:53:43 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:44.817 10:53:43 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 636339' 00:06:44.817 killing process with pid 636339 00:06:44.817 10:53:43 -- common/autotest_common.sh@955 -- # kill 636339 00:06:44.817 10:53:43 -- common/autotest_common.sh@960 -- # wait 636339 00:06:45.385 10:53:43 -- event/cpu_locks.sh@108 -- # killprocess 636544 00:06:45.385 10:53:43 -- common/autotest_common.sh@936 -- # '[' -z 636544 ']' 00:06:45.385 10:53:43 -- common/autotest_common.sh@940 -- # kill -0 636544 00:06:45.385 10:53:43 -- common/autotest_common.sh@941 -- # uname 00:06:45.385 10:53:43 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:45.385 10:53:43 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 636544 00:06:45.385 10:53:43 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:45.385 10:53:43 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:45.385 10:53:43 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 636544' 00:06:45.385 killing process with pid 636544 00:06:45.385 10:53:43 -- common/autotest_common.sh@955 -- # kill 636544 00:06:45.385 10:53:43 -- common/autotest_common.sh@960 -- # wait 636544 00:06:45.645 00:06:45.645 real 0m3.284s 00:06:45.645 user 0m3.550s 00:06:45.645 sys 0m1.046s 00:06:45.645 10:53:44 
-- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:45.645 10:53:44 -- common/autotest_common.sh@10 -- # set +x 00:06:45.645 ************************************ 00:06:45.645 END TEST locking_app_on_unlocked_coremask 00:06:45.645 ************************************ 00:06:45.645 10:53:44 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:45.645 10:53:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:45.645 10:53:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:45.645 10:53:44 -- common/autotest_common.sh@10 -- # set +x 00:06:45.645 ************************************ 00:06:45.645 START TEST locking_app_on_locked_coremask 00:06:45.645 ************************************ 00:06:45.645 10:53:44 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:06:45.645 10:53:44 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=636918 00:06:45.645 10:53:44 -- event/cpu_locks.sh@116 -- # waitforlisten 636918 /var/tmp/spdk.sock 00:06:45.645 10:53:44 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:45.645 10:53:44 -- common/autotest_common.sh@829 -- # '[' -z 636918 ']' 00:06:45.645 10:53:44 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:45.645 10:53:44 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:45.645 10:53:44 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:45.645 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:45.645 10:53:44 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:45.645 10:53:44 -- common/autotest_common.sh@10 -- # set +x 00:06:45.645 [2024-12-16 10:53:44.247211] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
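[Annotation] waitforlisten, just invoked for pid 636918 and after every launch in this section, polls until the target's UNIX-domain RPC socket comes up (max_retries=100 in the trace) and fails early if the process dies first. A simplified shape — the real helper in test/common/autotest_common.sh also probes the RPC layer itself, which is omitted here:

  # simplified waitforlisten: poll for the socket, bail out if the target exits
  waitforlisten() {
    local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100
    echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
    for ((i = 0; i < max_retries; i++)); do
      if ! kill -0 "$pid" 2> /dev/null; then
        return 1                       # target died -> fail the wait
      fi
      if [ -S "$rpc_addr" ]; then
        return 0                       # socket exists -> target is up
      fi
      sleep 0.1
    done
    return 1
  }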
00:06:45.645 [2024-12-16 10:53:44.247286] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid636918 ] 00:06:45.905 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.905 [2024-12-16 10:53:44.314416] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.905 [2024-12-16 10:53:44.347391] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:45.905 [2024-12-16 10:53:44.347505] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.473 10:53:45 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:46.473 10:53:45 -- common/autotest_common.sh@862 -- # return 0 00:06:46.473 10:53:45 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=637181 00:06:46.473 10:53:45 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 637181 /var/tmp/spdk2.sock 00:06:46.473 10:53:45 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:46.473 10:53:45 -- common/autotest_common.sh@650 -- # local es=0 00:06:46.473 10:53:45 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 637181 /var/tmp/spdk2.sock 00:06:46.473 10:53:45 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:46.473 10:53:45 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:46.473 10:53:45 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:46.473 10:53:45 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:46.473 10:53:45 -- common/autotest_common.sh@653 -- # waitforlisten 637181 /var/tmp/spdk2.sock 00:06:46.473 10:53:45 -- common/autotest_common.sh@829 -- # '[' -z 637181 ']' 00:06:46.473 10:53:45 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:46.473 10:53:45 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:46.473 10:53:45 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:46.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:46.473 10:53:45 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:46.473 10:53:45 -- common/autotest_common.sh@10 -- # set +x 00:06:46.738 [2024-12-16 10:53:45.098818] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:46.738 [2024-12-16 10:53:45.098882] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637181 ] 00:06:46.738 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.738 [2024-12-16 10:53:45.187858] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 636918 has claimed it. 00:06:46.738 [2024-12-16 10:53:45.187896] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
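[Annotation] The failed second launch above sits inside the harness helper NOT, which inverts an exit status so that an expected failure counts as a pass — the es=1 bookkeeping that follows in the trace is this helper unwinding. A simplified sketch (the traced version also special-cases signal exits via the (( es > 128 )) check, omitted here):

  # NOT: assert that the wrapped command fails
  NOT() {
    local es=0
    "$@" || es=$?
    (( es != 0 ))   # success only when the command returned non-zero
  }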
00:06:47.310 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (637181) - No such process 00:06:47.310 ERROR: process (pid: 637181) is no longer running 00:06:47.310 10:53:45 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:47.310 10:53:45 -- common/autotest_common.sh@862 -- # return 1 00:06:47.310 10:53:45 -- common/autotest_common.sh@653 -- # es=1 00:06:47.310 10:53:45 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:47.310 10:53:45 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:47.310 10:53:45 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:47.310 10:53:45 -- event/cpu_locks.sh@122 -- # locks_exist 636918 00:06:47.310 10:53:45 -- event/cpu_locks.sh@22 -- # lslocks -p 636918 00:06:47.310 10:53:45 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:47.879 lslocks: write error 00:06:47.879 10:53:46 -- event/cpu_locks.sh@124 -- # killprocess 636918 00:06:47.879 10:53:46 -- common/autotest_common.sh@936 -- # '[' -z 636918 ']' 00:06:47.879 10:53:46 -- common/autotest_common.sh@940 -- # kill -0 636918 00:06:47.879 10:53:46 -- common/autotest_common.sh@941 -- # uname 00:06:47.879 10:53:46 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:47.879 10:53:46 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 636918 00:06:47.879 10:53:46 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:47.879 10:53:46 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:47.879 10:53:46 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 636918' 00:06:47.879 killing process with pid 636918 00:06:47.879 10:53:46 -- common/autotest_common.sh@955 -- # kill 636918 00:06:47.879 10:53:46 -- common/autotest_common.sh@960 -- # wait 636918 00:06:48.139 00:06:48.139 real 0m2.446s 00:06:48.139 user 0m2.682s 00:06:48.139 sys 0m0.728s 00:06:48.139 10:53:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:48.139 10:53:46 -- common/autotest_common.sh@10 -- # set +x 00:06:48.139 ************************************ 00:06:48.139 END TEST locking_app_on_locked_coremask 00:06:48.139 ************************************ 00:06:48.139 10:53:46 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:48.139 10:53:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:48.139 10:53:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:48.139 10:53:46 -- common/autotest_common.sh@10 -- # set +x 00:06:48.139 ************************************ 00:06:48.139 START TEST locking_overlapped_coremask 00:06:48.139 ************************************ 00:06:48.139 10:53:46 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:06:48.139 10:53:46 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=637486 00:06:48.139 10:53:46 -- event/cpu_locks.sh@133 -- # waitforlisten 637486 /var/tmp/spdk.sock 00:06:48.139 10:53:46 -- common/autotest_common.sh@829 -- # '[' -z 637486 ']' 00:06:48.139 10:53:46 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:48.139 10:53:46 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:48.139 10:53:46 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:48.139 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
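[Annotation] killprocess, which just reaped pid 636918, guards its kill: the pid must still be alive, its command name is looked up with ps (reactor_0 for an SPDK target, as echoed above), anything running as sudo is refused, and the helper waits for the process so the next test starts clean. A sketch following the traced checks:

  # validate-then-terminate, mirroring the xtraced killprocess
  killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1
    kill -0 "$pid" || return 1                               # must still be running
    local process_name=$(ps --no-headers -o comm= "$pid")    # Linux path, as in this run
    if [ "$process_name" = sudo ]; then
      return 1                                               # never kill a sudo wrapper
    fi
    echo "killing process with pid $pid"
    kill "$pid" && wait "$pid"
  }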
00:06:48.139 10:53:46 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:48.139 10:53:46 -- common/autotest_common.sh@10 -- # set +x 00:06:48.139 10:53:46 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:48.139 [2024-12-16 10:53:46.739568] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:48.139 [2024-12-16 10:53:46.739661] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637486 ] 00:06:48.398 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.398 [2024-12-16 10:53:46.807655] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:48.398 [2024-12-16 10:53:46.846320] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:48.398 [2024-12-16 10:53:46.846474] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:48.398 [2024-12-16 10:53:46.846586] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.398 [2024-12-16 10:53:46.846586] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:48.967 10:53:47 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:48.967 10:53:47 -- common/autotest_common.sh@862 -- # return 0 00:06:48.967 10:53:47 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=637598 00:06:48.967 10:53:47 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 637598 /var/tmp/spdk2.sock 00:06:48.967 10:53:47 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:48.967 10:53:47 -- common/autotest_common.sh@650 -- # local es=0 00:06:48.967 10:53:47 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 637598 /var/tmp/spdk2.sock 00:06:48.967 10:53:47 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:48.967 10:53:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:48.967 10:53:47 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:48.967 10:53:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:48.967 10:53:47 -- common/autotest_common.sh@653 -- # waitforlisten 637598 /var/tmp/spdk2.sock 00:06:48.967 10:53:47 -- common/autotest_common.sh@829 -- # '[' -z 637598 ']' 00:06:48.967 10:53:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:48.967 10:53:47 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:48.967 10:53:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:48.967 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:48.967 10:53:47 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:48.967 10:53:47 -- common/autotest_common.sh@10 -- # set +x 00:06:49.227 [2024-12-16 10:53:47.602215] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
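[Annotation] locking_overlapped_coremask pits a first target on -m 0x7 (cores 0-2, the three reactors above) against a second on -m 0x1c (cores 2-4). The masks intersect on exactly one core — 0x7 & 0x1c = 0x4, i.e. core 2 — which is the core named in the claim failure that follows. Checked quickly:

  # the single-core overlap the test is built around
  printf 'overlap: 0x%x\n' $(( 0x7 & 0x1c ))   # prints: overlap: 0x4 (core 2)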
00:06:49.227 [2024-12-16 10:53:47.602304] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637598 ] 00:06:49.227 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.227 [2024-12-16 10:53:47.697075] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 637486 has claimed it. 00:06:49.227 [2024-12-16 10:53:47.697114] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:49.795 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (637598) - No such process 00:06:49.795 ERROR: process (pid: 637598) is no longer running 00:06:49.795 10:53:48 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:49.795 10:53:48 -- common/autotest_common.sh@862 -- # return 1 00:06:49.795 10:53:48 -- common/autotest_common.sh@653 -- # es=1 00:06:49.795 10:53:48 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:49.795 10:53:48 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:49.795 10:53:48 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:49.795 10:53:48 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:49.795 10:53:48 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:49.795 10:53:48 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:49.795 10:53:48 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:49.795 10:53:48 -- event/cpu_locks.sh@141 -- # killprocess 637486 00:06:49.795 10:53:48 -- common/autotest_common.sh@936 -- # '[' -z 637486 ']' 00:06:49.795 10:53:48 -- common/autotest_common.sh@940 -- # kill -0 637486 00:06:49.795 10:53:48 -- common/autotest_common.sh@941 -- # uname 00:06:49.795 10:53:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:49.795 10:53:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 637486 00:06:49.795 10:53:48 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:49.795 10:53:48 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:49.795 10:53:48 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 637486' 00:06:49.795 killing process with pid 637486 00:06:49.795 10:53:48 -- common/autotest_common.sh@955 -- # kill 637486 00:06:49.795 10:53:48 -- common/autotest_common.sh@960 -- # wait 637486 00:06:50.055 00:06:50.055 real 0m1.908s 00:06:50.055 user 0m5.500s 00:06:50.055 sys 0m0.458s 00:06:50.055 10:53:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:50.055 10:53:48 -- common/autotest_common.sh@10 -- # set +x 00:06:50.055 ************************************ 00:06:50.055 END TEST locking_overlapped_coremask 00:06:50.055 ************************************ 00:06:50.055 10:53:48 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:50.055 10:53:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:50.055 10:53:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:50.055 10:53:48 -- common/autotest_common.sh@10 -- # set +x 00:06:50.055 ************************************ 00:06:50.055 START TEST 
locking_overlapped_coremask_via_rpc 00:06:50.055 ************************************ 00:06:50.055 10:53:48 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:06:50.055 10:53:48 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=637794 00:06:50.055 10:53:48 -- event/cpu_locks.sh@149 -- # waitforlisten 637794 /var/tmp/spdk.sock 00:06:50.055 10:53:48 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:50.055 10:53:48 -- common/autotest_common.sh@829 -- # '[' -z 637794 ']' 00:06:50.055 10:53:48 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.055 10:53:48 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:50.055 10:53:48 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.055 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:50.055 10:53:48 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:50.055 10:53:48 -- common/autotest_common.sh@10 -- # set +x 00:06:50.315 [2024-12-16 10:53:48.694333] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:50.315 [2024-12-16 10:53:48.694398] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637794 ] 00:06:50.315 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.315 [2024-12-16 10:53:48.760661] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:50.315 [2024-12-16 10:53:48.760687] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:50.315 [2024-12-16 10:53:48.799458] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:50.315 [2024-12-16 10:53:48.799594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:50.315 [2024-12-16 10:53:48.799689] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:50.315 [2024-12-16 10:53:48.799691] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.253 10:53:49 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:51.253 10:53:49 -- common/autotest_common.sh@862 -- # return 0 00:06:51.253 10:53:49 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=638061 00:06:51.253 10:53:49 -- event/cpu_locks.sh@153 -- # waitforlisten 638061 /var/tmp/spdk2.sock 00:06:51.253 10:53:49 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:51.253 10:53:49 -- common/autotest_common.sh@829 -- # '[' -z 638061 ']' 00:06:51.253 10:53:49 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:51.253 10:53:49 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:51.253 10:53:49 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:51.253 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:51.253 10:53:49 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:51.253 10:53:49 -- common/autotest_common.sh@10 -- # set +x 00:06:51.253 [2024-12-16 10:53:49.568480] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
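[Annotation] After the overlapped launch fails, check_remaining_locks (traced in the previous test above) asserts that the surviving target still holds exactly the lock files for its mask: it globs /var/tmp/spdk_cpu_lock_* and string-compares the result against the brace expansion for cores 000-002. As it appears in the trace:

  # exactly three lock files must remain for a surviving -m 0x7 target
  check_remaining_locks() {
    locks=(/var/tmp/spdk_cpu_lock_*)
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
    [[ ${locks[*]} == "${locks_expected[*]}" ]]
  }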
00:06:51.253 [2024-12-16 10:53:49.568566] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638061 ] 00:06:51.253 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.253 [2024-12-16 10:53:49.661986] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:51.253 [2024-12-16 10:53:49.662014] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:51.253 [2024-12-16 10:53:49.736321] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:51.253 [2024-12-16 10:53:49.736508] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:51.253 [2024-12-16 10:53:49.739658] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:51.253 [2024-12-16 10:53:49.739660] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:51.823 10:53:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:51.823 10:53:50 -- common/autotest_common.sh@862 -- # return 0 00:06:51.823 10:53:50 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:51.823 10:53:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.823 10:53:50 -- common/autotest_common.sh@10 -- # set +x 00:06:51.823 10:53:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.823 10:53:50 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:51.823 10:53:50 -- common/autotest_common.sh@650 -- # local es=0 00:06:51.823 10:53:50 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:51.823 10:53:50 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:51.823 10:53:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:51.823 10:53:50 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:51.823 10:53:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:51.823 10:53:50 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:51.823 10:53:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.823 10:53:50 -- common/autotest_common.sh@10 -- # set +x 00:06:51.823 [2024-12-16 10:53:50.443670] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 637794 has claimed it. 
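[Annotation] rpc_cmd, used for the framework_enable_cpumask_locks calls in this test, is a thin wrapper that forwards its arguments to scripts/rpc.py against the given -s socket (the wrapper's behavior is inferred from the trace, not quoted from source). The traced assertion — NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks — therefore boils down to expecting this direct call to fail, as the JSON-RPC exchange just below shows:

  # direct equivalent of the traced rpc_cmd invocation
  scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks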
00:06:52.083 request: 00:06:52.083 { 00:06:52.083 "method": "framework_enable_cpumask_locks", 00:06:52.083 "req_id": 1 00:06:52.083 } 00:06:52.083 Got JSON-RPC error response 00:06:52.083 response: 00:06:52.083 { 00:06:52.083 "code": -32603, 00:06:52.083 "message": "Failed to claim CPU core: 2" 00:06:52.083 } 00:06:52.083 10:53:50 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:52.083 10:53:50 -- common/autotest_common.sh@653 -- # es=1 00:06:52.083 10:53:50 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:52.083 10:53:50 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:52.083 10:53:50 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:52.083 10:53:50 -- event/cpu_locks.sh@158 -- # waitforlisten 637794 /var/tmp/spdk.sock 00:06:52.083 10:53:50 -- common/autotest_common.sh@829 -- # '[' -z 637794 ']' 00:06:52.083 10:53:50 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:52.083 10:53:50 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:52.083 10:53:50 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:52.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:52.083 10:53:50 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:52.083 10:53:50 -- common/autotest_common.sh@10 -- # set +x 00:06:52.083 10:53:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:52.083 10:53:50 -- common/autotest_common.sh@862 -- # return 0 00:06:52.083 10:53:50 -- event/cpu_locks.sh@159 -- # waitforlisten 638061 /var/tmp/spdk2.sock 00:06:52.083 10:53:50 -- common/autotest_common.sh@829 -- # '[' -z 638061 ']' 00:06:52.083 10:53:50 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:52.083 10:53:50 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:52.083 10:53:50 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:52.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
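The failure surfaces as a standard JSON-RPC 2.0 error envelope: the harness dumps the request (its req_id field is the wire protocol's id) and the response carries an error object whose code -32603 is the spec's generic internal-error code, here wrapping the core-claim failure. A sketch of the same exchange done by hand against the second target's socket, assuming socat is available and that the server accepts a single raw JSON object on the stream (the harness goes through rpc.py instead):

  # Hypothetical raw request; field names follow JSON-RPC 2.0, not the dump above.
  echo '{"jsonrpc": "2.0", "method": "framework_enable_cpumask_locks", "id": 1}' \
      | socat - UNIX-CONNECT:/var/tmp/spdk2.sock
  # Expected reply, matching the logged error: {"jsonrpc": "2.0", "id": 1,
  #   "error": {"code": -32603, "message": "Failed to claim CPU core: 2"}}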
00:06:52.083 10:53:50 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:52.083 10:53:50 -- common/autotest_common.sh@10 -- # set +x 00:06:52.342 10:53:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:52.342 10:53:50 -- common/autotest_common.sh@862 -- # return 0 00:06:52.342 10:53:50 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:52.342 10:53:50 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:52.342 10:53:50 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:52.342 10:53:50 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:52.342 00:06:52.342 real 0m2.173s 00:06:52.342 user 0m0.908s 00:06:52.342 sys 0m0.189s 00:06:52.342 10:53:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:52.342 10:53:50 -- common/autotest_common.sh@10 -- # set +x 00:06:52.342 ************************************ 00:06:52.342 END TEST locking_overlapped_coremask_via_rpc 00:06:52.342 ************************************ 00:06:52.342 10:53:50 -- event/cpu_locks.sh@174 -- # cleanup 00:06:52.342 10:53:50 -- event/cpu_locks.sh@15 -- # [[ -z 637794 ]] 00:06:52.342 10:53:50 -- event/cpu_locks.sh@15 -- # killprocess 637794 00:06:52.342 10:53:50 -- common/autotest_common.sh@936 -- # '[' -z 637794 ']' 00:06:52.342 10:53:50 -- common/autotest_common.sh@940 -- # kill -0 637794 00:06:52.342 10:53:50 -- common/autotest_common.sh@941 -- # uname 00:06:52.342 10:53:50 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:52.342 10:53:50 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 637794 00:06:52.342 10:53:50 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:52.342 10:53:50 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:52.342 10:53:50 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 637794' 00:06:52.342 killing process with pid 637794 00:06:52.342 10:53:50 -- common/autotest_common.sh@955 -- # kill 637794 00:06:52.342 10:53:50 -- common/autotest_common.sh@960 -- # wait 637794 00:06:52.910 10:53:51 -- event/cpu_locks.sh@16 -- # [[ -z 638061 ]] 00:06:52.910 10:53:51 -- event/cpu_locks.sh@16 -- # killprocess 638061 00:06:52.910 10:53:51 -- common/autotest_common.sh@936 -- # '[' -z 638061 ']' 00:06:52.910 10:53:51 -- common/autotest_common.sh@940 -- # kill -0 638061 00:06:52.910 10:53:51 -- common/autotest_common.sh@941 -- # uname 00:06:52.910 10:53:51 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:52.910 10:53:51 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 638061 00:06:52.910 10:53:51 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:52.910 10:53:51 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:52.910 10:53:51 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 638061' 00:06:52.910 killing process with pid 638061 00:06:52.910 10:53:51 -- common/autotest_common.sh@955 -- # kill 638061 00:06:52.910 10:53:51 -- common/autotest_common.sh@960 -- # wait 638061 00:06:53.170 10:53:51 -- event/cpu_locks.sh@18 -- # rm -f 00:06:53.170 10:53:51 -- event/cpu_locks.sh@1 -- # cleanup 00:06:53.170 10:53:51 -- event/cpu_locks.sh@15 -- # [[ -z 637794 ]] 00:06:53.170 10:53:51 -- event/cpu_locks.sh@15 -- # killprocess 637794 00:06:53.170 
10:53:51 -- common/autotest_common.sh@936 -- # '[' -z 637794 ']' 00:06:53.170 10:53:51 -- common/autotest_common.sh@940 -- # kill -0 637794 00:06:53.170 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (637794) - No such process 00:06:53.170 10:53:51 -- common/autotest_common.sh@963 -- # echo 'Process with pid 637794 is not found' 00:06:53.170 Process with pid 637794 is not found 00:06:53.170 10:53:51 -- event/cpu_locks.sh@16 -- # [[ -z 638061 ]] 00:06:53.170 10:53:51 -- event/cpu_locks.sh@16 -- # killprocess 638061 00:06:53.170 10:53:51 -- common/autotest_common.sh@936 -- # '[' -z 638061 ']' 00:06:53.170 10:53:51 -- common/autotest_common.sh@940 -- # kill -0 638061 00:06:53.170 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (638061) - No such process 00:06:53.170 10:53:51 -- common/autotest_common.sh@963 -- # echo 'Process with pid 638061 is not found' 00:06:53.170 Process with pid 638061 is not found 00:06:53.170 10:53:51 -- event/cpu_locks.sh@18 -- # rm -f 00:06:53.170 00:06:53.170 real 0m17.925s 00:06:53.170 user 0m31.077s 00:06:53.170 sys 0m5.698s 00:06:53.170 10:53:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:53.170 10:53:51 -- common/autotest_common.sh@10 -- # set +x 00:06:53.170 ************************************ 00:06:53.170 END TEST cpu_locks 00:06:53.170 ************************************ 00:06:53.170 00:06:53.170 real 0m42.840s 00:06:53.170 user 1m20.702s 00:06:53.170 sys 0m9.751s 00:06:53.170 10:53:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:53.170 10:53:51 -- common/autotest_common.sh@10 -- # set +x 00:06:53.170 ************************************ 00:06:53.170 END TEST event 00:06:53.170 ************************************ 00:06:53.170 10:53:51 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:53.170 10:53:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:53.170 10:53:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:53.170 10:53:51 -- common/autotest_common.sh@10 -- # set +x 00:06:53.170 ************************************ 00:06:53.170 START TEST thread 00:06:53.170 ************************************ 00:06:53.170 10:53:51 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:53.430 * Looking for test storage... 
00:06:53.430 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:53.430 10:53:51 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:53.430 10:53:51 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:53.430 10:53:51 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:53.430 10:53:51 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:53.430 10:53:51 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:53.430 10:53:51 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:53.430 10:53:51 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:53.430 10:53:51 -- scripts/common.sh@335 -- # IFS=.-: 00:06:53.430 10:53:51 -- scripts/common.sh@335 -- # read -ra ver1 00:06:53.430 10:53:51 -- scripts/common.sh@336 -- # IFS=.-: 00:06:53.430 10:53:51 -- scripts/common.sh@336 -- # read -ra ver2 00:06:53.430 10:53:51 -- scripts/common.sh@337 -- # local 'op=<' 00:06:53.430 10:53:51 -- scripts/common.sh@339 -- # ver1_l=2 00:06:53.430 10:53:51 -- scripts/common.sh@340 -- # ver2_l=1 00:06:53.430 10:53:51 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:53.430 10:53:51 -- scripts/common.sh@343 -- # case "$op" in 00:06:53.430 10:53:51 -- scripts/common.sh@344 -- # : 1 00:06:53.430 10:53:51 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:53.430 10:53:51 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:53.430 10:53:51 -- scripts/common.sh@364 -- # decimal 1 00:06:53.430 10:53:51 -- scripts/common.sh@352 -- # local d=1 00:06:53.430 10:53:51 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:53.430 10:53:51 -- scripts/common.sh@354 -- # echo 1 00:06:53.430 10:53:51 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:53.430 10:53:51 -- scripts/common.sh@365 -- # decimal 2 00:06:53.430 10:53:51 -- scripts/common.sh@352 -- # local d=2 00:06:53.430 10:53:51 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:53.430 10:53:51 -- scripts/common.sh@354 -- # echo 2 00:06:53.430 10:53:51 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:53.430 10:53:51 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:53.430 10:53:51 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:53.430 10:53:51 -- scripts/common.sh@367 -- # return 0 00:06:53.430 10:53:51 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:53.430 10:53:51 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:53.430 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.430 --rc genhtml_branch_coverage=1 00:06:53.430 --rc genhtml_function_coverage=1 00:06:53.430 --rc genhtml_legend=1 00:06:53.430 --rc geninfo_all_blocks=1 00:06:53.430 --rc geninfo_unexecuted_blocks=1 00:06:53.430 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.430 ' 00:06:53.430 10:53:51 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:53.430 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.430 --rc genhtml_branch_coverage=1 00:06:53.430 --rc genhtml_function_coverage=1 00:06:53.430 --rc genhtml_legend=1 00:06:53.430 --rc geninfo_all_blocks=1 00:06:53.430 --rc geninfo_unexecuted_blocks=1 00:06:53.430 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.430 ' 00:06:53.430 10:53:51 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:53.430 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.430 --rc genhtml_branch_coverage=1 
00:06:53.430 --rc genhtml_function_coverage=1 00:06:53.430 --rc genhtml_legend=1 00:06:53.430 --rc geninfo_all_blocks=1 00:06:53.430 --rc geninfo_unexecuted_blocks=1 00:06:53.430 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.430 ' 00:06:53.430 10:53:51 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:53.430 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.430 --rc genhtml_branch_coverage=1 00:06:53.430 --rc genhtml_function_coverage=1 00:06:53.430 --rc genhtml_legend=1 00:06:53.430 --rc geninfo_all_blocks=1 00:06:53.430 --rc geninfo_unexecuted_blocks=1 00:06:53.430 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.430 ' 00:06:53.430 10:53:51 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:53.430 10:53:51 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:53.430 10:53:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:53.430 10:53:51 -- common/autotest_common.sh@10 -- # set +x 00:06:53.430 ************************************ 00:06:53.430 START TEST thread_poller_perf 00:06:53.430 ************************************ 00:06:53.430 10:53:51 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:53.430 [2024-12-16 10:53:51.915724] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:53.430 [2024-12-16 10:53:51.915854] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638444 ] 00:06:53.430 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.430 [2024-12-16 10:53:51.987805] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.430 [2024-12-16 10:53:52.024434] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.430 Running 1000 pollers for 1 seconds with 1 microseconds period. 
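poller_perf registers 1000 pollers (-b 1000) on one reactor, runs them for one second (-t 1) with a 1-microsecond period (-l 1), and reports the per-invocation overhead as poller_cost = busy cycles / total_run_count, converted to nanoseconds via the TSC rate. The figures below check out: 2506043062 / 780000 ≈ 3212 cycles, and 3212 / 2.5 ≈ 1284 ns at the reported 2.5 GHz TSC. The same arithmetic as a one-liner, with the numbers taken from the run that follows:

  awk 'BEGIN { busy=2506043062; runs=780000; hz=2500000000;
               cyc=int(busy/runs); printf "%d cyc, %d nsec\n", cyc, cyc/(hz/1e9) }'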
00:06:54.809 [2024-12-16T09:53:53.434Z] ====================================== 00:06:54.809 [2024-12-16T09:53:53.434Z] busy:2506043062 (cyc) 00:06:54.809 [2024-12-16T09:53:53.434Z] total_run_count: 780000 00:06:54.809 [2024-12-16T09:53:53.434Z] tsc_hz: 2500000000 (cyc) 00:06:54.809 [2024-12-16T09:53:53.434Z] ====================================== 00:06:54.809 [2024-12-16T09:53:53.434Z] poller_cost: 3212 (cyc), 1284 (nsec) 00:06:54.809 00:06:54.809 real 0m1.188s 00:06:54.809 user 0m1.091s 00:06:54.809 sys 0m0.093s 00:06:54.809 10:53:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:54.809 10:53:53 -- common/autotest_common.sh@10 -- # set +x 00:06:54.809 ************************************ 00:06:54.809 END TEST thread_poller_perf 00:06:54.809 ************************************ 00:06:54.809 10:53:53 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:54.809 10:53:53 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:54.809 10:53:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:54.809 10:53:53 -- common/autotest_common.sh@10 -- # set +x 00:06:54.809 ************************************ 00:06:54.809 START TEST thread_poller_perf 00:06:54.809 ************************************ 00:06:54.809 10:53:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:54.809 [2024-12-16 10:53:53.153064] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:54.809 [2024-12-16 10:53:53.153158] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638730 ] 00:06:54.809 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.809 [2024-12-16 10:53:53.223487] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.809 [2024-12-16 10:53:53.258974] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.809 Running 1000 pollers for 1 seconds with 0 microseconds period. 
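This second run drops the period to zero (-l 0), which turns the timed pollers into busy pollers dispatched on every reactor iteration instead of through the timer path. That is why, in the output below, total_run_count jumps from 780000 to 13346000 while per-call cost falls from 3212 to 187 cycles (2501903416 / 13346000 ≈ 187, i.e. roughly 74 ns at 2.5 GHz): the 1-µs run is dominated by timer bookkeeping, the 0-µs run by the bare poller dispatch itself.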
00:06:55.747 [2024-12-16T09:53:54.372Z] ====================================== 00:06:55.748 [2024-12-16T09:53:54.373Z] busy:2501903416 (cyc) 00:06:55.748 [2024-12-16T09:53:54.373Z] total_run_count: 13346000 00:06:55.748 [2024-12-16T09:53:54.373Z] tsc_hz: 2500000000 (cyc) 00:06:55.748 [2024-12-16T09:53:54.373Z] ====================================== 00:06:55.748 [2024-12-16T09:53:54.373Z] poller_cost: 187 (cyc), 74 (nsec) 00:06:55.748 00:06:55.748 real 0m1.180s 00:06:55.748 user 0m1.088s 00:06:55.748 sys 0m0.087s 00:06:55.748 10:53:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:55.748 10:53:54 -- common/autotest_common.sh@10 -- # set +x 00:06:55.748 ************************************ 00:06:55.748 END TEST thread_poller_perf 00:06:55.748 ************************************ 00:06:55.748 10:53:54 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:55.748 10:53:54 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:55.748 10:53:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:55.748 10:53:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.748 10:53:54 -- common/autotest_common.sh@10 -- # set +x 00:06:55.748 ************************************ 00:06:55.748 START TEST thread_spdk_lock 00:06:55.748 ************************************ 00:06:55.748 10:53:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:56.007 [2024-12-16 10:53:54.383888] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:56.007 [2024-12-16 10:53:54.383975] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639015 ] 00:06:56.007 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.007 [2024-12-16 10:53:54.452727] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:56.007 [2024-12-16 10:53:54.488369] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:56.007 [2024-12-16 10:53:54.488372] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.576 [2024-12-16 10:53:54.978002] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:56.576 [2024-12-16 10:53:54.978036] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:56.576 [2024-12-16 10:53:54.978046] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x1326a80 00:06:56.576 [2024-12-16 10:53:54.978909] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:56.576 [2024-12-16 10:53:54.979013] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:56.576 [2024-12-16 10:53:54.979033] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:56.576 Starting test contend 00:06:56.576 Worker Delay Wait us Hold us Total us 00:06:56.576 0 3 176778 184900 361679 00:06:56.576 1 5 95658 285877 381535 00:06:56.576 PASS test contend 00:06:56.576 Starting test hold_by_poller 00:06:56.576 PASS test hold_by_poller 00:06:56.576 Starting test hold_by_message 00:06:56.576 PASS test hold_by_message 00:06:56.576 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:56.576 100014 assertions passed 00:06:56.576 0 assertions failed 00:06:56.576 00:06:56.576 real 0m0.664s 00:06:56.576 user 0m1.061s 00:06:56.576 sys 0m0.090s 00:06:56.576 10:53:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:56.576 10:53:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.576 ************************************ 00:06:56.576 END TEST thread_spdk_lock 00:06:56.576 ************************************ 00:06:56.576 00:06:56.576 real 0m3.371s 00:06:56.576 user 0m3.396s 00:06:56.576 sys 0m0.495s 00:06:56.576 10:53:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:56.576 10:53:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.576 ************************************ 00:06:56.576 END TEST thread 00:06:56.576 ************************************ 00:06:56.576 10:53:55 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:56.576 10:53:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:56.576 10:53:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:56.576 10:53:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.576 ************************************ 00:06:56.576 START TEST accel 00:06:56.576 ************************************ 00:06:56.576 10:53:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:56.576 * Looking for test storage... 00:06:56.835 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:56.835 10:53:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:56.835 10:53:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:56.835 10:53:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:56.835 10:53:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:56.835 10:53:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:56.835 10:53:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:56.835 10:53:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:56.835 10:53:55 -- scripts/common.sh@335 -- # IFS=.-: 00:06:56.835 10:53:55 -- scripts/common.sh@335 -- # read -ra ver1 00:06:56.835 10:53:55 -- scripts/common.sh@336 -- # IFS=.-: 00:06:56.835 10:53:55 -- scripts/common.sh@336 -- # read -ra ver2 00:06:56.835 10:53:55 -- scripts/common.sh@337 -- # local 'op=<' 00:06:56.835 10:53:55 -- scripts/common.sh@339 -- # ver1_l=2 00:06:56.835 10:53:55 -- scripts/common.sh@340 -- # ver2_l=1 00:06:56.835 10:53:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:56.835 10:53:55 -- scripts/common.sh@343 -- # case "$op" in 00:06:56.835 10:53:55 -- scripts/common.sh@344 -- # : 1 00:06:56.835 10:53:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:56.835 10:53:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:56.835 10:53:55 -- scripts/common.sh@364 -- # decimal 1 00:06:56.835 10:53:55 -- scripts/common.sh@352 -- # local d=1 00:06:56.835 10:53:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:56.835 10:53:55 -- scripts/common.sh@354 -- # echo 1 00:06:56.835 10:53:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:56.835 10:53:55 -- scripts/common.sh@365 -- # decimal 2 00:06:56.835 10:53:55 -- scripts/common.sh@352 -- # local d=2 00:06:56.835 10:53:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:56.835 10:53:55 -- scripts/common.sh@354 -- # echo 2 00:06:56.835 10:53:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:56.835 10:53:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:56.835 10:53:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:56.835 10:53:55 -- scripts/common.sh@367 -- # return 0 00:06:56.836 10:53:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:56.836 10:53:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:56.836 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.836 --rc genhtml_branch_coverage=1 00:06:56.836 --rc genhtml_function_coverage=1 00:06:56.836 --rc genhtml_legend=1 00:06:56.836 --rc geninfo_all_blocks=1 00:06:56.836 --rc geninfo_unexecuted_blocks=1 00:06:56.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.836 ' 00:06:56.836 10:53:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:56.836 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.836 --rc genhtml_branch_coverage=1 00:06:56.836 --rc genhtml_function_coverage=1 00:06:56.836 --rc genhtml_legend=1 00:06:56.836 --rc geninfo_all_blocks=1 00:06:56.836 --rc geninfo_unexecuted_blocks=1 00:06:56.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.836 ' 00:06:56.836 10:53:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:56.836 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.836 --rc genhtml_branch_coverage=1 00:06:56.836 --rc genhtml_function_coverage=1 00:06:56.836 --rc genhtml_legend=1 00:06:56.836 --rc geninfo_all_blocks=1 00:06:56.836 --rc geninfo_unexecuted_blocks=1 00:06:56.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.836 ' 00:06:56.836 10:53:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:56.836 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.836 --rc genhtml_branch_coverage=1 00:06:56.836 --rc genhtml_function_coverage=1 00:06:56.836 --rc genhtml_legend=1 00:06:56.836 --rc geninfo_all_blocks=1 00:06:56.836 --rc geninfo_unexecuted_blocks=1 00:06:56.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.836 ' 00:06:56.836 10:53:55 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:56.836 10:53:55 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:56.836 10:53:55 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:56.836 10:53:55 -- accel/accel.sh@59 -- # spdk_tgt_pid=639244 00:06:56.836 10:53:55 -- accel/accel.sh@60 -- # waitforlisten 639244 00:06:56.836 10:53:55 -- common/autotest_common.sh@829 -- # '[' -z 639244 ']' 00:06:56.836 10:53:55 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:56.836 10:53:55 -- accel/accel.sh@58 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:56.836 10:53:55 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:56.836 10:53:55 -- accel/accel.sh@58 -- # build_accel_config 00:06:56.836 10:53:55 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:56.836 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:56.836 10:53:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.836 10:53:55 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:56.836 10:53:55 -- common/autotest_common.sh@10 -- # set +x 00:06:56.836 10:53:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.836 10:53:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.836 10:53:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.836 10:53:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.836 10:53:55 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.836 10:53:55 -- accel/accel.sh@42 -- # jq -r . 00:06:56.836 [2024-12-16 10:53:55.322431] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:56.836 [2024-12-16 10:53:55.322519] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639244 ] 00:06:56.836 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.836 [2024-12-16 10:53:55.391162] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.836 [2024-12-16 10:53:55.426888] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:56.836 [2024-12-16 10:53:55.427021] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.773 10:53:56 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:57.773 10:53:56 -- common/autotest_common.sh@862 -- # return 0 00:06:57.773 10:53:56 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:57.773 10:53:56 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:57.773 10:53:56 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:57.773 10:53:56 -- common/autotest_common.sh@10 -- # set +x 00:06:57.773 10:53:56 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:57.773 10:53:56 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:57.773 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.773 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.773 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.773 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.773 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.773 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.773 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in 
"${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # IFS== 00:06:57.774 10:53:56 -- accel/accel.sh@64 -- # read -r opc module 00:06:57.774 10:53:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:57.774 10:53:56 -- accel/accel.sh@67 -- # killprocess 639244 00:06:57.774 10:53:56 -- common/autotest_common.sh@936 -- # '[' -z 639244 ']' 00:06:57.774 10:53:56 -- common/autotest_common.sh@940 -- # kill -0 639244 00:06:57.774 10:53:56 -- common/autotest_common.sh@941 -- # uname 00:06:57.774 10:53:56 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:57.774 10:53:56 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 639244 00:06:57.774 10:53:56 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:57.774 10:53:56 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:57.774 10:53:56 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 639244' 00:06:57.774 killing process with pid 639244 00:06:57.774 10:53:56 -- common/autotest_common.sh@955 -- # kill 639244 00:06:57.774 10:53:56 -- common/autotest_common.sh@960 -- # wait 639244 00:06:58.034 10:53:56 -- accel/accel.sh@68 -- # trap - ERR 00:06:58.034 10:53:56 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:58.034 10:53:56 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:58.034 10:53:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.034 10:53:56 -- common/autotest_common.sh@10 -- # set +x 00:06:58.034 10:53:56 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:06:58.034 10:53:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:58.034 10:53:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.034 10:53:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.034 10:53:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.034 10:53:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.034 10:53:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.034 10:53:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.034 10:53:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.034 10:53:56 -- accel/accel.sh@42 -- # jq -r . 
00:06:58.034 10:53:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:58.034 10:53:56 -- common/autotest_common.sh@10 -- # set +x 00:06:58.034 10:53:56 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:58.034 10:53:56 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:58.034 10:53:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.034 10:53:56 -- common/autotest_common.sh@10 -- # set +x 00:06:58.034 ************************************ 00:06:58.034 START TEST accel_missing_filename 00:06:58.034 ************************************ 00:06:58.034 10:53:56 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:06:58.034 10:53:56 -- common/autotest_common.sh@650 -- # local es=0 00:06:58.034 10:53:56 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:58.034 10:53:56 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:58.034 10:53:56 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.034 10:53:56 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:58.034 10:53:56 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.034 10:53:56 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:58.034 10:53:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:58.034 10:53:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.034 10:53:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.034 10:53:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.034 10:53:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.034 10:53:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.034 10:53:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.034 10:53:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.034 10:53:56 -- accel/accel.sh@42 -- # jq -r . 00:06:58.034 [2024-12-16 10:53:56.656310] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:58.034 [2024-12-16 10:53:56.656405] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639470 ] 00:06:58.295 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.295 [2024-12-16 10:53:56.726500] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.295 [2024-12-16 10:53:56.762703] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.295 [2024-12-16 10:53:56.802477] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:58.295 [2024-12-16 10:53:56.862917] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:58.295 A filename is required. 
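The missing-filename case is behaving as designed: the compress workload reads its payload from a file, so the -l option naming an uncompressed input is mandatory, and accel_perf refuses to start without it. A fixed counterpart to the failing command above, with /tmp/accel_input.bin as a placeholder path rather than anything the test uses:

  dd if=/dev/urandom of=/tmp/accel_input.bin bs=1M count=1   # any payload serves as compress input
  ./build/examples/accel_perf -t 1 -w compress -l /tmp/accel_input.bin
  # Adding -y here would trip a different error instead, since compression
  # does not support the verify option, as the next test demonstrates.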
00:06:58.555 10:53:56 -- common/autotest_common.sh@653 -- # es=234 00:06:58.555 10:53:56 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:58.555 10:53:56 -- common/autotest_common.sh@662 -- # es=106 00:06:58.555 10:53:56 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:58.555 10:53:56 -- common/autotest_common.sh@670 -- # es=1 00:06:58.555 10:53:56 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:58.555 00:06:58.555 real 0m0.289s 00:06:58.555 user 0m0.194s 00:06:58.555 sys 0m0.133s 00:06:58.555 10:53:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:58.555 10:53:56 -- common/autotest_common.sh@10 -- # set +x 00:06:58.555 ************************************ 00:06:58.555 END TEST accel_missing_filename 00:06:58.555 ************************************ 00:06:58.555 10:53:56 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.555 10:53:56 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:58.555 10:53:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.555 10:53:56 -- common/autotest_common.sh@10 -- # set +x 00:06:58.555 ************************************ 00:06:58.555 START TEST accel_compress_verify 00:06:58.555 ************************************ 00:06:58.555 10:53:56 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.555 10:53:56 -- common/autotest_common.sh@650 -- # local es=0 00:06:58.555 10:53:56 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.556 10:53:56 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:58.556 10:53:56 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.556 10:53:56 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:58.556 10:53:56 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.556 10:53:56 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.556 10:53:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.556 10:53:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.556 10:53:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.556 10:53:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.556 10:53:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.556 10:53:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.556 10:53:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.556 10:53:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.556 10:53:56 -- accel/accel.sh@42 -- # jq -r . 00:06:58.556 [2024-12-16 10:53:56.995597] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:58.556 [2024-12-16 10:53:56.995707] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639665 ] 00:06:58.556 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.556 [2024-12-16 10:53:57.064968] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.556 [2024-12-16 10:53:57.100350] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.556 [2024-12-16 10:53:57.140082] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:58.816 [2024-12-16 10:53:57.198897] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:58.816 00:06:58.816 Compression does not support the verify option, aborting. 00:06:58.816 10:53:57 -- common/autotest_common.sh@653 -- # es=161 00:06:58.816 10:53:57 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:58.816 10:53:57 -- common/autotest_common.sh@662 -- # es=33 00:06:58.816 10:53:57 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:58.816 10:53:57 -- common/autotest_common.sh@670 -- # es=1 00:06:58.816 10:53:57 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:58.816 00:06:58.816 real 0m0.285s 00:06:58.816 user 0m0.186s 00:06:58.816 sys 0m0.137s 00:06:58.816 10:53:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:58.816 10:53:57 -- common/autotest_common.sh@10 -- # set +x 00:06:58.816 ************************************ 00:06:58.816 END TEST accel_compress_verify 00:06:58.816 ************************************ 00:06:58.816 10:53:57 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:58.816 10:53:57 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:58.816 10:53:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.816 10:53:57 -- common/autotest_common.sh@10 -- # set +x 00:06:58.816 ************************************ 00:06:58.816 START TEST accel_wrong_workload 00:06:58.816 ************************************ 00:06:58.816 10:53:57 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:06:58.816 10:53:57 -- common/autotest_common.sh@650 -- # local es=0 00:06:58.816 10:53:57 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:58.816 10:53:57 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:58.816 10:53:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.816 10:53:57 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:58.816 10:53:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.816 10:53:57 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:58.816 10:53:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:58.816 10:53:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.816 10:53:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.816 10:53:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.816 10:53:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.816 10:53:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.816 10:53:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.816 10:53:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.816 10:53:57 -- accel/accel.sh@42 -- # jq -r . 
00:06:58.816 Unsupported workload type: foobar 00:06:58.816 [2024-12-16 10:53:57.326472] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:58.816 accel_perf options: 00:06:58.816 [-h help message] 00:06:58.816 [-q queue depth per core] 00:06:58.816 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:58.816 [-T number of threads per core 00:06:58.816 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:58.816 [-t time in seconds] 00:06:58.816 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:58.816 [ dif_verify, , dif_generate, dif_generate_copy 00:06:58.816 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:58.816 [-l for compress/decompress workloads, name of uncompressed input file 00:06:58.816 [-S for crc32c workload, use this seed value (default 0) 00:06:58.816 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:58.816 [-f for fill workload, use this BYTE value (default 255) 00:06:58.816 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:58.816 [-y verify result if this switch is on] 00:06:58.816 [-a tasks to allocate per core (default: same value as -q)] 00:06:58.816 Can be used to spread operations across a wider range of memory. 00:06:58.816 10:53:57 -- common/autotest_common.sh@653 -- # es=1 00:06:58.816 10:53:57 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:58.816 10:53:57 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:58.816 10:53:57 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:58.816 00:06:58.816 real 0m0.029s 00:06:58.816 user 0m0.012s 00:06:58.816 sys 0m0.016s 00:06:58.816 10:53:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:58.816 10:53:57 -- common/autotest_common.sh@10 -- # set +x 00:06:58.816 ************************************ 00:06:58.816 END TEST accel_wrong_workload 00:06:58.816 ************************************ 00:06:58.816 Error: writing output failed: Broken pipe 00:06:58.816 10:53:57 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:58.816 10:53:57 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:58.816 10:53:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.816 10:53:57 -- common/autotest_common.sh@10 -- # set +x 00:06:58.816 ************************************ 00:06:58.816 START TEST accel_negative_buffers 00:06:58.816 ************************************ 00:06:58.816 10:53:57 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:58.816 10:53:57 -- common/autotest_common.sh@650 -- # local es=0 00:06:58.816 10:53:57 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:58.816 10:53:57 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:58.816 10:53:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.816 10:53:57 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:58.816 10:53:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.816 10:53:57 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:06:58.816 10:53:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:58.816 10:53:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.816 10:53:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.816 10:53:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.816 10:53:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.816 10:53:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.816 10:53:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.816 10:53:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.816 10:53:57 -- accel/accel.sh@42 -- # jq -r . 00:06:58.816 -x option must be non-negative. 00:06:58.816 [2024-12-16 10:53:57.398667] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:58.816 accel_perf options: 00:06:58.816 [-h help message] 00:06:58.816 [-q queue depth per core] 00:06:58.816 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:58.816 [-T number of threads per core 00:06:58.816 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:58.816 [-t time in seconds] 00:06:58.816 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:58.816 [ dif_verify, , dif_generate, dif_generate_copy 00:06:58.816 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:58.816 [-l for compress/decompress workloads, name of uncompressed input file 00:06:58.816 [-S for crc32c workload, use this seed value (default 0) 00:06:58.816 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:58.816 [-f for fill workload, use this BYTE value (default 255) 00:06:58.816 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:58.816 [-y verify result if this switch is on] 00:06:58.816 [-a tasks to allocate per core (default: same value as -q)] 00:06:58.816 Can be used to spread operations across a wider range of memory. 
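Both rejection tests lean on the harness's NOT wrapper, which inverts an exit status so that a test passes only when the wrapped command fails: -w foobar dies in workload validation, and -x -1 dies because the xor source-buffer count must be non-negative (and, per the usage text, at least 2 in practice). A reduced sketch of that pattern, not the harness's actual implementation:

  NOT() { ! "$@"; }    # succeed only if the wrapped command fails
  NOT ./build/examples/accel_perf -t 1 -w foobar           # passes: unsupported workload
  NOT ./build/examples/accel_perf -t 1 -w xor -y -x -1     # passes: -x must be non-negative
  ./build/examples/accel_perf -t 1 -w xor -y -x 3          # a valid invocation, by contrast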
00:06:58.816 10:53:57 -- common/autotest_common.sh@653 -- # es=1 00:06:58.816 10:53:57 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:58.816 10:53:57 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:58.816 10:53:57 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:58.816 00:06:58.816 real 0m0.030s 00:06:58.816 user 0m0.011s 00:06:58.816 sys 0m0.019s 00:06:58.816 10:53:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:58.816 10:53:57 -- common/autotest_common.sh@10 -- # set +x 00:06:58.816 ************************************ 00:06:58.816 END TEST accel_negative_buffers 00:06:58.816 ************************************ 00:06:58.816 Error: writing output failed: Broken pipe 00:06:59.076 10:53:57 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:59.076 10:53:57 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:59.076 10:53:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:59.076 10:53:57 -- common/autotest_common.sh@10 -- # set +x 00:06:59.076 ************************************ 00:06:59.076 START TEST accel_crc32c 00:06:59.076 ************************************ 00:06:59.076 10:53:57 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:59.076 10:53:57 -- accel/accel.sh@16 -- # local accel_opc 00:06:59.076 10:53:57 -- accel/accel.sh@17 -- # local accel_module 00:06:59.076 10:53:57 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:59.076 10:53:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:59.076 10:53:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.076 10:53:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.076 10:53:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.076 10:53:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.076 10:53:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.076 10:53:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.076 10:53:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.076 10:53:57 -- accel/accel.sh@42 -- # jq -r . 00:06:59.076 [2024-12-16 10:53:57.463144] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:59.076 [2024-12-16 10:53:57.463201] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639727 ] 00:06:59.077 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.077 [2024-12-16 10:53:57.526151] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.077 [2024-12-16 10:53:57.561911] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.511 10:53:58 -- accel/accel.sh@18 -- # out=' 00:07:00.511 SPDK Configuration: 00:07:00.511 Core mask: 0x1 00:07:00.511 00:07:00.511 Accel Perf Configuration: 00:07:00.511 Workload Type: crc32c 00:07:00.511 CRC-32C seed: 32 00:07:00.511 Transfer size: 4096 bytes 00:07:00.511 Vector count 1 00:07:00.511 Module: software 00:07:00.511 Queue depth: 32 00:07:00.511 Allocate depth: 32 00:07:00.511 # threads/core: 1 00:07:00.511 Run time: 1 seconds 00:07:00.511 Verify: Yes 00:07:00.511 00:07:00.511 Running for 1 seconds... 
00:07:00.511 00:07:00.511 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:00.511 ------------------------------------------------------------------------------------ 00:07:00.511 0,0 835712/s 3264 MiB/s 0 0 00:07:00.511 ==================================================================================== 00:07:00.511 Total 835712/s 3264 MiB/s 0 0' 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:00.511 10:53:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:00.511 10:53:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.511 10:53:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.511 10:53:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.511 10:53:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.511 10:53:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.511 10:53:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.511 10:53:58 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.511 10:53:58 -- accel/accel.sh@42 -- # jq -r . 00:07:00.511 [2024-12-16 10:53:58.744718] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:00.511 [2024-12-16 10:53:58.744804] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639995 ] 00:07:00.511 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.511 [2024-12-16 10:53:58.814027] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.511 [2024-12-16 10:53:58.848352] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val= 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val= 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val=0x1 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val= 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val= 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val=crc32c 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val=32 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 
10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val= 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val=software 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@23 -- # accel_module=software 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val=32 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val=32 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val=1 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val=Yes 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val= 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:00.511 10:53:58 -- accel/accel.sh@21 -- # val= 00:07:00.511 10:53:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # IFS=: 00:07:00.511 10:53:58 -- accel/accel.sh@20 -- # read -r var val 00:07:01.546 10:54:00 -- accel/accel.sh@21 -- # val= 00:07:01.546 10:54:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # IFS=: 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # read -r var val 00:07:01.546 10:54:00 -- accel/accel.sh@21 -- # val= 00:07:01.546 10:54:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # IFS=: 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # read -r var val 00:07:01.546 10:54:00 -- accel/accel.sh@21 -- # val= 00:07:01.546 10:54:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # IFS=: 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # read -r var val 00:07:01.546 10:54:00 -- accel/accel.sh@21 -- # val= 00:07:01.546 10:54:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # IFS=: 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # read -r var val 00:07:01.546 10:54:00 -- accel/accel.sh@21 -- # val= 00:07:01.546 10:54:00 -- accel/accel.sh@22 -- # case "$var" in 
00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # IFS=: 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # read -r var val 00:07:01.546 10:54:00 -- accel/accel.sh@21 -- # val= 00:07:01.546 10:54:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # IFS=: 00:07:01.546 10:54:00 -- accel/accel.sh@20 -- # read -r var val 00:07:01.546 10:54:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:01.546 10:54:00 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:07:01.546 10:54:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.546 00:07:01.546 real 0m2.564s 00:07:01.546 user 0m2.329s 00:07:01.546 sys 0m0.245s 00:07:01.546 10:54:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:01.546 10:54:00 -- common/autotest_common.sh@10 -- # set +x 00:07:01.546 ************************************ 00:07:01.546 END TEST accel_crc32c 00:07:01.546 ************************************ 00:07:01.546 10:54:00 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:01.546 10:54:00 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:01.546 10:54:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:01.547 10:54:00 -- common/autotest_common.sh@10 -- # set +x 00:07:01.547 ************************************ 00:07:01.547 START TEST accel_crc32c_C2 00:07:01.547 ************************************ 00:07:01.547 10:54:00 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:01.547 10:54:00 -- accel/accel.sh@16 -- # local accel_opc 00:07:01.547 10:54:00 -- accel/accel.sh@17 -- # local accel_module 00:07:01.547 10:54:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:01.547 10:54:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:01.547 10:54:00 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.547 10:54:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.547 10:54:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.547 10:54:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.547 10:54:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.547 10:54:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.547 10:54:00 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.547 10:54:00 -- accel/accel.sh@42 -- # jq -r . 00:07:01.547 [2024-12-16 10:54:00.081839] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
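For context on what the accel_crc32c test above is measuring: accel_perf drives CRC-32C (the Castagnoli polynomial, not zlib's CRC-32) over 4096-byte buffers. A minimal table-free sketch of that checksum, assuming the -S seed is used as the initial CRC value (the exact seeding convention inside accel_perf is not shown in this log):

def crc32c(data: bytes, seed: int = 0) -> int:
    # Bit-reflected CRC-32C; 0x82F63B78 is the reflected Castagnoli polynomial.
    crc = seed ^ 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (0x82F63B78 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

# Standard check value: CRC-32C of "123456789" is 0xE3069283.
assert crc32c(b"123456789") == 0xE3069283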
00:07:01.547 [2024-12-16 10:54:00.081926] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid640303 ] 00:07:01.547 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.547 [2024-12-16 10:54:00.150946] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.828 [2024-12-16 10:54:00.187589] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.825 10:54:01 -- accel/accel.sh@18 -- # out=' 00:07:02.825 SPDK Configuration: 00:07:02.825 Core mask: 0x1 00:07:02.825 00:07:02.825 Accel Perf Configuration: 00:07:02.825 Workload Type: crc32c 00:07:02.825 CRC-32C seed: 0 00:07:02.825 Transfer size: 4096 bytes 00:07:02.825 Vector count 2 00:07:02.825 Module: software 00:07:02.825 Queue depth: 32 00:07:02.825 Allocate depth: 32 00:07:02.825 # threads/core: 1 00:07:02.825 Run time: 1 seconds 00:07:02.825 Verify: Yes 00:07:02.825 00:07:02.825 Running for 1 seconds... 00:07:02.825 00:07:02.825 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:02.825 ------------------------------------------------------------------------------------ 00:07:02.825 0,0 599168/s 2340 MiB/s 0 0 00:07:02.825 ==================================================================================== 00:07:02.825 Total 599168/s 2340 MiB/s 0 0' 00:07:02.825 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:02.825 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:02.825 10:54:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:02.825 10:54:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:02.825 10:54:01 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.825 10:54:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.825 10:54:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.825 10:54:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.825 10:54:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.825 10:54:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.825 10:54:01 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.825 10:54:01 -- accel/accel.sh@42 -- # jq -r . 00:07:02.825 [2024-12-16 10:54:01.368308] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
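The Bandwidth column in these tables is derived, not measured independently: transfers per second times bytes per transfer, reported in MiB/s. A quick sketch of that arithmetic for the crc32c -C 2 run above (assuming MiB here means 2^20 bytes, which matches every table in this log):

transfers_per_sec = 599_168   # Transfers column above
transfer_size = 4096          # "Transfer size: 4096 bytes"
mib_per_sec = transfers_per_sec * transfer_size / 2**20
print(f"{mib_per_sec:.1f} MiB/s")  # 2340.5 MiB/s, matching the reported 2340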
00:07:02.825 [2024-12-16 10:54:01.368395] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid640558 ] 00:07:02.825 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.825 [2024-12-16 10:54:01.436326] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.099 [2024-12-16 10:54:01.471341] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val= 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val= 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val=0x1 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val= 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val= 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val=crc32c 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val=0 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val= 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val=software 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@23 -- # accel_module=software 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val=32 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val=32 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- 
accel/accel.sh@21 -- # val=1 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val=Yes 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val= 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:03.099 10:54:01 -- accel/accel.sh@21 -- # val= 00:07:03.099 10:54:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # IFS=: 00:07:03.099 10:54:01 -- accel/accel.sh@20 -- # read -r var val 00:07:04.097 10:54:02 -- accel/accel.sh@21 -- # val= 00:07:04.097 10:54:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.097 10:54:02 -- accel/accel.sh@20 -- # IFS=: 00:07:04.097 10:54:02 -- accel/accel.sh@20 -- # read -r var val 00:07:04.097 10:54:02 -- accel/accel.sh@21 -- # val= 00:07:04.097 10:54:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.097 10:54:02 -- accel/accel.sh@20 -- # IFS=: 00:07:04.097 10:54:02 -- accel/accel.sh@20 -- # read -r var val 00:07:04.097 10:54:02 -- accel/accel.sh@21 -- # val= 00:07:04.098 10:54:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.098 10:54:02 -- accel/accel.sh@20 -- # IFS=: 00:07:04.098 10:54:02 -- accel/accel.sh@20 -- # read -r var val 00:07:04.098 10:54:02 -- accel/accel.sh@21 -- # val= 00:07:04.098 10:54:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.098 10:54:02 -- accel/accel.sh@20 -- # IFS=: 00:07:04.098 10:54:02 -- accel/accel.sh@20 -- # read -r var val 00:07:04.098 10:54:02 -- accel/accel.sh@21 -- # val= 00:07:04.098 10:54:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.098 10:54:02 -- accel/accel.sh@20 -- # IFS=: 00:07:04.098 10:54:02 -- accel/accel.sh@20 -- # read -r var val 00:07:04.098 10:54:02 -- accel/accel.sh@21 -- # val= 00:07:04.098 10:54:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.098 10:54:02 -- accel/accel.sh@20 -- # IFS=: 00:07:04.098 10:54:02 -- accel/accel.sh@20 -- # read -r var val 00:07:04.098 10:54:02 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.098 10:54:02 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:07:04.098 10:54:02 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.098 00:07:04.098 real 0m2.579s 00:07:04.098 user 0m2.332s 00:07:04.098 sys 0m0.254s 00:07:04.098 10:54:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:04.098 10:54:02 -- common/autotest_common.sh@10 -- # set +x 00:07:04.098 ************************************ 00:07:04.098 END TEST accel_crc32c_C2 00:07:04.098 ************************************ 00:07:04.098 10:54:02 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:04.098 10:54:02 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:04.098 10:54:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.098 10:54:02 -- common/autotest_common.sh@10 -- # set +x 00:07:04.098 ************************************ 00:07:04.098 START TEST accel_copy 
00:07:04.098 ************************************ 00:07:04.098 10:54:02 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:07:04.098 10:54:02 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.098 10:54:02 -- accel/accel.sh@17 -- # local accel_module 00:07:04.098 10:54:02 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:07:04.098 10:54:02 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.098 10:54:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:04.098 10:54:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.098 10:54:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.098 10:54:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.098 10:54:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.098 10:54:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.098 10:54:02 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.098 10:54:02 -- accel/accel.sh@42 -- # jq -r . 00:07:04.098 [2024-12-16 10:54:02.710515] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:04.098 [2024-12-16 10:54:02.710602] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid640775 ] 00:07:04.374 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.374 [2024-12-16 10:54:02.779464] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.374 [2024-12-16 10:54:02.814834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.756 10:54:03 -- accel/accel.sh@18 -- # out=' 00:07:05.756 SPDK Configuration: 00:07:05.756 Core mask: 0x1 00:07:05.756 00:07:05.756 Accel Perf Configuration: 00:07:05.756 Workload Type: copy 00:07:05.756 Transfer size: 4096 bytes 00:07:05.756 Vector count 1 00:07:05.756 Module: software 00:07:05.756 Queue depth: 32 00:07:05.756 Allocate depth: 32 00:07:05.756 # threads/core: 1 00:07:05.756 Run time: 1 seconds 00:07:05.756 Verify: Yes 00:07:05.756 00:07:05.756 Running for 1 seconds... 00:07:05.756 00:07:05.756 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:05.756 ------------------------------------------------------------------------------------ 00:07:05.756 0,0 540064/s 2109 MiB/s 0 0 00:07:05.756 ==================================================================================== 00:07:05.756 Total 540064/s 2109 MiB/s 0 0' 00:07:05.756 10:54:03 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:03 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:05.756 10:54:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:05.756 10:54:03 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.756 10:54:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.756 10:54:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.756 10:54:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.756 10:54:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.756 10:54:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.756 10:54:03 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.756 10:54:03 -- accel/accel.sh@42 -- # jq -r . 00:07:05.756 [2024-12-16 10:54:03.996386] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
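Every workload in this suite ends with the same two-row summary (a per-core line, then Total). When post-processing these logs, a small parser is enough to flag nonzero Failed or Miscompares counts; a sketch (the regex is an assumption fitted to the lines shown here, not an SPDK-provided format guarantee):

import re

line = "Total 540064/s 2109 MiB/s 0 0"  # the accel_copy total above
m = re.match(r"Total\s+(\d+)/s\s+(\d+)\s+MiB/s\s+(\d+)\s+(\d+)", line)
transfers, mib_s, failed, miscompares = map(int, m.groups())
assert failed == 0 and miscompares == 0  # what a clean "Verify: Yes" run yields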
00:07:05.756 [2024-12-16 10:54:03.996474] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid641102 ] 00:07:05.756 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.756 [2024-12-16 10:54:04.065288] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.756 [2024-12-16 10:54:04.099196] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val= 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val= 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val=0x1 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val= 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val= 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val=copy 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@24 -- # accel_opc=copy 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val= 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val=software 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@23 -- # accel_module=software 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val=32 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val=32 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val=1 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val=Yes 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val= 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:05.756 10:54:04 -- accel/accel.sh@21 -- # val= 00:07:05.756 10:54:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # IFS=: 00:07:05.756 10:54:04 -- accel/accel.sh@20 -- # read -r var val 00:07:06.697 10:54:05 -- accel/accel.sh@21 -- # val= 00:07:06.697 10:54:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # IFS=: 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # read -r var val 00:07:06.697 10:54:05 -- accel/accel.sh@21 -- # val= 00:07:06.697 10:54:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # IFS=: 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # read -r var val 00:07:06.697 10:54:05 -- accel/accel.sh@21 -- # val= 00:07:06.697 10:54:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # IFS=: 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # read -r var val 00:07:06.697 10:54:05 -- accel/accel.sh@21 -- # val= 00:07:06.697 10:54:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # IFS=: 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # read -r var val 00:07:06.697 10:54:05 -- accel/accel.sh@21 -- # val= 00:07:06.697 10:54:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # IFS=: 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # read -r var val 00:07:06.697 10:54:05 -- accel/accel.sh@21 -- # val= 00:07:06.697 10:54:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # IFS=: 00:07:06.697 10:54:05 -- accel/accel.sh@20 -- # read -r var val 00:07:06.697 10:54:05 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:06.697 10:54:05 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:07:06.697 10:54:05 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:06.697 00:07:06.697 real 0m2.578s 00:07:06.697 user 0m2.313s 00:07:06.697 sys 0m0.270s 00:07:06.697 10:54:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:06.697 10:54:05 -- common/autotest_common.sh@10 -- # set +x 00:07:06.697 ************************************ 00:07:06.697 END TEST accel_copy 00:07:06.697 ************************************ 00:07:06.697 10:54:05 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:06.697 10:54:05 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:06.697 10:54:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:06.697 10:54:05 -- common/autotest_common.sh@10 -- # set +x 00:07:06.697 ************************************ 00:07:06.697 START TEST accel_fill 00:07:06.697 ************************************ 00:07:06.697 10:54:05 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:06.697 10:54:05 -- accel/accel.sh@16 -- # local accel_opc 
00:07:06.697 10:54:05 -- accel/accel.sh@17 -- # local accel_module 00:07:06.697 10:54:05 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:06.697 10:54:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:06.697 10:54:05 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.697 10:54:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.697 10:54:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.697 10:54:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.697 10:54:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.697 10:54:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.697 10:54:05 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.697 10:54:05 -- accel/accel.sh@42 -- # jq -r . 00:07:06.956 [2024-12-16 10:54:05.332022] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:06.956 [2024-12-16 10:54:05.332123] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid641663 ] 00:07:06.956 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.956 [2024-12-16 10:54:05.401379] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.956 [2024-12-16 10:54:05.436156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.337 10:54:06 -- accel/accel.sh@18 -- # out=' 00:07:08.337 SPDK Configuration: 00:07:08.337 Core mask: 0x1 00:07:08.337 00:07:08.337 Accel Perf Configuration: 00:07:08.337 Workload Type: fill 00:07:08.337 Fill pattern: 0x80 00:07:08.337 Transfer size: 4096 bytes 00:07:08.337 Vector count 1 00:07:08.337 Module: software 00:07:08.337 Queue depth: 64 00:07:08.337 Allocate depth: 64 00:07:08.337 # threads/core: 1 00:07:08.337 Run time: 1 seconds 00:07:08.337 Verify: Yes 00:07:08.337 00:07:08.337 Running for 1 seconds... 00:07:08.337 00:07:08.337 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:08.337 ------------------------------------------------------------------------------------ 00:07:08.337 0,0 944512/s 3689 MiB/s 0 0 00:07:08.337 ==================================================================================== 00:07:08.337 Total 944512/s 3689 MiB/s 0 0' 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:08.337 10:54:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:08.337 10:54:06 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.337 10:54:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.337 10:54:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.337 10:54:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.337 10:54:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.337 10:54:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.337 10:54:06 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.337 10:54:06 -- accel/accel.sh@42 -- # jq -r . 00:07:08.337 [2024-12-16 10:54:06.617858] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
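The fill workload is effectively a memset: the -f 128 argument becomes the 0x80 pattern in the configuration block, and each operation writes that single byte across the 4096-byte buffer. A sketch of the operation and its verify pass, under the assumption that verification is a straight byte comparison:

pattern = 128                          # -f 128, shown as "Fill pattern: 0x80"
buf = bytes([pattern]) * 4096          # one fill op over a 4096-byte buffer
assert all(b == pattern for b in buf)  # verify pass: no miscompares expected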
00:07:08.337 [2024-12-16 10:54:06.617944] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid641982 ] 00:07:08.337 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.337 [2024-12-16 10:54:06.686015] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.337 [2024-12-16 10:54:06.719832] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val= 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val= 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val=0x1 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val= 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val= 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val=fill 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@24 -- # accel_opc=fill 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val=0x80 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val= 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val=software 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@23 -- # accel_module=software 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val=64 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val=64 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- 
accel/accel.sh@21 -- # val=1 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.337 10:54:06 -- accel/accel.sh@21 -- # val=Yes 00:07:08.337 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.337 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.338 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.338 10:54:06 -- accel/accel.sh@21 -- # val= 00:07:08.338 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.338 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.338 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:08.338 10:54:06 -- accel/accel.sh@21 -- # val= 00:07:08.338 10:54:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.338 10:54:06 -- accel/accel.sh@20 -- # IFS=: 00:07:08.338 10:54:06 -- accel/accel.sh@20 -- # read -r var val 00:07:09.276 10:54:07 -- accel/accel.sh@21 -- # val= 00:07:09.276 10:54:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # IFS=: 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # read -r var val 00:07:09.276 10:54:07 -- accel/accel.sh@21 -- # val= 00:07:09.276 10:54:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # IFS=: 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # read -r var val 00:07:09.276 10:54:07 -- accel/accel.sh@21 -- # val= 00:07:09.276 10:54:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # IFS=: 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # read -r var val 00:07:09.276 10:54:07 -- accel/accel.sh@21 -- # val= 00:07:09.276 10:54:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # IFS=: 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # read -r var val 00:07:09.276 10:54:07 -- accel/accel.sh@21 -- # val= 00:07:09.276 10:54:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # IFS=: 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # read -r var val 00:07:09.276 10:54:07 -- accel/accel.sh@21 -- # val= 00:07:09.276 10:54:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # IFS=: 00:07:09.276 10:54:07 -- accel/accel.sh@20 -- # read -r var val 00:07:09.276 10:54:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:09.276 10:54:07 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:07:09.276 10:54:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.276 00:07:09.276 real 0m2.576s 00:07:09.276 user 0m2.326s 00:07:09.276 sys 0m0.259s 00:07:09.276 10:54:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:09.276 10:54:07 -- common/autotest_common.sh@10 -- # set +x 00:07:09.277 ************************************ 00:07:09.277 END TEST accel_fill 00:07:09.277 ************************************ 00:07:09.536 10:54:07 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:09.536 10:54:07 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:09.536 10:54:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:09.536 10:54:07 -- common/autotest_common.sh@10 -- # set +x 00:07:09.536 ************************************ 00:07:09.536 START TEST 
accel_copy_crc32c 00:07:09.536 ************************************ 00:07:09.536 10:54:07 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:07:09.536 10:54:07 -- accel/accel.sh@16 -- # local accel_opc 00:07:09.536 10:54:07 -- accel/accel.sh@17 -- # local accel_module 00:07:09.536 10:54:07 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:09.536 10:54:07 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.536 10:54:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:09.536 10:54:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.536 10:54:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.536 10:54:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.536 10:54:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.536 10:54:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.536 10:54:07 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.536 10:54:07 -- accel/accel.sh@42 -- # jq -r . 00:07:09.536 [2024-12-16 10:54:07.947581] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:09.536 [2024-12-16 10:54:07.947706] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642265 ] 00:07:09.536 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.536 [2024-12-16 10:54:08.011937] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.536 [2024-12-16 10:54:08.046764] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.919 10:54:09 -- accel/accel.sh@18 -- # out=' 00:07:10.919 SPDK Configuration: 00:07:10.919 Core mask: 0x1 00:07:10.919 00:07:10.919 Accel Perf Configuration: 00:07:10.919 Workload Type: copy_crc32c 00:07:10.919 CRC-32C seed: 0 00:07:10.919 Vector size: 4096 bytes 00:07:10.919 Transfer size: 4096 bytes 00:07:10.919 Vector count 1 00:07:10.919 Module: software 00:07:10.919 Queue depth: 32 00:07:10.919 Allocate depth: 32 00:07:10.919 # threads/core: 1 00:07:10.919 Run time: 1 seconds 00:07:10.919 Verify: Yes 00:07:10.919 00:07:10.919 Running for 1 seconds... 00:07:10.919 00:07:10.919 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:10.919 ------------------------------------------------------------------------------------ 00:07:10.919 0,0 414784/s 1620 MiB/s 0 0 00:07:10.919 ==================================================================================== 00:07:10.919 Total 414784/s 1620 MiB/s 0 0' 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.919 10:54:09 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:10.919 10:54:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:10.919 10:54:09 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.919 10:54:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.919 10:54:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.919 10:54:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.919 10:54:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.919 10:54:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.919 10:54:09 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.919 10:54:09 -- accel/accel.sh@42 -- # jq -r . 
00:07:10.919 [2024-12-16 10:54:09.228829] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:10.919 [2024-12-16 10:54:09.228915] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642535 ] 00:07:10.919 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.919 [2024-12-16 10:54:09.297212] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.919 [2024-12-16 10:54:09.331384] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.919 10:54:09 -- accel/accel.sh@21 -- # val= 00:07:10.919 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.919 10:54:09 -- accel/accel.sh@21 -- # val= 00:07:10.919 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.919 10:54:09 -- accel/accel.sh@21 -- # val=0x1 00:07:10.919 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.919 10:54:09 -- accel/accel.sh@21 -- # val= 00:07:10.919 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.919 10:54:09 -- accel/accel.sh@21 -- # val= 00:07:10.919 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.919 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.919 10:54:09 -- accel/accel.sh@21 -- # val=copy_crc32c 00:07:10.919 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val=0 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val= 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val=software 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@23 -- # accel_module=software 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val=32 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 
00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val=32 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val=1 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val=Yes 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val= 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:10.920 10:54:09 -- accel/accel.sh@21 -- # val= 00:07:10.920 10:54:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # IFS=: 00:07:10.920 10:54:09 -- accel/accel.sh@20 -- # read -r var val 00:07:12.300 10:54:10 -- accel/accel.sh@21 -- # val= 00:07:12.300 10:54:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # IFS=: 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # read -r var val 00:07:12.300 10:54:10 -- accel/accel.sh@21 -- # val= 00:07:12.300 10:54:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # IFS=: 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # read -r var val 00:07:12.300 10:54:10 -- accel/accel.sh@21 -- # val= 00:07:12.300 10:54:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # IFS=: 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # read -r var val 00:07:12.300 10:54:10 -- accel/accel.sh@21 -- # val= 00:07:12.300 10:54:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # IFS=: 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # read -r var val 00:07:12.300 10:54:10 -- accel/accel.sh@21 -- # val= 00:07:12.300 10:54:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # IFS=: 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # read -r var val 00:07:12.300 10:54:10 -- accel/accel.sh@21 -- # val= 00:07:12.300 10:54:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # IFS=: 00:07:12.300 10:54:10 -- accel/accel.sh@20 -- # read -r var val 00:07:12.300 10:54:10 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:12.300 10:54:10 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:07:12.300 10:54:10 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.300 00:07:12.300 real 0m2.565s 00:07:12.300 user 0m2.332s 00:07:12.300 sys 0m0.243s 00:07:12.300 10:54:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:12.300 10:54:10 -- common/autotest_common.sh@10 -- # set +x 00:07:12.300 ************************************ 00:07:12.300 END TEST accel_copy_crc32c 00:07:12.300 ************************************ 00:07:12.300 
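The -C 2 run that follows splits each operation across two buffers, which is why its configuration reports "Vector size: 4096 bytes" alongside "Transfer size: 8192 bytes". A sketch of that scatter-gather layout (plain Python lists standing in for iovecs; the naming is illustrative, not SPDK API):

vector_size = 4096
vector_count = 2                                  # the -C 2 option
iov = [bytes(vector_size) for _ in range(vector_count)]
transfer_size = sum(len(seg) for seg in iov)      # 8192 bytes per operation
assert transfer_size == vector_size * vector_count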
10:54:10 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:12.300 10:54:10 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:12.300 10:54:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:12.300 10:54:10 -- common/autotest_common.sh@10 -- # set +x 00:07:12.300 ************************************ 00:07:12.300 START TEST accel_copy_crc32c_C2 00:07:12.300 ************************************ 00:07:12.300 10:54:10 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:12.300 10:54:10 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.300 10:54:10 -- accel/accel.sh@17 -- # local accel_module 00:07:12.300 10:54:10 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:12.300 10:54:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:12.300 10:54:10 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.300 10:54:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.300 10:54:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.300 10:54:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.300 10:54:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.300 10:54:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.300 10:54:10 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.300 10:54:10 -- accel/accel.sh@42 -- # jq -r . 00:07:12.300 [2024-12-16 10:54:10.570238] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:12.300 [2024-12-16 10:54:10.570330] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642750 ] 00:07:12.300 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.300 [2024-12-16 10:54:10.641482] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.300 [2024-12-16 10:54:10.677027] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.239 10:54:11 -- accel/accel.sh@18 -- # out=' 00:07:13.239 SPDK Configuration: 00:07:13.239 Core mask: 0x1 00:07:13.239 00:07:13.239 Accel Perf Configuration: 00:07:13.239 Workload Type: copy_crc32c 00:07:13.239 CRC-32C seed: 0 00:07:13.239 Vector size: 4096 bytes 00:07:13.239 Transfer size: 8192 bytes 00:07:13.239 Vector count 2 00:07:13.239 Module: software 00:07:13.239 Queue depth: 32 00:07:13.239 Allocate depth: 32 00:07:13.239 # threads/core: 1 00:07:13.239 Run time: 1 seconds 00:07:13.239 Verify: Yes 00:07:13.239 00:07:13.239 Running for 1 seconds... 
00:07:13.239 00:07:13.239 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:13.239 ------------------------------------------------------------------------------------ 00:07:13.239 0,0 299584/s 2340 MiB/s 0 0 00:07:13.239 ==================================================================================== 00:07:13.239 Total 299584/s 2340 MiB/s 0 0' 00:07:13.239 10:54:11 -- accel/accel.sh@20 -- # IFS=: 00:07:13.239 10:54:11 -- accel/accel.sh@20 -- # read -r var val 00:07:13.239 10:54:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:13.239 10:54:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:13.239 10:54:11 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.239 10:54:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.239 10:54:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.239 10:54:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.239 10:54:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.239 10:54:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.239 10:54:11 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.239 10:54:11 -- accel/accel.sh@42 -- # jq -r . 00:07:13.239 [2024-12-16 10:54:11.859469] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:13.239 [2024-12-16 10:54:11.859556] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642901 ] 00:07:13.500 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.500 [2024-12-16 10:54:11.927567] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.500 [2024-12-16 10:54:11.962294] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val= 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val= 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val=0x1 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val= 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val= 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val=copy_crc32c 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val=0 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=:
00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val='8192 bytes' 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val= 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val=software 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@23 -- # accel_module=software 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val=32 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val=32 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val=1 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val=Yes 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val= 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:13.500 10:54:12 -- accel/accel.sh@21 -- # val= 00:07:13.500 10:54:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # IFS=: 00:07:13.500 10:54:12 -- accel/accel.sh@20 -- # read -r var val 00:07:14.882 10:54:13 -- accel/accel.sh@21 -- # val= 00:07:14.882 10:54:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # IFS=: 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # read -r var val 00:07:14.882 10:54:13 -- accel/accel.sh@21 -- # val= 00:07:14.882 10:54:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # IFS=: 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # read -r var val 00:07:14.882 10:54:13 -- accel/accel.sh@21 -- # val= 00:07:14.882 10:54:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # IFS=: 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # read -r var val 00:07:14.882 10:54:13 -- accel/accel.sh@21 -- # val= 00:07:14.882 10:54:13 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # IFS=: 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # read -r var val 00:07:14.882 10:54:13 -- accel/accel.sh@21 -- # val= 00:07:14.882 10:54:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # IFS=: 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # read -r var val 00:07:14.882 10:54:13 -- accel/accel.sh@21 -- # val= 00:07:14.882 10:54:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # IFS=: 00:07:14.882 10:54:13 -- accel/accel.sh@20 -- # read -r var val 00:07:14.882 10:54:13 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:14.882 10:54:13 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:07:14.882 10:54:13 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.882 00:07:14.882 real 0m2.579s 00:07:14.882 user 0m2.340s 00:07:14.882 sys 0m0.248s 00:07:14.882 10:54:13 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:14.882 10:54:13 -- common/autotest_common.sh@10 -- # set +x 00:07:14.882 ************************************ 00:07:14.882 END TEST accel_copy_crc32c_C2 00:07:14.882 ************************************ 00:07:14.882 10:54:13 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:14.882 10:54:13 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:14.882 10:54:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:14.882 10:54:13 -- common/autotest_common.sh@10 -- # set +x 00:07:14.883 ************************************ 00:07:14.883 START TEST accel_dualcast 00:07:14.883 ************************************ 00:07:14.883 10:54:13 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:07:14.883 10:54:13 -- accel/accel.sh@16 -- # local accel_opc 00:07:14.883 10:54:13 -- accel/accel.sh@17 -- # local accel_module 00:07:14.883 10:54:13 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:07:14.883 10:54:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:14.883 10:54:13 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.883 10:54:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.883 10:54:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.883 10:54:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.883 10:54:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.883 10:54:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.883 10:54:13 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.883 10:54:13 -- accel/accel.sh@42 -- # jq -r . 00:07:14.883 [2024-12-16 10:54:13.195321] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
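A note on the copy_crc32c results above: the per-core row reports 2340 MiB/s while the Total row reports 1170 MiB/s for the same 299584 transfers/s. The factor of two matches the -C 2 vector count, so the per-core figure appears to count both 4096-byte buffers touched per operation while the Total row counts only the 4096-byte transfer size; this is an inference from the numbers, not something the tool states. A minimal sketch of the same run outside the harness, assuming a default in-tree SPDK build and using only flags visible in the trace (the JSON config on /dev/fd/62 is omitted, since the software module is the default):

  # One-second software copy_crc32c run with verification and 2 vectors
  ./spdk/build/examples/accel_perf -t 1 -w copy_crc32c -y -C 2

  # Cross-check of the two bandwidth figures from the table above
  echo $((299584 * 8192 / 1048576))   # 2340 (MiB/s, both buffers counted)
  echo $((299584 * 4096 / 1048576))   # 1170 (MiB/s, transfer size only)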
00:07:14.883 [2024-12-16 10:54:13.195405] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid643125 ]
00:07:14.883 EAL: No free 2048 kB hugepages reported on node 1
00:07:14.883 [2024-12-16 10:54:13.264485] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:14.883 [2024-12-16 10:54:13.299802] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:16.265 10:54:14 -- accel/accel.sh@18 -- # out='
00:07:16.265 SPDK Configuration:
00:07:16.265 Core mask: 0x1
00:07:16.265 
00:07:16.265 Accel Perf Configuration:
00:07:16.265 Workload Type: dualcast
00:07:16.265 Transfer size: 4096 bytes
00:07:16.265 Vector count 1
00:07:16.265 Module: software
00:07:16.265 Queue depth: 32
00:07:16.265 Allocate depth: 32
00:07:16.265 # threads/core: 1
00:07:16.265 Run time: 1 seconds
00:07:16.265 Verify: Yes
00:07:16.265 
00:07:16.265 Running for 1 seconds...
00:07:16.265 
00:07:16.265 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:16.265 ------------------------------------------------------------------------------------
00:07:16.265 0,0 622496/s 2431 MiB/s 0 0
00:07:16.265 ====================================================================================
00:07:16.265 Total 622496/s 2431 MiB/s 0 0'
00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=:
00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val
00:07:16.265 10:54:14 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y
00:07:16.265 10:54:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y
00:07:16.265 10:54:14 -- accel/accel.sh@12 -- # build_accel_config
00:07:16.265 10:54:14 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:16.265 10:54:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:16.265 10:54:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:16.265 10:54:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:16.265 10:54:14 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:16.265 10:54:14 -- accel/accel.sh@41 -- # local IFS=,
00:07:16.265 10:54:14 -- accel/accel.sh@42 -- # jq -r .
00:07:16.265 [2024-12-16 10:54:14.481374] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
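The -c /dev/fd/62 argument echoed in the trace is the signature of bash process substitution: accel.sh builds a JSON accel config in the accel_json_cfg array, joins the fragments with IFS=',' and validates the result with jq -r ., and accel_perf reads the config from the substituted descriptor. A rough sketch of that pattern with a hypothetical empty config ('{}' stands in for whatever build_accel_config would assemble; the [[ 0 -gt 0 ]] guards above all fall through because no module overrides were requested):

  # Hypothetical reduction of the accel_test wrapper seen in the trace
  accel_json_cfg=()
  ./spdk/build/examples/accel_perf -c <(echo '{}' | jq -r .) -t 1 -w dualcast -y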
00:07:16.265 [2024-12-16 10:54:14.481462] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid643393 ] 00:07:16.265 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.265 [2024-12-16 10:54:14.548901] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.265 [2024-12-16 10:54:14.582823] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val= 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val= 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val=0x1 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val= 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val= 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val=dualcast 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val= 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val=software 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@23 -- # accel_module=software 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val=32 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val=32 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val=1 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val=Yes 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val= 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:16.265 10:54:14 -- accel/accel.sh@21 -- # val= 00:07:16.265 10:54:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # IFS=: 00:07:16.265 10:54:14 -- accel/accel.sh@20 -- # read -r var val 00:07:17.205 10:54:15 -- accel/accel.sh@21 -- # val= 00:07:17.205 10:54:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # IFS=: 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # read -r var val 00:07:17.205 10:54:15 -- accel/accel.sh@21 -- # val= 00:07:17.205 10:54:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # IFS=: 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # read -r var val 00:07:17.205 10:54:15 -- accel/accel.sh@21 -- # val= 00:07:17.205 10:54:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # IFS=: 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # read -r var val 00:07:17.205 10:54:15 -- accel/accel.sh@21 -- # val= 00:07:17.205 10:54:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # IFS=: 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # read -r var val 00:07:17.205 10:54:15 -- accel/accel.sh@21 -- # val= 00:07:17.205 10:54:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # IFS=: 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # read -r var val 00:07:17.205 10:54:15 -- accel/accel.sh@21 -- # val= 00:07:17.205 10:54:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # IFS=: 00:07:17.205 10:54:15 -- accel/accel.sh@20 -- # read -r var val 00:07:17.205 10:54:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:17.205 10:54:15 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:07:17.205 10:54:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.205 00:07:17.205 real 0m2.575s 00:07:17.205 user 0m2.319s 00:07:17.205 sys 0m0.264s 00:07:17.205 10:54:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:17.205 10:54:15 -- common/autotest_common.sh@10 -- # set +x 00:07:17.205 ************************************ 00:07:17.205 END TEST accel_dualcast 00:07:17.205 ************************************ 00:07:17.205 10:54:15 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:17.205 10:54:15 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:17.205 10:54:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:17.205 10:54:15 -- common/autotest_common.sh@10 -- # set +x 00:07:17.205 ************************************ 00:07:17.205 START TEST accel_compare 00:07:17.205 ************************************ 00:07:17.205 10:54:15 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:07:17.205 10:54:15 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.205 10:54:15 -- 
accel/accel.sh@17 -- # local accel_module
00:07:17.205 10:54:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y
00:07:17.205 10:54:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y
00:07:17.205 10:54:15 -- accel/accel.sh@12 -- # build_accel_config
00:07:17.205 10:54:15 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:17.205 10:54:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:17.205 10:54:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:17.205 10:54:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:17.205 10:54:15 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:17.205 10:54:15 -- accel/accel.sh@41 -- # local IFS=,
00:07:17.205 10:54:15 -- accel/accel.sh@42 -- # jq -r .
00:07:17.205 [2024-12-16 10:54:15.814763] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:17.205 [2024-12-16 10:54:15.814852] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid643680 ]
00:07:17.465 EAL: No free 2048 kB hugepages reported on node 1
00:07:17.465 [2024-12-16 10:54:15.882970] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:17.465 [2024-12-16 10:54:15.917910] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:18.848 10:54:17 -- accel/accel.sh@18 -- # out='
00:07:18.848 SPDK Configuration:
00:07:18.848 Core mask: 0x1
00:07:18.848 
00:07:18.848 Accel Perf Configuration:
00:07:18.848 Workload Type: compare
00:07:18.848 Transfer size: 4096 bytes
00:07:18.848 Vector count 1
00:07:18.848 Module: software
00:07:18.848 Queue depth: 32
00:07:18.848 Allocate depth: 32
00:07:18.848 # threads/core: 1
00:07:18.848 Run time: 1 seconds
00:07:18.848 Verify: Yes
00:07:18.848 
00:07:18.848 Running for 1 seconds...
00:07:18.848 
00:07:18.848 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:18.848 ------------------------------------------------------------------------------------
00:07:18.848 0,0 797792/s 3116 MiB/s 0 0
00:07:18.848 ====================================================================================
00:07:18.848 Total 797792/s 3116 MiB/s 0 0'
00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=:
00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val
00:07:18.848 10:54:17 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y
00:07:18.848 10:54:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y
00:07:18.848 10:54:17 -- accel/accel.sh@12 -- # build_accel_config
00:07:18.848 10:54:17 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:18.848 10:54:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:18.848 10:54:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:18.848 10:54:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:18.848 10:54:17 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:18.848 10:54:17 -- accel/accel.sh@41 -- # local IFS=,
00:07:18.848 10:54:17 -- accel/accel.sh@42 -- # jq -r .
00:07:18.848 [2024-12-16 10:54:17.099384] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
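The compare pass posts the highest software throughput so far (797792 transfers/s, 3116 MiB/s), plausibly because the workload only reads and compares two buffers instead of writing results; that reading is an interpretation, not something the log asserts. The pure-memory workloads benchmarked up to this point can be replayed back to back with the exact flags recorded in the trace:

  # One-second software runs of three of the workloads exercised above
  for w in copy_crc32c dualcast compare; do
      ./spdk/build/examples/accel_perf -t 1 -w "$w" -y
  done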
00:07:18.848 [2024-12-16 10:54:17.099472] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid643948 ] 00:07:18.848 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.848 [2024-12-16 10:54:17.167500] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.848 [2024-12-16 10:54:17.201938] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val= 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val= 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val=0x1 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val= 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val= 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val=compare 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@24 -- # accel_opc=compare 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val= 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val=software 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@23 -- # accel_module=software 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val=32 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val=32 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val=1 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val=Yes 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.848 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.848 10:54:17 -- accel/accel.sh@21 -- # val= 00:07:18.848 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.849 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.849 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:18.849 10:54:17 -- accel/accel.sh@21 -- # val= 00:07:18.849 10:54:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.849 10:54:17 -- accel/accel.sh@20 -- # IFS=: 00:07:18.849 10:54:17 -- accel/accel.sh@20 -- # read -r var val 00:07:19.788 10:54:18 -- accel/accel.sh@21 -- # val= 00:07:19.788 10:54:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # IFS=: 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # read -r var val 00:07:19.788 10:54:18 -- accel/accel.sh@21 -- # val= 00:07:19.788 10:54:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # IFS=: 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # read -r var val 00:07:19.788 10:54:18 -- accel/accel.sh@21 -- # val= 00:07:19.788 10:54:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # IFS=: 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # read -r var val 00:07:19.788 10:54:18 -- accel/accel.sh@21 -- # val= 00:07:19.788 10:54:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # IFS=: 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # read -r var val 00:07:19.788 10:54:18 -- accel/accel.sh@21 -- # val= 00:07:19.788 10:54:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # IFS=: 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # read -r var val 00:07:19.788 10:54:18 -- accel/accel.sh@21 -- # val= 00:07:19.788 10:54:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # IFS=: 00:07:19.788 10:54:18 -- accel/accel.sh@20 -- # read -r var val 00:07:19.788 10:54:18 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:19.788 10:54:18 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:07:19.788 10:54:18 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:19.788 00:07:19.788 real 0m2.577s 00:07:19.788 user 0m2.334s 00:07:19.788 sys 0m0.252s 00:07:19.788 10:54:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:19.788 10:54:18 -- common/autotest_common.sh@10 -- # set +x 00:07:19.788 ************************************ 00:07:19.788 END TEST accel_compare 00:07:19.788 ************************************ 00:07:19.788 10:54:18 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:19.788 10:54:18 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:20.048 10:54:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:20.048 10:54:18 -- common/autotest_common.sh@10 -- # set +x 00:07:20.048 ************************************ 00:07:20.048 START TEST accel_xor 00:07:20.048 ************************************ 00:07:20.048 10:54:18 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:07:20.048 10:54:18 -- accel/accel.sh@16 -- # local accel_opc 00:07:20.048 10:54:18 -- accel/accel.sh@17 
-- # local accel_module
00:07:20.048 10:54:18 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y
00:07:20.048 10:54:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y
00:07:20.048 10:54:18 -- accel/accel.sh@12 -- # build_accel_config
00:07:20.048 10:54:18 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:20.048 10:54:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:20.048 10:54:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:20.048 10:54:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:20.048 10:54:18 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:20.048 10:54:18 -- accel/accel.sh@41 -- # local IFS=,
00:07:20.048 10:54:18 -- accel/accel.sh@42 -- # jq -r .
00:07:20.048 [2024-12-16 10:54:18.438395] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:20.048 [2024-12-16 10:54:18.438485] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644234 ]
00:07:20.048 EAL: No free 2048 kB hugepages reported on node 1
00:07:20.048 [2024-12-16 10:54:18.506091] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:20.048 [2024-12-16 10:54:18.541085] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:21.430 10:54:19 -- accel/accel.sh@18 -- # out='
00:07:21.430 SPDK Configuration:
00:07:21.430 Core mask: 0x1
00:07:21.430 
00:07:21.430 Accel Perf Configuration:
00:07:21.430 Workload Type: xor
00:07:21.430 Source buffers: 2
00:07:21.430 Transfer size: 4096 bytes
00:07:21.430 Vector count 1
00:07:21.430 Module: software
00:07:21.430 Queue depth: 32
00:07:21.430 Allocate depth: 32
00:07:21.430 # threads/core: 1
00:07:21.430 Run time: 1 seconds
00:07:21.430 Verify: Yes
00:07:21.430 
00:07:21.430 Running for 1 seconds...
00:07:21.430 
00:07:21.430 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:21.430 ------------------------------------------------------------------------------------
00:07:21.430 0,0 701696/s 2741 MiB/s 0 0
00:07:21.430 ====================================================================================
00:07:21.430 Total 701696/s 2741 MiB/s 0 0'
00:07:21.430 10:54:19 -- accel/accel.sh@20 -- # IFS=:
00:07:21.430 10:54:19 -- accel/accel.sh@20 -- # read -r var val
00:07:21.430 10:54:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y
00:07:21.430 10:54:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y
00:07:21.430 10:54:19 -- accel/accel.sh@12 -- # build_accel_config
00:07:21.430 10:54:19 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:21.430 10:54:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:21.430 10:54:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:21.430 10:54:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:21.430 10:54:19 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:21.430 10:54:19 -- accel/accel.sh@41 -- # local IFS=,
00:07:21.430 10:54:19 -- accel/accel.sh@42 -- # jq -r .
00:07:21.430 [2024-12-16 10:54:19.723078] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
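The xor pass above ran with the default two source buffers ("Source buffers: 2" in the configuration dump); the follow-up test a few entries below repeats the workload with three sources via the -x flag, exactly as echoed by accel_test in the trace:

  # xor with the default 2 sources, then with 3 sources as in the rerun
  ./spdk/build/examples/accel_perf -t 1 -w xor -y        # Source buffers: 2
  ./spdk/build/examples/accel_perf -t 1 -w xor -y -x 3   # Source buffers: 3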
00:07:21.430 [2024-12-16 10:54:19.723164] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644386 ] 00:07:21.430 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.430 [2024-12-16 10:54:19.791139] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.430 [2024-12-16 10:54:19.825322] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.430 10:54:19 -- accel/accel.sh@21 -- # val= 00:07:21.430 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.430 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.430 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.430 10:54:19 -- accel/accel.sh@21 -- # val= 00:07:21.430 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val=0x1 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val= 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val= 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val=xor 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val=2 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val= 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val=software 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@23 -- # accel_module=software 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val=32 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val=32 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- 
accel/accel.sh@21 -- # val=1 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val=Yes 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val= 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:21.431 10:54:19 -- accel/accel.sh@21 -- # val= 00:07:21.431 10:54:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # IFS=: 00:07:21.431 10:54:19 -- accel/accel.sh@20 -- # read -r var val 00:07:22.371 10:54:20 -- accel/accel.sh@21 -- # val= 00:07:22.371 10:54:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # IFS=: 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # read -r var val 00:07:22.371 10:54:20 -- accel/accel.sh@21 -- # val= 00:07:22.371 10:54:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # IFS=: 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # read -r var val 00:07:22.371 10:54:20 -- accel/accel.sh@21 -- # val= 00:07:22.371 10:54:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # IFS=: 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # read -r var val 00:07:22.371 10:54:20 -- accel/accel.sh@21 -- # val= 00:07:22.371 10:54:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # IFS=: 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # read -r var val 00:07:22.371 10:54:20 -- accel/accel.sh@21 -- # val= 00:07:22.371 10:54:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # IFS=: 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # read -r var val 00:07:22.371 10:54:20 -- accel/accel.sh@21 -- # val= 00:07:22.371 10:54:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # IFS=: 00:07:22.371 10:54:20 -- accel/accel.sh@20 -- # read -r var val 00:07:22.371 10:54:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:22.371 10:54:20 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:22.371 10:54:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:22.371 00:07:22.371 real 0m2.576s 00:07:22.371 user 0m2.333s 00:07:22.371 sys 0m0.251s 00:07:22.371 10:54:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:22.371 10:54:20 -- common/autotest_common.sh@10 -- # set +x 00:07:22.371 ************************************ 00:07:22.371 END TEST accel_xor 00:07:22.371 ************************************ 00:07:22.630 10:54:21 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:22.630 10:54:21 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:22.630 10:54:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:22.630 10:54:21 -- common/autotest_common.sh@10 -- # set +x 00:07:22.630 ************************************ 00:07:22.630 START TEST accel_xor 
00:07:22.630 ************************************
10:54:21 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3
00:07:22.630 10:54:21 -- accel/accel.sh@16 -- # local accel_opc
00:07:22.630 10:54:21 -- accel/accel.sh@17 -- # local accel_module
00:07:22.630 10:54:21 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3
00:07:22.630 10:54:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3
00:07:22.631 10:54:21 -- accel/accel.sh@12 -- # build_accel_config
00:07:22.631 10:54:21 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:22.631 10:54:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:22.631 10:54:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:22.631 10:54:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:22.631 10:54:21 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:22.631 10:54:21 -- accel/accel.sh@41 -- # local IFS=,
00:07:22.631 10:54:21 -- accel/accel.sh@42 -- # jq -r .
00:07:22.631 [2024-12-16 10:54:21.062987] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:22.631 [2024-12-16 10:54:21.063083] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644561 ]
00:07:22.631 EAL: No free 2048 kB hugepages reported on node 1
00:07:22.631 [2024-12-16 10:54:21.131220] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:22.631 [2024-12-16 10:54:21.166143] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:24.012 10:54:22 -- accel/accel.sh@18 -- # out='
00:07:24.012 SPDK Configuration:
00:07:24.012 Core mask: 0x1
00:07:24.012 
00:07:24.012 Accel Perf Configuration:
00:07:24.012 Workload Type: xor
00:07:24.012 Source buffers: 3
00:07:24.012 Transfer size: 4096 bytes
00:07:24.012 Vector count 1
00:07:24.012 Module: software
00:07:24.012 Queue depth: 32
00:07:24.012 Allocate depth: 32
00:07:24.012 # threads/core: 1
00:07:24.012 Run time: 1 seconds
00:07:24.012 Verify: Yes
00:07:24.012 
00:07:24.012 Running for 1 seconds...
00:07:24.012 
00:07:24.012 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:24.012 ------------------------------------------------------------------------------------
00:07:24.012 0,0 653696/s 2553 MiB/s 0 0
00:07:24.012 ====================================================================================
00:07:24.012 Total 653696/s 2553 MiB/s 0 0'
00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=:
00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val
00:07:24.012 10:54:22 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3
00:07:24.012 10:54:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3
00:07:24.012 10:54:22 -- accel/accel.sh@12 -- # build_accel_config
00:07:24.012 10:54:22 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:24.012 10:54:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:24.012 10:54:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:24.012 10:54:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:24.012 10:54:22 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:24.012 10:54:22 -- accel/accel.sh@41 -- # local IFS=,
00:07:24.012 10:54:22 -- accel/accel.sh@42 -- # jq -r .
00:07:24.012 [2024-12-16 10:54:22.348036] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
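Going from two to three xor source buffers drops throughput from 701696 to 653696 transfers/s, roughly 7 percent, which is consistent with the software path reading one extra 4096-byte buffer per operation; that explanation is an inference from the two tables, not a claim the log makes:

  # Relative slowdown between the two xor runs, in whole percent
  echo $(( (701696 - 653696) * 100 / 701696 ))   # prints 6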
00:07:24.012 [2024-12-16 10:54:22.348121] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644810 ] 00:07:24.012 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.012 [2024-12-16 10:54:22.416932] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.012 [2024-12-16 10:54:22.450656] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val= 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val= 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val=0x1 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val= 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val= 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val=xor 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val=3 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val= 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val=software 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@23 -- # accel_module=software 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val=32 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val=32 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- 
accel/accel.sh@21 -- # val=1 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.012 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.012 10:54:22 -- accel/accel.sh@21 -- # val=Yes 00:07:24.012 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.013 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.013 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.013 10:54:22 -- accel/accel.sh@21 -- # val= 00:07:24.013 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.013 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.013 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:24.013 10:54:22 -- accel/accel.sh@21 -- # val= 00:07:24.013 10:54:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.013 10:54:22 -- accel/accel.sh@20 -- # IFS=: 00:07:24.013 10:54:22 -- accel/accel.sh@20 -- # read -r var val 00:07:25.395 10:54:23 -- accel/accel.sh@21 -- # val= 00:07:25.395 10:54:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # IFS=: 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # read -r var val 00:07:25.395 10:54:23 -- accel/accel.sh@21 -- # val= 00:07:25.395 10:54:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # IFS=: 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # read -r var val 00:07:25.395 10:54:23 -- accel/accel.sh@21 -- # val= 00:07:25.395 10:54:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # IFS=: 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # read -r var val 00:07:25.395 10:54:23 -- accel/accel.sh@21 -- # val= 00:07:25.395 10:54:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # IFS=: 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # read -r var val 00:07:25.395 10:54:23 -- accel/accel.sh@21 -- # val= 00:07:25.395 10:54:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # IFS=: 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # read -r var val 00:07:25.395 10:54:23 -- accel/accel.sh@21 -- # val= 00:07:25.395 10:54:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # IFS=: 00:07:25.395 10:54:23 -- accel/accel.sh@20 -- # read -r var val 00:07:25.395 10:54:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:25.395 10:54:23 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:25.395 10:54:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:25.395 00:07:25.395 real 0m2.574s 00:07:25.395 user 0m2.332s 00:07:25.395 sys 0m0.252s 00:07:25.395 10:54:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:25.395 10:54:23 -- common/autotest_common.sh@10 -- # set +x 00:07:25.395 ************************************ 00:07:25.395 END TEST accel_xor 00:07:25.395 ************************************ 00:07:25.395 10:54:23 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:25.395 10:54:23 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:25.395 10:54:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:25.395 10:54:23 -- common/autotest_common.sh@10 -- # set +x 00:07:25.395 ************************************ 00:07:25.395 START TEST 
accel_dif_verify
00:07:25.395 ************************************
10:54:23 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify
00:07:25.395 10:54:23 -- accel/accel.sh@16 -- # local accel_opc
00:07:25.395 10:54:23 -- accel/accel.sh@17 -- # local accel_module
00:07:25.395 10:54:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify
00:07:25.395 10:54:23 -- accel/accel.sh@12 -- # build_accel_config
00:07:25.395 10:54:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify
00:07:25.395 10:54:23 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:25.395 10:54:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:25.395 10:54:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:25.395 10:54:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:25.395 10:54:23 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:25.395 10:54:23 -- accel/accel.sh@41 -- # local IFS=,
00:07:25.395 10:54:23 -- accel/accel.sh@42 -- # jq -r .
00:07:25.395 [2024-12-16 10:54:23.677556] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:25.395 [2024-12-16 10:54:23.677624] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645095 ]
00:07:25.395 EAL: No free 2048 kB hugepages reported on node 1
00:07:25.395 [2024-12-16 10:54:23.741384] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:25.395 [2024-12-16 10:54:23.776261] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:26.334 10:54:24 -- accel/accel.sh@18 -- # out='
00:07:26.334 SPDK Configuration:
00:07:26.334 Core mask: 0x1
00:07:26.334 
00:07:26.334 Accel Perf Configuration:
00:07:26.334 Workload Type: dif_verify
00:07:26.334 Vector size: 4096 bytes
00:07:26.334 Transfer size: 4096 bytes
00:07:26.334 Block size: 512 bytes
00:07:26.334 Metadata size: 8 bytes
00:07:26.334 Vector count 1
00:07:26.334 Module: software
00:07:26.334 Queue depth: 32
00:07:26.334 Allocate depth: 32
00:07:26.334 # threads/core: 1
00:07:26.334 Run time: 1 seconds
00:07:26.334 Verify: No
00:07:26.334 
00:07:26.334 Running for 1 seconds...
00:07:26.334 
00:07:26.334 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:26.334 ------------------------------------------------------------------------------------
00:07:26.334 0,0 246016/s 976 MiB/s 0 0
00:07:26.334 ====================================================================================
00:07:26.334 Total 246016/s 961 MiB/s 0 0'
00:07:26.334 10:54:24 -- accel/accel.sh@20 -- # IFS=:
00:07:26.334 10:54:24 -- accel/accel.sh@20 -- # read -r var val
00:07:26.334 10:54:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify
00:07:26.334 10:54:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify
00:07:26.334 10:54:24 -- accel/accel.sh@12 -- # build_accel_config
00:07:26.334 10:54:24 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:26.334 10:54:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:26.334 10:54:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:26.334 10:54:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:26.334 10:54:24 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:26.334 10:54:24 -- accel/accel.sh@41 -- # local IFS=,
00:07:26.334 10:54:24 -- accel/accel.sh@42 -- # jq -r .
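In the DIF tests the per-core and Total rows disagree: 976 vs 961 MiB/s for the same 246016 transfers/s. The figures fit exactly if the per-core row counts payload plus DIF metadata while the Total row counts payload only: with 512-byte blocks and 8 bytes of metadata, a 4096-byte transfer spans 8 blocks and carries 64 extra bytes, i.e. 4160 bytes per operation. This is an inference from the arithmetic, not something accel_perf documents in its output:

  # Per-core row: payload + metadata; Total row: payload only
  echo $((246016 * 4160 / 1048576))   # prints 976
  echo $((246016 * 4096 / 1048576))   # prints 961

  # The dif_verify run itself, with the flags recorded in the trace
  ./spdk/build/examples/accel_perf -t 1 -w dif_verify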
00:07:26.334 [2024-12-16 10:54:24.957421] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:26.334 [2024-12-16 10:54:24.957508] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645367 ] 00:07:26.593 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.593 [2024-12-16 10:54:25.023971] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.593 [2024-12-16 10:54:25.057652] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val= 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val= 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val=0x1 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val= 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val= 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val=dif_verify 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val= 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.593 10:54:25 -- accel/accel.sh@21 -- # val=software 00:07:26.593 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.593 10:54:25 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.593 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.594 10:54:25 -- accel/accel.sh@21 -- # val=32 00:07:26.594 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.594 10:54:25 -- accel/accel.sh@21 -- # val=32 00:07:26.594 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.594 10:54:25 -- accel/accel.sh@21 -- # val=1 00:07:26.594 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.594 10:54:25 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:26.594 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.594 10:54:25 -- accel/accel.sh@21 -- # val=No 00:07:26.594 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.594 10:54:25 -- accel/accel.sh@21 -- # val= 00:07:26.594 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:26.594 10:54:25 -- accel/accel.sh@21 -- # val= 00:07:26.594 10:54:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # IFS=: 00:07:26.594 10:54:25 -- accel/accel.sh@20 -- # read -r var val 00:07:27.973 10:54:26 -- accel/accel.sh@21 -- # val= 00:07:27.973 10:54:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # IFS=: 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # read -r var val 00:07:27.973 10:54:26 -- accel/accel.sh@21 -- # val= 00:07:27.973 10:54:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # IFS=: 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # read -r var val 00:07:27.973 10:54:26 -- accel/accel.sh@21 -- # val= 00:07:27.973 10:54:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # IFS=: 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # read -r var val 00:07:27.973 10:54:26 -- accel/accel.sh@21 -- # val= 00:07:27.973 10:54:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # IFS=: 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # read -r var val 00:07:27.973 10:54:26 -- accel/accel.sh@21 -- # val= 00:07:27.973 10:54:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # IFS=: 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # read -r var val 00:07:27.973 10:54:26 -- accel/accel.sh@21 -- # val= 00:07:27.973 10:54:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # IFS=: 00:07:27.973 10:54:26 -- accel/accel.sh@20 -- # read -r var val 00:07:27.973 10:54:26 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:27.973 10:54:26 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:27.973 10:54:26 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:27.973 00:07:27.973 real 0m2.561s 00:07:27.973 user 0m2.325s 00:07:27.973 sys 0m0.247s 00:07:27.973 10:54:26 -- 
common/autotest_common.sh@1115 -- # xtrace_disable
00:07:27.973 10:54:26 -- common/autotest_common.sh@10 -- # set +x
00:07:27.973 ************************************
00:07:27.973 END TEST accel_dif_verify
00:07:27.973 ************************************
00:07:27.973 10:54:26 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate
00:07:27.973 10:54:26 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']'
00:07:27.973 10:54:26 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:07:27.973 10:54:26 -- common/autotest_common.sh@10 -- # set +x
00:07:27.973 ************************************
00:07:27.973 START TEST accel_dif_generate
00:07:27.973 ************************************
00:07:27.973 10:54:26 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate
00:07:27.973 10:54:26 -- accel/accel.sh@16 -- # local accel_opc
00:07:27.973 10:54:26 -- accel/accel.sh@17 -- # local accel_module
00:07:27.973 10:54:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate
00:07:27.973 10:54:26 -- accel/accel.sh@12 -- # build_accel_config
00:07:27.973 10:54:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
00:07:27.973 10:54:26 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:27.973 10:54:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:27.973 10:54:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:27.973 10:54:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:27.973 10:54:26 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:27.973 10:54:26 -- accel/accel.sh@41 -- # local IFS=,
00:07:27.973 10:54:26 -- accel/accel.sh@42 -- # jq -r .
00:07:27.973 [2024-12-16 10:54:26.294500] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:27.973 [2024-12-16 10:54:26.294602] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645652 ]
00:07:27.973 EAL: No free 2048 kB hugepages reported on node 1
00:07:27.973 [2024-12-16 10:54:26.362238] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:27.973 [2024-12-16 10:54:26.397153] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:29.352 10:54:27 -- accel/accel.sh@18 -- # out='
00:07:29.352 SPDK Configuration:
00:07:29.352 Core mask: 0x1
00:07:29.352 
00:07:29.352 Accel Perf Configuration:
00:07:29.352 Workload Type: dif_generate
00:07:29.352 Vector size: 4096 bytes
00:07:29.352 Transfer size: 4096 bytes
00:07:29.352 Block size: 512 bytes
00:07:29.352 Metadata size: 8 bytes
00:07:29.352 Vector count 1
00:07:29.352 Module: software
00:07:29.352 Queue depth: 32
00:07:29.352 Allocate depth: 32
00:07:29.352 # threads/core: 1
00:07:29.352 Run time: 1 seconds
00:07:29.352 Verify: No
00:07:29.352 
00:07:29.352 Running for 1 seconds...
00:07:29.352 
00:07:29.352 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:29.352 ------------------------------------------------------------------------------------
00:07:29.352 0,0 292704/s 1161 MiB/s 0 0
00:07:29.352 ====================================================================================
00:07:29.352 Total 292704/s 1143 MiB/s 0 0'
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=:
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val
00:07:29.352 10:54:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate
00:07:29.352 10:54:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
00:07:29.352 10:54:27 -- accel/accel.sh@12 -- # build_accel_config
00:07:29.352 10:54:27 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:29.352 10:54:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:29.352 10:54:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:29.352 10:54:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:29.352 10:54:27 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:29.352 10:54:27 -- accel/accel.sh@41 -- # local IFS=,
00:07:29.352 10:54:27 -- accel/accel.sh@42 -- # jq -r .
00:07:29.352 [2024-12-16 10:54:27.579112] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:29.352 [2024-12-16 10:54:27.579194] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645852 ]
00:07:29.352 EAL: No free 2048 kB hugepages reported on node 1
00:07:29.352 [2024-12-16 10:54:27.646319] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:29.352 [2024-12-16 10:54:27.679994] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=
00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=:
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val
00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=
00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=:
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val
00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=0x1
00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=:
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val
00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=
00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=:
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val
00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=
00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=:
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val
00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=dif_generate
00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in
00:07:29.352 10:54:27 -- accel/accel.sh@24 -- # accel_opc=dif_generate
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=:
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val
00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val='4096 bytes'
00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=:
00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val= 00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=software 00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.352 10:54:27 -- accel/accel.sh@23 -- # accel_module=software 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=32 00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=32 00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val=1 00:07:29.352 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.352 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.352 10:54:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:29.353 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.353 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.353 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.353 10:54:27 -- accel/accel.sh@21 -- # val=No 00:07:29.353 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.353 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.353 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.353 10:54:27 -- accel/accel.sh@21 -- # val= 00:07:29.353 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.353 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.353 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:29.353 10:54:27 -- accel/accel.sh@21 -- # val= 00:07:29.353 10:54:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.353 10:54:27 -- accel/accel.sh@20 -- # IFS=: 00:07:29.353 10:54:27 -- accel/accel.sh@20 -- # read -r var val 00:07:30.291 10:54:28 -- accel/accel.sh@21 -- # val= 00:07:30.291 10:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # IFS=: 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # read -r var val 00:07:30.291 10:54:28 -- accel/accel.sh@21 -- # val= 00:07:30.291 10:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # IFS=: 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # read -r var val 00:07:30.291 10:54:28 -- accel/accel.sh@21 -- # val= 00:07:30.291 10:54:28 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # IFS=: 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # read -r var val 00:07:30.291 10:54:28 -- accel/accel.sh@21 -- # val= 00:07:30.291 10:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # IFS=: 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # read -r var val 00:07:30.291 10:54:28 -- accel/accel.sh@21 -- # val= 00:07:30.291 10:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # IFS=: 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # read -r var val 00:07:30.291 10:54:28 -- accel/accel.sh@21 -- # val= 00:07:30.291 10:54:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # IFS=: 00:07:30.291 10:54:28 -- accel/accel.sh@20 -- # read -r var val 00:07:30.291 10:54:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:30.291 10:54:28 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:30.291 10:54:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:30.291 00:07:30.291 real 0m2.572s 00:07:30.291 user 0m2.322s 00:07:30.291 sys 0m0.261s 00:07:30.291 10:54:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:30.291 10:54:28 -- common/autotest_common.sh@10 -- # set +x 00:07:30.291 ************************************ 00:07:30.291 END TEST accel_dif_generate 00:07:30.291 ************************************ 00:07:30.291 10:54:28 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:30.291 10:54:28 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:30.291 10:54:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:30.291 10:54:28 -- common/autotest_common.sh@10 -- # set +x 00:07:30.291 ************************************ 00:07:30.291 START TEST accel_dif_generate_copy 00:07:30.291 ************************************ 00:07:30.291 10:54:28 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:07:30.291 10:54:28 -- accel/accel.sh@16 -- # local accel_opc 00:07:30.291 10:54:28 -- accel/accel.sh@17 -- # local accel_module 00:07:30.291 10:54:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:30.291 10:54:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:30.291 10:54:28 -- accel/accel.sh@12 -- # build_accel_config 00:07:30.291 10:54:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:30.291 10:54:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.291 10:54:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.291 10:54:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:30.291 10:54:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:30.291 10:54:28 -- accel/accel.sh@41 -- # local IFS=, 00:07:30.291 10:54:28 -- accel/accel.sh@42 -- # jq -r . 00:07:30.291 [2024-12-16 10:54:28.904193] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:30.291 [2024-12-16 10:54:28.904253] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646037 ] 00:07:30.550 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.551 [2024-12-16 10:54:28.964756] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.551 [2024-12-16 10:54:28.999896] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.931 10:54:30 -- accel/accel.sh@18 -- # out=' 00:07:31.931 SPDK Configuration: 00:07:31.931 Core mask: 0x1 00:07:31.931 00:07:31.931 Accel Perf Configuration: 00:07:31.931 Workload Type: dif_generate_copy 00:07:31.931 Vector size: 4096 bytes 00:07:31.931 Transfer size: 4096 bytes 00:07:31.931 Vector count 1 00:07:31.931 Module: software 00:07:31.931 Queue depth: 32 00:07:31.931 Allocate depth: 32 00:07:31.931 # threads/core: 1 00:07:31.931 Run time: 1 seconds 00:07:31.931 Verify: No 00:07:31.931 00:07:31.931 Running for 1 seconds... 00:07:31.931 00:07:31.931 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:31.931 ------------------------------------------------------------------------------------ 00:07:31.931 0,0 227264/s 901 MiB/s 0 0 00:07:31.931 ==================================================================================== 00:07:31.931 Total 227264/s 887 MiB/s 0 0' 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:31.931 10:54:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:31.931 10:54:30 -- accel/accel.sh@12 -- # build_accel_config 00:07:31.931 10:54:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:31.931 10:54:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.931 10:54:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.931 10:54:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:31.931 10:54:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:31.931 10:54:30 -- accel/accel.sh@41 -- # local IFS=, 00:07:31.931 10:54:30 -- accel/accel.sh@42 -- # jq -r . 00:07:31.931 [2024-12-16 10:54:30.180431] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:31.931 [2024-12-16 10:54:30.180518] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646229 ] 00:07:31.931 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.931 [2024-12-16 10:54:30.247793] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.931 [2024-12-16 10:54:30.281967] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val= 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val= 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val=0x1 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val= 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val= 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val= 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val=software 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@23 -- # accel_module=software 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val=32 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val=32 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var 
val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val=1 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val=No 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val= 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:31.931 10:54:30 -- accel/accel.sh@21 -- # val= 00:07:31.931 10:54:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # IFS=: 00:07:31.931 10:54:30 -- accel/accel.sh@20 -- # read -r var val 00:07:32.870 10:54:31 -- accel/accel.sh@21 -- # val= 00:07:32.870 10:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # IFS=: 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # read -r var val 00:07:32.870 10:54:31 -- accel/accel.sh@21 -- # val= 00:07:32.870 10:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # IFS=: 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # read -r var val 00:07:32.870 10:54:31 -- accel/accel.sh@21 -- # val= 00:07:32.870 10:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # IFS=: 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # read -r var val 00:07:32.870 10:54:31 -- accel/accel.sh@21 -- # val= 00:07:32.870 10:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # IFS=: 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # read -r var val 00:07:32.870 10:54:31 -- accel/accel.sh@21 -- # val= 00:07:32.870 10:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # IFS=: 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # read -r var val 00:07:32.870 10:54:31 -- accel/accel.sh@21 -- # val= 00:07:32.870 10:54:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # IFS=: 00:07:32.870 10:54:31 -- accel/accel.sh@20 -- # read -r var val 00:07:32.870 10:54:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:32.870 10:54:31 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:32.870 10:54:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:32.870 00:07:32.870 real 0m2.554s 00:07:32.870 user 0m2.318s 00:07:32.870 sys 0m0.244s 00:07:32.871 10:54:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:32.871 10:54:31 -- common/autotest_common.sh@10 -- # set +x 00:07:32.871 ************************************ 00:07:32.871 END TEST accel_dif_generate_copy 00:07:32.871 ************************************ 00:07:32.871 10:54:31 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:32.871 10:54:31 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:32.871 10:54:31 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:32.871 10:54:31 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:32.871 10:54:31 -- common/autotest_common.sh@10 -- # set +x 00:07:32.871 ************************************ 00:07:32.871 START TEST accel_comp 00:07:32.871 ************************************ 00:07:32.871 10:54:31 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:32.871 10:54:31 -- accel/accel.sh@16 -- # local accel_opc 00:07:33.131 10:54:31 -- accel/accel.sh@17 -- # local accel_module 00:07:33.131 10:54:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.131 10:54:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.131 10:54:31 -- accel/accel.sh@12 -- # build_accel_config 00:07:33.131 10:54:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:33.131 10:54:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.131 10:54:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.131 10:54:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:33.131 10:54:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:33.131 10:54:31 -- accel/accel.sh@41 -- # local IFS=, 00:07:33.131 10:54:31 -- accel/accel.sh@42 -- # jq -r . 00:07:33.131 [2024-12-16 10:54:31.514162] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:33.131 [2024-12-16 10:54:31.514252] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646512 ] 00:07:33.131 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.131 [2024-12-16 10:54:31.581837] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.131 [2024-12-16 10:54:31.616498] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.513 10:54:32 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:34.513 00:07:34.513 SPDK Configuration: 00:07:34.513 Core mask: 0x1 00:07:34.513 00:07:34.513 Accel Perf Configuration: 00:07:34.513 Workload Type: compress 00:07:34.513 Transfer size: 4096 bytes 00:07:34.513 Vector count 1 00:07:34.513 Module: software 00:07:34.513 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:34.513 Queue depth: 32 00:07:34.513 Allocate depth: 32 00:07:34.513 # threads/core: 1 00:07:34.513 Run time: 1 seconds 00:07:34.513 Verify: No 00:07:34.513 00:07:34.513 Running for 1 seconds... 
00:07:34.513 00:07:34.513 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:34.513 ------------------------------------------------------------------------------------ 00:07:34.513 0,0 67072/s 279 MiB/s 0 0 00:07:34.513 ==================================================================================== 00:07:34.513 Total 67072/s 262 MiB/s 0 0' 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:34.513 10:54:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:34.513 10:54:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:34.513 10:54:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:34.513 10:54:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.513 10:54:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.513 10:54:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:34.513 10:54:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:34.513 10:54:32 -- accel/accel.sh@41 -- # local IFS=, 00:07:34.513 10:54:32 -- accel/accel.sh@42 -- # jq -r . 00:07:34.513 [2024-12-16 10:54:32.797850] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:34.513 [2024-12-16 10:54:32.797938] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646778 ] 00:07:34.513 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.513 [2024-12-16 10:54:32.865210] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.513 [2024-12-16 10:54:32.898783] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val= 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val= 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val= 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val=0x1 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val= 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val= 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val=compress 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 
10:54:32 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val= 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val=software 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@23 -- # accel_module=software 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val=32 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val=32 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val=1 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val=No 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val= 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:34.513 10:54:32 -- accel/accel.sh@21 -- # val= 00:07:34.513 10:54:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # IFS=: 00:07:34.513 10:54:32 -- accel/accel.sh@20 -- # read -r var val 00:07:35.453 10:54:34 -- accel/accel.sh@21 -- # val= 00:07:35.453 10:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # IFS=: 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # read -r var val 00:07:35.453 10:54:34 -- accel/accel.sh@21 -- # val= 00:07:35.453 10:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # IFS=: 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # read -r var val 00:07:35.453 10:54:34 -- accel/accel.sh@21 -- # val= 00:07:35.453 10:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # 
IFS=: 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # read -r var val 00:07:35.453 10:54:34 -- accel/accel.sh@21 -- # val= 00:07:35.453 10:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # IFS=: 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # read -r var val 00:07:35.453 10:54:34 -- accel/accel.sh@21 -- # val= 00:07:35.453 10:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # IFS=: 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # read -r var val 00:07:35.453 10:54:34 -- accel/accel.sh@21 -- # val= 00:07:35.453 10:54:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # IFS=: 00:07:35.453 10:54:34 -- accel/accel.sh@20 -- # read -r var val 00:07:35.453 10:54:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:35.453 10:54:34 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:35.453 10:54:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:35.453 00:07:35.453 real 0m2.572s 00:07:35.453 user 0m2.325s 00:07:35.453 sys 0m0.256s 00:07:35.453 10:54:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:35.453 10:54:34 -- common/autotest_common.sh@10 -- # set +x 00:07:35.453 ************************************ 00:07:35.453 END TEST accel_comp 00:07:35.453 ************************************ 00:07:35.713 10:54:34 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.713 10:54:34 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:35.713 10:54:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:35.713 10:54:34 -- common/autotest_common.sh@10 -- # set +x 00:07:35.713 ************************************ 00:07:35.713 START TEST accel_decomp 00:07:35.714 ************************************ 00:07:35.714 10:54:34 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.714 10:54:34 -- accel/accel.sh@16 -- # local accel_opc 00:07:35.714 10:54:34 -- accel/accel.sh@17 -- # local accel_module 00:07:35.714 10:54:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.714 10:54:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.714 10:54:34 -- accel/accel.sh@12 -- # build_accel_config 00:07:35.714 10:54:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:35.714 10:54:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:35.714 10:54:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:35.714 10:54:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:35.714 10:54:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:35.714 10:54:34 -- accel/accel.sh@41 -- # local IFS=, 00:07:35.714 10:54:34 -- accel/accel.sh@42 -- # jq -r . 00:07:35.714 [2024-12-16 10:54:34.132059] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:35.714 [2024-12-16 10:54:34.132144] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647065 ] 00:07:35.714 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.714 [2024-12-16 10:54:34.199295] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.714 [2024-12-16 10:54:34.234256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.093 10:54:35 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:37.093 00:07:37.093 SPDK Configuration: 00:07:37.093 Core mask: 0x1 00:07:37.093 00:07:37.093 Accel Perf Configuration: 00:07:37.093 Workload Type: decompress 00:07:37.093 Transfer size: 4096 bytes 00:07:37.093 Vector count 1 00:07:37.093 Module: software 00:07:37.093 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:37.093 Queue depth: 32 00:07:37.093 Allocate depth: 32 00:07:37.093 # threads/core: 1 00:07:37.093 Run time: 1 seconds 00:07:37.093 Verify: Yes 00:07:37.093 00:07:37.093 Running for 1 seconds... 00:07:37.093 00:07:37.093 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:37.093 ------------------------------------------------------------------------------------ 00:07:37.093 0,0 93280/s 171 MiB/s 0 0 00:07:37.093 ==================================================================================== 00:07:37.093 Total 93280/s 364 MiB/s 0 0' 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:37.093 10:54:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:37.093 10:54:35 -- accel/accel.sh@12 -- # build_accel_config 00:07:37.093 10:54:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:37.093 10:54:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:37.093 10:54:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:37.093 10:54:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:37.093 10:54:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:37.093 10:54:35 -- accel/accel.sh@41 -- # local IFS=, 00:07:37.093 10:54:35 -- accel/accel.sh@42 -- # jq -r . 00:07:37.093 [2024-12-16 10:54:35.416623] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:37.093 [2024-12-16 10:54:35.416711] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647333 ] 00:07:37.093 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.093 [2024-12-16 10:54:35.483935] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.093 [2024-12-16 10:54:35.518138] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val= 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val= 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val= 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val=0x1 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val= 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val= 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val=decompress 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.093 10:54:35 -- accel/accel.sh@21 -- # val= 00:07:37.093 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.093 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val=software 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@23 -- # accel_module=software 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val=32 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 
-- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val=32 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val=1 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val=Yes 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val= 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:37.094 10:54:35 -- accel/accel.sh@21 -- # val= 00:07:37.094 10:54:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # IFS=: 00:07:37.094 10:54:35 -- accel/accel.sh@20 -- # read -r var val 00:07:38.475 10:54:36 -- accel/accel.sh@21 -- # val= 00:07:38.475 10:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # IFS=: 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # read -r var val 00:07:38.475 10:54:36 -- accel/accel.sh@21 -- # val= 00:07:38.475 10:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # IFS=: 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # read -r var val 00:07:38.475 10:54:36 -- accel/accel.sh@21 -- # val= 00:07:38.475 10:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # IFS=: 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # read -r var val 00:07:38.475 10:54:36 -- accel/accel.sh@21 -- # val= 00:07:38.475 10:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # IFS=: 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # read -r var val 00:07:38.475 10:54:36 -- accel/accel.sh@21 -- # val= 00:07:38.475 10:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # IFS=: 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # read -r var val 00:07:38.475 10:54:36 -- accel/accel.sh@21 -- # val= 00:07:38.475 10:54:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # IFS=: 00:07:38.475 10:54:36 -- accel/accel.sh@20 -- # read -r var val 00:07:38.475 10:54:36 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:38.475 10:54:36 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:38.475 10:54:36 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:38.475 00:07:38.475 real 0m2.575s 00:07:38.475 user 0m2.336s 00:07:38.475 sys 0m0.248s 00:07:38.475 10:54:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:38.475 10:54:36 -- common/autotest_common.sh@10 -- # set +x 00:07:38.475 ************************************ 00:07:38.475 END TEST accel_decomp 00:07:38.475 ************************************ 00:07:38.475 10:54:36 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:38.475 10:54:36 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:38.475 10:54:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:38.475 10:54:36 -- common/autotest_common.sh@10 -- # set +x 00:07:38.475 ************************************ 00:07:38.475 START TEST accel_decmop_full 00:07:38.475 ************************************ 00:07:38.475 10:54:36 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:38.475 10:54:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:38.475 10:54:36 -- accel/accel.sh@17 -- # local accel_module 00:07:38.475 10:54:36 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:38.475 10:54:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:38.475 10:54:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:38.475 10:54:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:38.475 10:54:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.475 10:54:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.475 10:54:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:38.475 10:54:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:38.475 10:54:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:38.475 10:54:36 -- accel/accel.sh@42 -- # jq -r . 00:07:38.475 [2024-12-16 10:54:36.751219] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:38.475 [2024-12-16 10:54:36.751309] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647526 ] 00:07:38.475 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.475 [2024-12-16 10:54:36.819299] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.475 [2024-12-16 10:54:36.854264] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.415 10:54:38 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:39.415 00:07:39.415 SPDK Configuration: 00:07:39.415 Core mask: 0x1 00:07:39.415 00:07:39.415 Accel Perf Configuration: 00:07:39.415 Workload Type: decompress 00:07:39.415 Transfer size: 111250 bytes 00:07:39.415 Vector count 1 00:07:39.415 Module: software 00:07:39.415 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:39.415 Queue depth: 32 00:07:39.415 Allocate depth: 32 00:07:39.415 # threads/core: 1 00:07:39.415 Run time: 1 seconds 00:07:39.415 Verify: Yes 00:07:39.415 00:07:39.415 Running for 1 seconds... 
00:07:39.415 00:07:39.415 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:39.415 ------------------------------------------------------------------------------------ 00:07:39.415 0,0 5888/s 243 MiB/s 0 0 00:07:39.415 ==================================================================================== 00:07:39.415 Total 5888/s 624 MiB/s 0 0' 00:07:39.415 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.415 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.415 10:54:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:39.415 10:54:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:39.415 10:54:38 -- accel/accel.sh@12 -- # build_accel_config 00:07:39.415 10:54:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:39.415 10:54:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.415 10:54:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.415 10:54:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:39.415 10:54:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:39.415 10:54:38 -- accel/accel.sh@41 -- # local IFS=, 00:07:39.415 10:54:38 -- accel/accel.sh@42 -- # jq -r . 00:07:39.675 [2024-12-16 10:54:38.045409] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:39.675 [2024-12-16 10:54:38.045495] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647671 ] 00:07:39.675 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.675 [2024-12-16 10:54:38.112884] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.675 [2024-12-16 10:54:38.146979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.675 10:54:38 -- accel/accel.sh@21 -- # val= 00:07:39.675 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.675 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.675 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.675 10:54:38 -- accel/accel.sh@21 -- # val= 00:07:39.675 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.675 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.675 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.675 10:54:38 -- accel/accel.sh@21 -- # val= 00:07:39.675 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val=0x1 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val= 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val= 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val=decompress 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val= 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val=software 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@23 -- # accel_module=software 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val=32 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val=32 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val=1 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val=Yes 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val= 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:39.676 10:54:38 -- accel/accel.sh@21 -- # val= 00:07:39.676 10:54:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # IFS=: 00:07:39.676 10:54:38 -- accel/accel.sh@20 -- # read -r var val 00:07:41.057 10:54:39 -- accel/accel.sh@21 -- # val= 00:07:41.057 10:54:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # IFS=: 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # read -r var val 00:07:41.057 10:54:39 -- accel/accel.sh@21 -- # val= 00:07:41.057 10:54:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # IFS=: 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # read -r var val 00:07:41.057 10:54:39 -- accel/accel.sh@21 -- # val= 00:07:41.057 10:54:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.057 10:54:39 
-- accel/accel.sh@20 -- # IFS=: 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # read -r var val 00:07:41.057 10:54:39 -- accel/accel.sh@21 -- # val= 00:07:41.057 10:54:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # IFS=: 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # read -r var val 00:07:41.057 10:54:39 -- accel/accel.sh@21 -- # val= 00:07:41.057 10:54:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # IFS=: 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # read -r var val 00:07:41.057 10:54:39 -- accel/accel.sh@21 -- # val= 00:07:41.057 10:54:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # IFS=: 00:07:41.057 10:54:39 -- accel/accel.sh@20 -- # read -r var val 00:07:41.057 10:54:39 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:41.057 10:54:39 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:41.057 10:54:39 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:41.057 00:07:41.057 real 0m2.591s 00:07:41.057 user 0m2.338s 00:07:41.057 sys 0m0.260s 00:07:41.057 10:54:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:41.057 10:54:39 -- common/autotest_common.sh@10 -- # set +x 00:07:41.057 ************************************ 00:07:41.057 END TEST accel_decmop_full 00:07:41.057 ************************************ 00:07:41.057 10:54:39 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:41.057 10:54:39 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:41.057 10:54:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:41.057 10:54:39 -- common/autotest_common.sh@10 -- # set +x 00:07:41.057 ************************************ 00:07:41.057 START TEST accel_decomp_mcore 00:07:41.057 ************************************ 00:07:41.057 10:54:39 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:41.057 10:54:39 -- accel/accel.sh@16 -- # local accel_opc 00:07:41.057 10:54:39 -- accel/accel.sh@17 -- # local accel_module 00:07:41.057 10:54:39 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:41.057 10:54:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:41.057 10:54:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:41.057 10:54:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:41.057 10:54:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.057 10:54:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.057 10:54:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:41.057 10:54:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:41.057 10:54:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:41.057 10:54:39 -- accel/accel.sh@42 -- # jq -r . 00:07:41.057 [2024-12-16 10:54:39.381409] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:41.057 [2024-12-16 10:54:39.381473] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647928 ] 00:07:41.057 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.057 [2024-12-16 10:54:39.439820] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:41.057 [2024-12-16 10:54:39.477268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:41.057 [2024-12-16 10:54:39.477363] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:41.057 [2024-12-16 10:54:39.477451] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:41.057 [2024-12-16 10:54:39.477453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.523 10:54:40 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:42.523 00:07:42.523 SPDK Configuration: 00:07:42.523 Core mask: 0xf 00:07:42.523 00:07:42.523 Accel Perf Configuration: 00:07:42.523 Workload Type: decompress 00:07:42.523 Transfer size: 4096 bytes 00:07:42.523 Vector count 1 00:07:42.523 Module: software 00:07:42.523 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:42.523 Queue depth: 32 00:07:42.523 Allocate depth: 32 00:07:42.523 # threads/core: 1 00:07:42.523 Run time: 1 seconds 00:07:42.523 Verify: Yes 00:07:42.523 00:07:42.523 Running for 1 seconds... 00:07:42.523 00:07:42.523 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:42.523 ------------------------------------------------------------------------------------ 00:07:42.523 0,0 73408/s 135 MiB/s 0 0 00:07:42.523 3,0 77312/s 142 MiB/s 0 0 00:07:42.523 2,0 77056/s 141 MiB/s 0 0 00:07:42.523 1,0 77120/s 142 MiB/s 0 0 00:07:42.523 ==================================================================================== 00:07:42.523 Total 304896/s 1191 MiB/s 0 0' 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:42.523 10:54:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:42.523 10:54:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:42.523 10:54:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:42.523 10:54:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.523 10:54:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.523 10:54:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:42.523 10:54:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:42.523 10:54:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:42.523 10:54:40 -- accel/accel.sh@42 -- # jq -r . 00:07:42.523 [2024-12-16 10:54:40.670779] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:42.523 [2024-12-16 10:54:40.670864] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid648201 ] 00:07:42.523 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.523 [2024-12-16 10:54:40.740075] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:42.523 [2024-12-16 10:54:40.777281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:42.523 [2024-12-16 10:54:40.777380] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:42.523 [2024-12-16 10:54:40.777442] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:42.523 [2024-12-16 10:54:40.777443] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val= 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val= 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val= 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val=0xf 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val= 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val= 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val=decompress 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val= 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val=software 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@23 -- # accel_module=software 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val=32 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val=32 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val=1 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val=Yes 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val= 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:42.523 10:54:40 -- accel/accel.sh@21 -- # val= 00:07:42.523 10:54:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # IFS=: 00:07:42.523 10:54:40 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 
10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@21 -- # val= 00:07:43.461 10:54:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # IFS=: 00:07:43.461 10:54:41 -- accel/accel.sh@20 -- # read -r var val 00:07:43.461 10:54:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:43.461 10:54:41 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:43.461 10:54:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:43.461 00:07:43.461 real 0m2.585s 00:07:43.461 user 0m8.997s 00:07:43.461 sys 0m0.244s 00:07:43.461 10:54:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:43.461 10:54:41 -- common/autotest_common.sh@10 -- # set +x 00:07:43.461 ************************************ 00:07:43.461 END TEST accel_decomp_mcore 00:07:43.461 ************************************ 00:07:43.461 10:54:41 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:43.461 10:54:41 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:43.461 10:54:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:43.461 10:54:41 -- common/autotest_common.sh@10 -- # set +x 00:07:43.461 ************************************ 00:07:43.461 START TEST accel_decomp_full_mcore 00:07:43.461 ************************************ 00:07:43.461 10:54:42 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:43.461 10:54:42 -- accel/accel.sh@16 -- # local accel_opc 00:07:43.461 10:54:42 -- accel/accel.sh@17 -- # local accel_module 00:07:43.461 10:54:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:43.461 10:54:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:43.461 10:54:42 -- accel/accel.sh@12 -- # build_accel_config 00:07:43.461 10:54:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:43.461 10:54:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.461 10:54:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.461 10:54:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:43.461 10:54:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:43.461 10:54:42 -- accel/accel.sh@41 -- # local IFS=, 00:07:43.461 10:54:42 -- accel/accel.sh@42 -- # jq -r . 00:07:43.461 [2024-12-16 10:54:42.025469] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
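One detail worth noting in the accel_decomp_mcore summary above: user time exceeds real time (user 0m8.997s vs real 0m2.585s), which can only happen when work runs in parallel. The ratio 8.997 / 2.585 ≈ 3.5 is consistent with the -m 0xf mask keeping close to four reactors busy for most of the run, with the shortfall from a full 4.0 accounted for by single-threaded setup and teardown.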
00:07:43.461 [2024-12-16 10:54:42.025552] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid648488 ] 00:07:43.461 EAL: No free 2048 kB hugepages reported on node 1 00:07:43.720 [2024-12-16 10:54:42.093997] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:43.720 [2024-12-16 10:54:42.131642] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:43.720 [2024-12-16 10:54:42.131738] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:43.720 [2024-12-16 10:54:42.131825] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:43.720 [2024-12-16 10:54:42.131826] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.100 10:54:43 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:45.100 00:07:45.100 SPDK Configuration: 00:07:45.100 Core mask: 0xf 00:07:45.100 00:07:45.100 Accel Perf Configuration: 00:07:45.100 Workload Type: decompress 00:07:45.100 Transfer size: 111250 bytes 00:07:45.100 Vector count 1 00:07:45.100 Module: software 00:07:45.100 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:45.100 Queue depth: 32 00:07:45.100 Allocate depth: 32 00:07:45.100 # threads/core: 1 00:07:45.100 Run time: 1 seconds 00:07:45.100 Verify: Yes 00:07:45.100 00:07:45.100 Running for 1 seconds... 00:07:45.100 00:07:45.100 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:45.100 ------------------------------------------------------------------------------------ 00:07:45.100 0,0 5408/s 223 MiB/s 0 0 00:07:45.100 3,0 5760/s 237 MiB/s 0 0 00:07:45.100 2,0 5760/s 237 MiB/s 0 0 00:07:45.100 1,0 5760/s 237 MiB/s 0 0 00:07:45.100 ==================================================================================== 00:07:45.100 Total 22688/s 2407 MiB/s 0 0' 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:45.100 10:54:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:45.100 10:54:43 -- accel/accel.sh@12 -- # build_accel_config 00:07:45.100 10:54:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:45.100 10:54:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.100 10:54:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.100 10:54:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:45.100 10:54:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:45.100 10:54:43 -- accel/accel.sh@41 -- # local IFS=, 00:07:45.100 10:54:43 -- accel/accel.sh@42 -- # jq -r . 00:07:45.100 [2024-12-16 10:54:43.332099] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
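Relative to the plain mcore test, the only change in this accel_decomp_full_mcore invocation is the added -o 0 option, which the harness appears to use to request full-size transfers: the configuration dump above reports a 111250-byte transfer size instead of 4096 (the exact semantics of -o are not visible in this log). Per-core transfer rates accordingly drop from roughly 77k/s to 5.7k/s while per-core bandwidth rises from about 142 MiB/s to 237 MiB/s, as expected when each operation carries a much larger buffer. A hedged reproduction:

  # Same command as before plus -o 0; the 111250-byte transfer size is taken
  # from the configuration dump above, not from accel_perf documentation.
  ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -m 0xf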
00:07:45.100 [2024-12-16 10:54:43.332185] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid648760 ] 00:07:45.100 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.100 [2024-12-16 10:54:43.401633] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:45.100 [2024-12-16 10:54:43.438396] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:45.100 [2024-12-16 10:54:43.438492] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:45.100 [2024-12-16 10:54:43.438582] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:45.100 [2024-12-16 10:54:43.438583] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val= 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val= 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val= 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val=0xf 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val= 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val= 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val=decompress 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val= 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val=software 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@23 -- # accel_module=software 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.100 10:54:43 -- accel/accel.sh@21 -- # val=32 00:07:45.100 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.100 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.101 10:54:43 -- accel/accel.sh@21 -- # val=32 00:07:45.101 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.101 10:54:43 -- accel/accel.sh@21 -- # val=1 00:07:45.101 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.101 10:54:43 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:45.101 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.101 10:54:43 -- accel/accel.sh@21 -- # val=Yes 00:07:45.101 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.101 10:54:43 -- accel/accel.sh@21 -- # val= 00:07:45.101 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:45.101 10:54:43 -- accel/accel.sh@21 -- # val= 00:07:45.101 10:54:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # IFS=: 00:07:45.101 10:54:43 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 
10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@21 -- # val= 00:07:46.038 10:54:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # IFS=: 00:07:46.038 10:54:44 -- accel/accel.sh@20 -- # read -r var val 00:07:46.038 10:54:44 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:46.038 10:54:44 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:46.038 10:54:44 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:46.038 00:07:46.038 real 0m2.624s 00:07:46.038 user 0m9.058s 00:07:46.038 sys 0m0.279s 00:07:46.038 10:54:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:46.038 10:54:44 -- common/autotest_common.sh@10 -- # set +x 00:07:46.038 ************************************ 00:07:46.038 END TEST accel_decomp_full_mcore 00:07:46.038 ************************************ 00:07:46.298 10:54:44 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:46.298 10:54:44 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:46.298 10:54:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:46.298 10:54:44 -- common/autotest_common.sh@10 -- # set +x 00:07:46.298 ************************************ 00:07:46.298 START TEST accel_decomp_mthread 00:07:46.298 ************************************ 00:07:46.298 10:54:44 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:46.298 10:54:44 -- accel/accel.sh@16 -- # local accel_opc 00:07:46.298 10:54:44 -- accel/accel.sh@17 -- # local accel_module 00:07:46.298 10:54:44 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:46.298 10:54:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:46.298 10:54:44 -- accel/accel.sh@12 -- # build_accel_config 00:07:46.298 10:54:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:46.298 10:54:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.298 10:54:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.298 10:54:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:46.298 10:54:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:46.298 10:54:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:46.298 10:54:44 -- accel/accel.sh@42 -- # jq -r . 00:07:46.298 [2024-12-16 10:54:44.697725] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:46.299 [2024-12-16 10:54:44.697811] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid649053 ] 00:07:46.299 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.299 [2024-12-16 10:54:44.765898] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.299 [2024-12-16 10:54:44.800829] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.678 10:54:45 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:47.678 00:07:47.678 SPDK Configuration: 00:07:47.678 Core mask: 0x1 00:07:47.678 00:07:47.678 Accel Perf Configuration: 00:07:47.678 Workload Type: decompress 00:07:47.678 Transfer size: 4096 bytes 00:07:47.678 Vector count 1 00:07:47.678 Module: software 00:07:47.678 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:47.678 Queue depth: 32 00:07:47.678 Allocate depth: 32 00:07:47.678 # threads/core: 2 00:07:47.678 Run time: 1 seconds 00:07:47.678 Verify: Yes 00:07:47.678 00:07:47.678 Running for 1 seconds... 00:07:47.678 00:07:47.678 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:47.678 ------------------------------------------------------------------------------------ 00:07:47.678 0,1 47744/s 87 MiB/s 0 0 00:07:47.678 0,0 47584/s 87 MiB/s 0 0 00:07:47.678 ==================================================================================== 00:07:47.678 Total 95328/s 372 MiB/s 0 0' 00:07:47.678 10:54:45 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:45 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:47.678 10:54:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:47.678 10:54:45 -- accel/accel.sh@12 -- # build_accel_config 00:07:47.678 10:54:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:47.678 10:54:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.678 10:54:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.678 10:54:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:47.678 10:54:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:47.678 10:54:45 -- accel/accel.sh@41 -- # local IFS=, 00:07:47.678 10:54:45 -- accel/accel.sh@42 -- # jq -r . 00:07:47.678 [2024-12-16 10:54:45.987551] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
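This accel_decomp_mthread run swaps the core-mask experiment for a thread-count one: -T 2 puts two worker threads on the single core selected by the default 0x1 mask, which is why the configuration dump shows "# threads/core: 2" and the results table carries two Core,Thread rows, 0,0 at 47584/s and 0,1 at 47744/s, summing to the reported 95328/s total. A hedged reproduction:

  # -T sets threads per core ("# threads/core: 2" in the dump); the default
  # single-core mask is left in place.
  ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -T 2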
00:07:47.678 [2024-12-16 10:54:45.987637] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid649197 ] 00:07:47.678 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.678 [2024-12-16 10:54:46.055168] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.678 [2024-12-16 10:54:46.089518] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val= 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val= 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val= 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val=0x1 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val= 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val= 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val=decompress 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val= 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val=software 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@23 -- # accel_module=software 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val=32 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 
-- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val=32 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val=2 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val=Yes 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val= 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:47.678 10:54:46 -- accel/accel.sh@21 -- # val= 00:07:47.678 10:54:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # IFS=: 00:07:47.678 10:54:46 -- accel/accel.sh@20 -- # read -r var val 00:07:49.058 10:54:47 -- accel/accel.sh@21 -- # val= 00:07:49.058 10:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # IFS=: 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # read -r var val 00:07:49.058 10:54:47 -- accel/accel.sh@21 -- # val= 00:07:49.058 10:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # IFS=: 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # read -r var val 00:07:49.058 10:54:47 -- accel/accel.sh@21 -- # val= 00:07:49.058 10:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # IFS=: 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # read -r var val 00:07:49.058 10:54:47 -- accel/accel.sh@21 -- # val= 00:07:49.058 10:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # IFS=: 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # read -r var val 00:07:49.058 10:54:47 -- accel/accel.sh@21 -- # val= 00:07:49.058 10:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # IFS=: 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # read -r var val 00:07:49.058 10:54:47 -- accel/accel.sh@21 -- # val= 00:07:49.058 10:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # IFS=: 00:07:49.058 10:54:47 -- accel/accel.sh@20 -- # read -r var val 00:07:49.058 10:54:47 -- accel/accel.sh@21 -- # val= 00:07:49.059 10:54:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:49.059 10:54:47 -- accel/accel.sh@20 -- # IFS=: 00:07:49.059 10:54:47 -- accel/accel.sh@20 -- # read -r var val 00:07:49.059 10:54:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:49.059 10:54:47 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:49.059 10:54:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.059 00:07:49.059 real 0m2.586s 00:07:49.059 user 0m2.345s 00:07:49.059 sys 0m0.252s 00:07:49.059 10:54:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:49.059 10:54:47 -- common/autotest_common.sh@10 -- # set +x 
00:07:49.059 ************************************ 00:07:49.059 END TEST accel_decomp_mthread 00:07:49.059 ************************************ 00:07:49.059 10:54:47 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:49.059 10:54:47 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:49.059 10:54:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:49.059 10:54:47 -- common/autotest_common.sh@10 -- # set +x 00:07:49.059 ************************************ 00:07:49.059 START TEST accel_deomp_full_mthread 00:07:49.059 ************************************ 00:07:49.059 10:54:47 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:49.059 10:54:47 -- accel/accel.sh@16 -- # local accel_opc 00:07:49.059 10:54:47 -- accel/accel.sh@17 -- # local accel_module 00:07:49.059 10:54:47 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:49.059 10:54:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:49.059 10:54:47 -- accel/accel.sh@12 -- # build_accel_config 00:07:49.059 10:54:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:49.059 10:54:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.059 10:54:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.059 10:54:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:49.059 10:54:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:49.059 10:54:47 -- accel/accel.sh@41 -- # local IFS=, 00:07:49.059 10:54:47 -- accel/accel.sh@42 -- # jq -r . 00:07:49.059 [2024-12-16 10:54:47.315619] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:49.059 [2024-12-16 10:54:47.315670] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid649375 ] 00:07:49.059 EAL: No free 2048 kB hugepages reported on node 1 00:07:49.059 [2024-12-16 10:54:47.377641] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.059 [2024-12-16 10:54:47.412440] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.998 10:54:48 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:49.998 00:07:49.998 SPDK Configuration: 00:07:49.998 Core mask: 0x1 00:07:49.998 00:07:49.998 Accel Perf Configuration: 00:07:49.998 Workload Type: decompress 00:07:49.998 Transfer size: 111250 bytes 00:07:49.998 Vector count 1 00:07:49.998 Module: software 00:07:49.998 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:49.998 Queue depth: 32 00:07:49.998 Allocate depth: 32 00:07:49.998 # threads/core: 2 00:07:49.998 Run time: 1 seconds 00:07:49.998 Verify: Yes 00:07:49.998 00:07:49.998 Running for 1 seconds... 
00:07:49.998 00:07:49.998 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:49.998 ------------------------------------------------------------------------------------ 00:07:49.998 0,1 2976/s 122 MiB/s 0 0 00:07:49.998 0,0 2944/s 121 MiB/s 0 0 00:07:49.998 ==================================================================================== 00:07:49.998 Total 5920/s 628 MiB/s 0 0' 00:07:49.998 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:49.998 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:49.998 10:54:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:49.998 10:54:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:49.998 10:54:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:49.998 10:54:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:49.998 10:54:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.998 10:54:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.998 10:54:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:49.998 10:54:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:49.998 10:54:48 -- accel/accel.sh@41 -- # local IFS=, 00:07:49.998 10:54:48 -- accel/accel.sh@42 -- # jq -r . 00:07:49.998 [2024-12-16 10:54:48.614143] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:49.998 [2024-12-16 10:54:48.614245] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid649628 ] 00:07:50.258 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.258 [2024-12-16 10:54:48.680446] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.258 [2024-12-16 10:54:48.714378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val= 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val= 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val= 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val=0x1 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val= 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val= 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val=decompress 
00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val= 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val=software 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@23 -- # accel_module=software 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val=32 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val=32 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val=2 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val=Yes 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val= 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:50.258 10:54:48 -- accel/accel.sh@21 -- # val= 00:07:50.258 10:54:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # IFS=: 00:07:50.258 10:54:48 -- accel/accel.sh@20 -- # read -r var val 00:07:51.640 10:54:49 -- accel/accel.sh@21 -- # val= 00:07:51.640 10:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # IFS=: 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # read -r var val 00:07:51.640 10:54:49 -- accel/accel.sh@21 -- # val= 00:07:51.640 10:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # IFS=: 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # read -r var val 00:07:51.640 10:54:49 -- accel/accel.sh@21 -- # val= 00:07:51.640 10:54:49 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # IFS=: 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # read -r var val 00:07:51.640 10:54:49 -- accel/accel.sh@21 -- # val= 00:07:51.640 10:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # IFS=: 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # read -r var val 00:07:51.640 10:54:49 -- accel/accel.sh@21 -- # val= 00:07:51.640 10:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # IFS=: 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # read -r var val 00:07:51.640 10:54:49 -- accel/accel.sh@21 -- # val= 00:07:51.640 10:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # IFS=: 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # read -r var val 00:07:51.640 10:54:49 -- accel/accel.sh@21 -- # val= 00:07:51.640 10:54:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # IFS=: 00:07:51.640 10:54:49 -- accel/accel.sh@20 -- # read -r var val 00:07:51.640 10:54:49 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:51.640 10:54:49 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:51.640 10:54:49 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:51.640 00:07:51.640 real 0m2.597s 00:07:51.640 user 0m2.365s 00:07:51.640 sys 0m0.237s 00:07:51.640 10:54:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:51.640 10:54:49 -- common/autotest_common.sh@10 -- # set +x 00:07:51.640 ************************************ 00:07:51.640 END TEST accel_deomp_full_mthread 00:07:51.640 ************************************ 00:07:51.640 10:54:49 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:51.640 10:54:49 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:51.640 10:54:49 -- accel/accel.sh@129 -- # build_accel_config 00:07:51.640 10:54:49 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:51.640 10:54:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:51.640 10:54:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:51.640 10:54:49 -- common/autotest_common.sh@10 -- # set +x 00:07:51.640 10:54:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.640 10:54:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.640 10:54:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:51.640 10:54:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:51.640 10:54:49 -- accel/accel.sh@41 -- # local IFS=, 00:07:51.640 10:54:49 -- accel/accel.sh@42 -- # jq -r . 00:07:51.640 ************************************ 00:07:51.640 START TEST accel_dif_functional_tests 00:07:51.640 ************************************ 00:07:51.640 10:54:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:51.640 [2024-12-16 10:54:49.957929] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
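The suite starting here, accel_dif_functional_tests, is a CUnit binary rather than an accel_perf run: it exercises DIF (Data Integrity Field) generate/verify paths and confirms that deliberately corrupted Guard, App Tag, and Ref Tag fields are detected, so the *ERROR* lines below are expected output from negative test cases, each followed by "passed". As traced above, its invocation follows the same config-over-pipe pattern as the accel_perf runs:

  # DIF functional test binary from the SPDK tree; -c /dev/fd/62 feeds it the
  # JSON accel configuration over an inherited file descriptor.
  test/accel/dif/dif -c /dev/fd/62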
00:07:51.640 [2024-12-16 10:54:49.957980] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid649912 ] 00:07:51.640 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.640 [2024-12-16 10:54:50.020201] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:51.640 [2024-12-16 10:54:50.058738] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:51.640 [2024-12-16 10:54:50.058835] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.640 [2024-12-16 10:54:50.058836] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:51.640 00:07:51.640 00:07:51.640 CUnit - A unit testing framework for C - Version 2.1-3 00:07:51.640 http://cunit.sourceforge.net/ 00:07:51.640 00:07:51.640 00:07:51.640 Suite: accel_dif 00:07:51.640 Test: verify: DIF generated, GUARD check ...passed 00:07:51.640 Test: verify: DIF generated, APPTAG check ...passed 00:07:51.640 Test: verify: DIF generated, REFTAG check ...passed 00:07:51.640 Test: verify: DIF not generated, GUARD check ...[2024-12-16 10:54:50.122036] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:51.640 [2024-12-16 10:54:50.122087] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:51.640 passed 00:07:51.640 Test: verify: DIF not generated, APPTAG check ...[2024-12-16 10:54:50.122120] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:51.640 [2024-12-16 10:54:50.122139] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:51.640 passed 00:07:51.640 Test: verify: DIF not generated, REFTAG check ...[2024-12-16 10:54:50.122161] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:51.640 [2024-12-16 10:54:50.122180] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:51.640 passed 00:07:51.640 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:51.640 Test: verify: APPTAG incorrect, APPTAG check ...[2024-12-16 10:54:50.122226] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:51.640 passed 00:07:51.640 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:51.640 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:51.640 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:51.640 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-12-16 10:54:50.122330] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:51.640 passed 00:07:51.640 Test: generate copy: DIF generated, GUARD check ...passed 00:07:51.640 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:51.640 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:51.640 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:51.640 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:51.640 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:51.640 Test: generate copy: iovecs-len validate ...[2024-12-16 10:54:50.122505] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:51.640 passed 00:07:51.640 Test: generate copy: buffer alignment validate ...passed 00:07:51.640 00:07:51.640 Run Summary: Type Total Ran Passed Failed Inactive 00:07:51.640 suites 1 1 n/a 0 0 00:07:51.640 tests 20 20 20 0 0 00:07:51.640 asserts 204 204 204 0 n/a 00:07:51.640 00:07:51.640 Elapsed time = 0.000 seconds 00:07:51.900 00:07:51.900 real 0m0.327s 00:07:51.900 user 0m0.527s 00:07:51.900 sys 0m0.143s 00:07:51.900 10:54:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:51.900 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:51.900 ************************************ 00:07:51.900 END TEST accel_dif_functional_tests 00:07:51.900 ************************************ 00:07:51.900 00:07:51.900 real 0m55.202s 00:07:51.900 user 1m2.938s 00:07:51.900 sys 0m6.995s 00:07:51.900 10:54:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:51.900 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:51.900 ************************************ 00:07:51.900 END TEST accel 00:07:51.900 ************************************ 00:07:51.900 10:54:50 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:51.900 10:54:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:51.900 10:54:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:51.900 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:51.900 ************************************ 00:07:51.900 START TEST accel_rpc 00:07:51.900 ************************************ 00:07:51.900 10:54:50 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:51.900 * Looking for test storage... 00:07:51.900 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:51.900 10:54:50 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:51.900 10:54:50 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:51.900 10:54:50 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:52.160 10:54:50 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:52.160 10:54:50 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:52.160 10:54:50 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:52.160 10:54:50 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:52.160 10:54:50 -- scripts/common.sh@335 -- # IFS=.-: 00:07:52.160 10:54:50 -- scripts/common.sh@335 -- # read -ra ver1 00:07:52.160 10:54:50 -- scripts/common.sh@336 -- # IFS=.-: 00:07:52.160 10:54:50 -- scripts/common.sh@336 -- # read -ra ver2 00:07:52.160 10:54:50 -- scripts/common.sh@337 -- # local 'op=<' 00:07:52.160 10:54:50 -- scripts/common.sh@339 -- # ver1_l=2 00:07:52.160 10:54:50 -- scripts/common.sh@340 -- # ver2_l=1 00:07:52.160 10:54:50 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:52.160 10:54:50 -- scripts/common.sh@343 -- # case "$op" in 00:07:52.160 10:54:50 -- scripts/common.sh@344 -- # : 1 00:07:52.160 10:54:50 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:52.160 10:54:50 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:52.160 10:54:50 -- scripts/common.sh@364 -- # decimal 1 00:07:52.160 10:54:50 -- scripts/common.sh@352 -- # local d=1 00:07:52.160 10:54:50 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:52.160 10:54:50 -- scripts/common.sh@354 -- # echo 1 00:07:52.160 10:54:50 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:52.160 10:54:50 -- scripts/common.sh@365 -- # decimal 2 00:07:52.160 10:54:50 -- scripts/common.sh@352 -- # local d=2 00:07:52.160 10:54:50 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:52.160 10:54:50 -- scripts/common.sh@354 -- # echo 2 00:07:52.160 10:54:50 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:52.160 10:54:50 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:52.160 10:54:50 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:52.160 10:54:50 -- scripts/common.sh@367 -- # return 0 00:07:52.160 10:54:50 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:52.160 10:54:50 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:52.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:52.160 --rc genhtml_branch_coverage=1 00:07:52.160 --rc genhtml_function_coverage=1 00:07:52.160 --rc genhtml_legend=1 00:07:52.160 --rc geninfo_all_blocks=1 00:07:52.160 --rc geninfo_unexecuted_blocks=1 00:07:52.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:52.160 ' 00:07:52.160 10:54:50 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:52.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:52.160 --rc genhtml_branch_coverage=1 00:07:52.160 --rc genhtml_function_coverage=1 00:07:52.160 --rc genhtml_legend=1 00:07:52.160 --rc geninfo_all_blocks=1 00:07:52.160 --rc geninfo_unexecuted_blocks=1 00:07:52.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:52.160 ' 00:07:52.160 10:54:50 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:52.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:52.160 --rc genhtml_branch_coverage=1 00:07:52.160 --rc genhtml_function_coverage=1 00:07:52.160 --rc genhtml_legend=1 00:07:52.160 --rc geninfo_all_blocks=1 00:07:52.160 --rc geninfo_unexecuted_blocks=1 00:07:52.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:52.160 ' 00:07:52.160 10:54:50 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:52.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:52.160 --rc genhtml_branch_coverage=1 00:07:52.160 --rc genhtml_function_coverage=1 00:07:52.160 --rc genhtml_legend=1 00:07:52.160 --rc geninfo_all_blocks=1 00:07:52.160 --rc geninfo_unexecuted_blocks=1 00:07:52.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:52.160 ' 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=650079 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@15 -- # waitforlisten 650079 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:52.160 10:54:50 -- common/autotest_common.sh@829 -- # '[' -z 650079 ']' 00:07:52.160 10:54:50 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:52.160 10:54:50 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:52.160 
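The accel_rpc suite running here drives the accel layer over JSON-RPC instead of standalone binaries: the traces that follow show a spdk_tgt started with --wait-for-rpc, an opcode assignment to a bogus module, a reassignment to software, framework initialization, and a query of the resulting assignments. Condensed into direct client calls, and assuming the stock scripts/rpc.py client that the rpc_cmd helper wraps, the flow is approximately:

  # Hedged sketch of the RPC sequence traced below; run from an SPDK checkout.
  ./build/bin/spdk_tgt --wait-for-rpc &                  # target waits before init
  scripts/rpc.py accel_assign_opc -o copy -m incorrect   # logged: assigned to module incorrect
  scripts/rpc.py accel_assign_opc -o copy -m software    # reassigned to module software
  scripts/rpc.py framework_start_init                    # complete startup
  scripts/rpc.py accel_get_opc_assignments | grep software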
10:54:50 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:52.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:52.160 10:54:50 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:52.160 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:52.160 [2024-12-16 10:54:50.574926] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:52.160 [2024-12-16 10:54:50.575011] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid650079 ] 00:07:52.160 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.160 [2024-12-16 10:54:50.645312] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.160 [2024-12-16 10:54:50.681713] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.160 [2024-12-16 10:54:50.681826] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.160 10:54:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:52.160 10:54:50 -- common/autotest_common.sh@862 -- # return 0 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:52.160 10:54:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:52.160 10:54:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:52.160 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:52.160 ************************************ 00:07:52.160 START TEST accel_assign_opcode 00:07:52.160 ************************************ 00:07:52.160 10:54:50 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:52.160 10:54:50 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:52.160 10:54:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:52.160 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:52.161 [2024-12-16 10:54:50.754320] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:52.161 10:54:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:52.161 10:54:50 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:52.161 10:54:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:52.161 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:52.161 [2024-12-16 10:54:50.762332] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:52.161 10:54:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:52.161 10:54:50 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:52.161 10:54:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:52.161 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:52.420 10:54:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:52.420 10:54:50 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:52.420 10:54:50 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:52.420 10:54:50 -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:07:52.420 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:52.420 10:54:50 -- accel/accel_rpc.sh@42 -- # grep software 00:07:52.420 10:54:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:52.420 software 00:07:52.420 00:07:52.420 real 0m0.221s 00:07:52.420 user 0m0.047s 00:07:52.420 sys 0m0.012s 00:07:52.420 10:54:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:52.420 10:54:50 -- common/autotest_common.sh@10 -- # set +x 00:07:52.420 ************************************ 00:07:52.420 END TEST accel_assign_opcode 00:07:52.420 ************************************ 00:07:52.420 10:54:51 -- accel/accel_rpc.sh@55 -- # killprocess 650079 00:07:52.420 10:54:51 -- common/autotest_common.sh@936 -- # '[' -z 650079 ']' 00:07:52.420 10:54:51 -- common/autotest_common.sh@940 -- # kill -0 650079 00:07:52.420 10:54:51 -- common/autotest_common.sh@941 -- # uname 00:07:52.420 10:54:51 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:52.420 10:54:51 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 650079 00:07:52.680 10:54:51 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:52.680 10:54:51 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:52.680 10:54:51 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 650079' 00:07:52.680 killing process with pid 650079 00:07:52.680 10:54:51 -- common/autotest_common.sh@955 -- # kill 650079 00:07:52.680 10:54:51 -- common/autotest_common.sh@960 -- # wait 650079 00:07:52.939 00:07:52.939 real 0m1.004s 00:07:52.939 user 0m0.918s 00:07:52.939 sys 0m0.450s 00:07:52.939 10:54:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:52.939 10:54:51 -- common/autotest_common.sh@10 -- # set +x 00:07:52.939 ************************************ 00:07:52.939 END TEST accel_rpc 00:07:52.939 ************************************ 00:07:52.939 10:54:51 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:52.939 10:54:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:52.939 10:54:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:52.939 10:54:51 -- common/autotest_common.sh@10 -- # set +x 00:07:52.939 ************************************ 00:07:52.939 START TEST app_cmdline 00:07:52.939 ************************************ 00:07:52.939 10:54:51 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:52.939 * Looking for test storage... 
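The accel_rpc suite that just finished exercises one RPC ordering rule: opcode-to-module assignments are issued while spdk_tgt is paused under --wait-for-rpc, and verified after framework_start_init completes. A minimal sketch of that sequence, assuming an SPDK checkout at $SPDK_DIR and jq on PATH (the RPC names and the jq/grep check are taken verbatim from the trace above):

    #!/usr/bin/env bash
    # Sketch of the accel_assign_opcode flow traced above; $SPDK_DIR is an assumption.
    set -e
    SPDK_DIR=${SPDK_DIR:-$HOME/spdk}
    "$SPDK_DIR/build/bin/spdk_tgt" --wait-for-rpc &      # start paused: opcodes can still be remapped
    tgt_pid=$!
    trap 'kill $tgt_pid' EXIT
    until [ -S /var/tmp/spdk.sock ]; do sleep 0.1; done  # wait for the RPC socket to appear
    rpc="$SPDK_DIR/scripts/rpc.py"
    "$rpc" accel_assign_opc -o copy -m software          # bind the copy opcode to the software module
    "$rpc" framework_start_init                          # finish initialization
    "$rpc" accel_get_opc_assignments | jq -r .copy | grep -q software && echo 'copy -> software'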
00:07:52.939 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:52.939 10:54:51 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:52.939 10:54:51 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:52.939 10:54:51 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:53.199 10:54:51 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:53.199 10:54:51 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:53.199 10:54:51 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:53.199 10:54:51 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:53.199 10:54:51 -- scripts/common.sh@335 -- # IFS=.-: 00:07:53.199 10:54:51 -- scripts/common.sh@335 -- # read -ra ver1 00:07:53.199 10:54:51 -- scripts/common.sh@336 -- # IFS=.-: 00:07:53.199 10:54:51 -- scripts/common.sh@336 -- # read -ra ver2 00:07:53.199 10:54:51 -- scripts/common.sh@337 -- # local 'op=<' 00:07:53.199 10:54:51 -- scripts/common.sh@339 -- # ver1_l=2 00:07:53.199 10:54:51 -- scripts/common.sh@340 -- # ver2_l=1 00:07:53.199 10:54:51 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:53.199 10:54:51 -- scripts/common.sh@343 -- # case "$op" in 00:07:53.199 10:54:51 -- scripts/common.sh@344 -- # : 1 00:07:53.199 10:54:51 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:53.199 10:54:51 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:53.199 10:54:51 -- scripts/common.sh@364 -- # decimal 1 00:07:53.199 10:54:51 -- scripts/common.sh@352 -- # local d=1 00:07:53.199 10:54:51 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:53.199 10:54:51 -- scripts/common.sh@354 -- # echo 1 00:07:53.199 10:54:51 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:53.199 10:54:51 -- scripts/common.sh@365 -- # decimal 2 00:07:53.199 10:54:51 -- scripts/common.sh@352 -- # local d=2 00:07:53.199 10:54:51 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:53.199 10:54:51 -- scripts/common.sh@354 -- # echo 2 00:07:53.199 10:54:51 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:53.199 10:54:51 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:53.199 10:54:51 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:53.199 10:54:51 -- scripts/common.sh@367 -- # return 0 00:07:53.199 10:54:51 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:53.199 10:54:51 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:53.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.199 --rc genhtml_branch_coverage=1 00:07:53.199 --rc genhtml_function_coverage=1 00:07:53.199 --rc genhtml_legend=1 00:07:53.199 --rc geninfo_all_blocks=1 00:07:53.199 --rc geninfo_unexecuted_blocks=1 00:07:53.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.199 ' 00:07:53.199 10:54:51 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:53.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.199 --rc genhtml_branch_coverage=1 00:07:53.199 --rc genhtml_function_coverage=1 00:07:53.199 --rc genhtml_legend=1 00:07:53.199 --rc geninfo_all_blocks=1 00:07:53.199 --rc geninfo_unexecuted_blocks=1 00:07:53.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.199 ' 00:07:53.199 10:54:51 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:53.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.199 --rc genhtml_branch_coverage=1 00:07:53.199 
--rc genhtml_function_coverage=1 00:07:53.199 --rc genhtml_legend=1 00:07:53.199 --rc geninfo_all_blocks=1 00:07:53.199 --rc geninfo_unexecuted_blocks=1 00:07:53.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.199 ' 00:07:53.199 10:54:51 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:53.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.199 --rc genhtml_branch_coverage=1 00:07:53.199 --rc genhtml_function_coverage=1 00:07:53.199 --rc genhtml_legend=1 00:07:53.199 --rc geninfo_all_blocks=1 00:07:53.199 --rc geninfo_unexecuted_blocks=1 00:07:53.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.199 ' 00:07:53.199 10:54:51 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:53.199 10:54:51 -- app/cmdline.sh@17 -- # spdk_tgt_pid=650328 00:07:53.199 10:54:51 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:53.199 10:54:51 -- app/cmdline.sh@18 -- # waitforlisten 650328 00:07:53.199 10:54:51 -- common/autotest_common.sh@829 -- # '[' -z 650328 ']' 00:07:53.199 10:54:51 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:53.199 10:54:51 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:53.199 10:54:51 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:53.199 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:53.199 10:54:51 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:53.199 10:54:51 -- common/autotest_common.sh@10 -- # set +x 00:07:53.199 [2024-12-16 10:54:51.614647] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
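Note the --rpcs-allowed spdk_get_version,rpc_get_methods flag on the spdk_tgt launch above: it restricts the JSON-RPC server to exactly those two methods, which is what cmdline.sh probes in the calls that follow. A sketch of the three requests it makes (rpc.py path as in the trace; the failure case appears further down in this log):

    # With spdk_tgt running as launched above (--rpcs-allowed spdk_get_version,rpc_get_methods):
    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    "$rpc" spdk_get_version                      # allowed: returns the version object seen below
    "$rpc" rpc_get_methods | jq -r '.[]' | sort  # allowed: lists exactly the two permitted methods
    "$rpc" env_dpdk_get_mem_stats                # anything else fails with -32601 "Method not found"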
00:07:53.199 [2024-12-16 10:54:51.614714] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid650328 ] 00:07:53.199 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.199 [2024-12-16 10:54:51.681262] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.199 [2024-12-16 10:54:51.716717] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:53.199 [2024-12-16 10:54:51.716831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.137 10:54:52 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:54.137 10:54:52 -- common/autotest_common.sh@862 -- # return 0 00:07:54.137 10:54:52 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:54.137 { 00:07:54.137 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:07:54.137 "fields": { 00:07:54.137 "major": 24, 00:07:54.137 "minor": 1, 00:07:54.137 "patch": 1, 00:07:54.137 "suffix": "-pre", 00:07:54.137 "commit": "c13c99a5e" 00:07:54.137 } 00:07:54.137 } 00:07:54.137 10:54:52 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:54.137 10:54:52 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:54.137 10:54:52 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:54.137 10:54:52 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:54.137 10:54:52 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:54.137 10:54:52 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:54.137 10:54:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:54.137 10:54:52 -- common/autotest_common.sh@10 -- # set +x 00:07:54.137 10:54:52 -- app/cmdline.sh@26 -- # sort 00:07:54.137 10:54:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:54.137 10:54:52 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:54.137 10:54:52 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:54.137 10:54:52 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:54.137 10:54:52 -- common/autotest_common.sh@650 -- # local es=0 00:07:54.137 10:54:52 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:54.137 10:54:52 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:54.137 10:54:52 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:54.137 10:54:52 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:54.137 10:54:52 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:54.137 10:54:52 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:54.137 10:54:52 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:54.137 10:54:52 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:54.137 10:54:52 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:54.137 10:54:52 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:54.396 request: 00:07:54.396 { 00:07:54.396 "method": "env_dpdk_get_mem_stats", 00:07:54.396 "req_id": 1 00:07:54.396 } 00:07:54.396 Got JSON-RPC error response 00:07:54.396 response: 00:07:54.396 { 00:07:54.396 "code": -32601, 00:07:54.396 "message": "Method not found" 00:07:54.396 } 00:07:54.397 10:54:52 -- common/autotest_common.sh@653 -- # es=1 00:07:54.397 10:54:52 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:54.397 10:54:52 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:54.397 10:54:52 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:54.397 10:54:52 -- app/cmdline.sh@1 -- # killprocess 650328 00:07:54.397 10:54:52 -- common/autotest_common.sh@936 -- # '[' -z 650328 ']' 00:07:54.397 10:54:52 -- common/autotest_common.sh@940 -- # kill -0 650328 00:07:54.397 10:54:52 -- common/autotest_common.sh@941 -- # uname 00:07:54.397 10:54:52 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:54.397 10:54:52 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 650328 00:07:54.397 10:54:52 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:54.397 10:54:52 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:54.397 10:54:52 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 650328' 00:07:54.397 killing process with pid 650328 00:07:54.397 10:54:52 -- common/autotest_common.sh@955 -- # kill 650328 00:07:54.397 10:54:52 -- common/autotest_common.sh@960 -- # wait 650328 00:07:54.656 00:07:54.656 real 0m1.789s 00:07:54.656 user 0m2.084s 00:07:54.656 sys 0m0.513s 00:07:54.656 10:54:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:54.656 10:54:53 -- common/autotest_common.sh@10 -- # set +x 00:07:54.656 ************************************ 00:07:54.656 END TEST app_cmdline 00:07:54.656 ************************************ 00:07:54.656 10:54:53 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:54.656 10:54:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:54.656 10:54:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:54.656 10:54:53 -- common/autotest_common.sh@10 -- # set +x 00:07:54.656 ************************************ 00:07:54.656 START TEST version 00:07:54.656 ************************************ 00:07:54.656 10:54:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:54.915 * Looking for test storage... 
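Every suite in this log, including the version suite starting here, opens with the same scripts/common.sh prologue: "lt 1.15 2" compares the installed lcov version against 2 to pick the right coverage flag spelling for the llvm-gcov.sh --gcov-tool wrapper. Condensed into a standalone function (the real cmp_versions also takes an explicit operator and normalizes hex/signed fields via its decimal helper; this keeps only the less-than path):

    # Condensed sketch of the lt/cmp_versions trace that precedes each suite.
    lt() {  # lt 1.15 2 -> success (exit 0) when $1 < $2, compared field by field
        local -a ver1 ver2
        IFS=.-: read -r -a ver1 <<< "$1"
        IFS=.-: read -r -a ver2 <<< "$2"
        local v a b
        for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
            a=${ver1[v]:-0} b=${ver2[v]:-0}
            [[ $a =~ ^[0-9]+$ ]] || a=0      # stand-in for the decimal helper
            [[ $b =~ ^[0-9]+$ ]] || b=0
            (( a > b )) && return 1
            (( a < b )) && return 0
        done
        return 1                             # equal: not strictly less-than
    }
    lt "$(lcov --version | awk '{print $NF}')" 2 && echo 'lcov < 2: use lcov_branch_coverage=1 flags'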
00:07:54.915 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:54.915 10:54:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:54.915 10:54:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:54.915 10:54:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:54.915 10:54:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:54.915 10:54:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:54.915 10:54:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:54.915 10:54:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:54.915 10:54:53 -- scripts/common.sh@335 -- # IFS=.-: 00:07:54.915 10:54:53 -- scripts/common.sh@335 -- # read -ra ver1 00:07:54.915 10:54:53 -- scripts/common.sh@336 -- # IFS=.-: 00:07:54.915 10:54:53 -- scripts/common.sh@336 -- # read -ra ver2 00:07:54.915 10:54:53 -- scripts/common.sh@337 -- # local 'op=<' 00:07:54.915 10:54:53 -- scripts/common.sh@339 -- # ver1_l=2 00:07:54.915 10:54:53 -- scripts/common.sh@340 -- # ver2_l=1 00:07:54.915 10:54:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:54.915 10:54:53 -- scripts/common.sh@343 -- # case "$op" in 00:07:54.915 10:54:53 -- scripts/common.sh@344 -- # : 1 00:07:54.915 10:54:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:54.915 10:54:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:54.915 10:54:53 -- scripts/common.sh@364 -- # decimal 1 00:07:54.915 10:54:53 -- scripts/common.sh@352 -- # local d=1 00:07:54.915 10:54:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:54.915 10:54:53 -- scripts/common.sh@354 -- # echo 1 00:07:54.915 10:54:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:54.915 10:54:53 -- scripts/common.sh@365 -- # decimal 2 00:07:54.915 10:54:53 -- scripts/common.sh@352 -- # local d=2 00:07:54.915 10:54:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:54.916 10:54:53 -- scripts/common.sh@354 -- # echo 2 00:07:54.916 10:54:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:54.916 10:54:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:54.916 10:54:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:54.916 10:54:53 -- scripts/common.sh@367 -- # return 0 00:07:54.916 10:54:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:54.916 10:54:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:54.916 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.916 --rc genhtml_branch_coverage=1 00:07:54.916 --rc genhtml_function_coverage=1 00:07:54.916 --rc genhtml_legend=1 00:07:54.916 --rc geninfo_all_blocks=1 00:07:54.916 --rc geninfo_unexecuted_blocks=1 00:07:54.916 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.916 ' 00:07:54.916 10:54:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:54.916 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.916 --rc genhtml_branch_coverage=1 00:07:54.916 --rc genhtml_function_coverage=1 00:07:54.916 --rc genhtml_legend=1 00:07:54.916 --rc geninfo_all_blocks=1 00:07:54.916 --rc geninfo_unexecuted_blocks=1 00:07:54.916 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.916 ' 00:07:54.916 10:54:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:54.916 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.916 --rc genhtml_branch_coverage=1 00:07:54.916 
--rc genhtml_function_coverage=1 00:07:54.916 --rc genhtml_legend=1 00:07:54.916 --rc geninfo_all_blocks=1 00:07:54.916 --rc geninfo_unexecuted_blocks=1 00:07:54.916 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.916 ' 00:07:54.916 10:54:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:54.916 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.916 --rc genhtml_branch_coverage=1 00:07:54.916 --rc genhtml_function_coverage=1 00:07:54.916 --rc genhtml_legend=1 00:07:54.916 --rc geninfo_all_blocks=1 00:07:54.916 --rc geninfo_unexecuted_blocks=1 00:07:54.916 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.916 ' 00:07:54.916 10:54:53 -- app/version.sh@17 -- # get_header_version major 00:07:54.916 10:54:53 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:54.916 10:54:53 -- app/version.sh@14 -- # cut -f2 00:07:54.916 10:54:53 -- app/version.sh@14 -- # tr -d '"' 00:07:54.916 10:54:53 -- app/version.sh@17 -- # major=24 00:07:54.916 10:54:53 -- app/version.sh@18 -- # get_header_version minor 00:07:54.916 10:54:53 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:54.916 10:54:53 -- app/version.sh@14 -- # cut -f2 00:07:54.916 10:54:53 -- app/version.sh@14 -- # tr -d '"' 00:07:54.916 10:54:53 -- app/version.sh@18 -- # minor=1 00:07:54.916 10:54:53 -- app/version.sh@19 -- # get_header_version patch 00:07:54.916 10:54:53 -- app/version.sh@14 -- # cut -f2 00:07:54.916 10:54:53 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:54.916 10:54:53 -- app/version.sh@14 -- # tr -d '"' 00:07:54.916 10:54:53 -- app/version.sh@19 -- # patch=1 00:07:54.916 10:54:53 -- app/version.sh@20 -- # get_header_version suffix 00:07:54.916 10:54:53 -- app/version.sh@14 -- # cut -f2 00:07:54.916 10:54:53 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:54.916 10:54:53 -- app/version.sh@14 -- # tr -d '"' 00:07:54.916 10:54:53 -- app/version.sh@20 -- # suffix=-pre 00:07:54.916 10:54:53 -- app/version.sh@22 -- # version=24.1 00:07:54.916 10:54:53 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:54.916 10:54:53 -- app/version.sh@25 -- # version=24.1.1 00:07:54.916 10:54:53 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:54.916 10:54:53 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:54.916 10:54:53 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:54.916 10:54:53 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:54.916 10:54:53 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:54.916 00:07:54.916 real 0m0.262s 00:07:54.916 user 0m0.158s 00:07:54.916 sys 0m0.152s 00:07:54.916 10:54:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:54.916 10:54:53 -- common/autotest_common.sh@10 -- # set +x 00:07:54.916 
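The pipelines just traced rebuild the version string two independent ways and require them to agree: header defines scraped from include/spdk/version.h (yielding 24.1.1rc0) versus python3's spdk.__version__. A compact re-creation, run from the repo root (cut -f2 mirrors the tab-delimited #define layout the trace relies on; the -pre-to-rc0 mapping is inferred from the values above):

    # Re-creation of the version.sh scrape traced above.
    get_header_version() {
        grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" include/spdk/version.h |
            cut -f2 | tr -d '"'
    }
    major=$(get_header_version MAJOR)    # 24
    minor=$(get_header_version MINOR)    # 1
    patch=$(get_header_version PATCH)    # 1
    suffix=$(get_header_version SUFFIX)  # -pre
    version=$major.$minor
    (( patch != 0 )) && version=$version.$patch
    [[ $suffix == -pre ]] && version=${version}rc0   # assumption: -pre is spelled rc0 on the python side
    py_version=$(python3 -c 'import spdk; print(spdk.__version__)')
    [[ $py_version == "$version" ]] && echo "headers and python module agree: $version"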
************************************ 00:07:54.916 END TEST version 00:07:54.916 ************************************ 00:07:54.916 10:54:53 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@191 -- # uname -s 00:07:55.176 10:54:53 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:55.176 10:54:53 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:55.176 10:54:53 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:55.176 10:54:53 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:55.176 10:54:53 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:55.176 10:54:53 -- common/autotest_common.sh@10 -- # set +x 00:07:55.176 10:54:53 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:55.176 10:54:53 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:55.176 10:54:53 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:55.176 10:54:53 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:55.176 10:54:53 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:55.176 10:54:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:55.176 10:54:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:55.176 10:54:53 -- common/autotest_common.sh@10 -- # set +x 00:07:55.176 ************************************ 00:07:55.176 START TEST llvm_fuzz 00:07:55.176 ************************************ 00:07:55.176 10:54:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:55.176 * Looking for test storage... 
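After the version suite closes, autotest.sh walks its suite table: every "'[' 0 -eq 1 ']'" above is a suite skipped because its SPDK_TEST_* flag (exported later in this log) is 0, and the single "[[ 1 -eq 1 ]]" hands control to the fuzzer harness. Schematically (the flag and path for the taken branch are verbatim from the trace; the skipped example is illustrative only):

    # Schematic of the autotest.sh dispatch pattern seen above; run_test is autotest_common's wrapper.
    [[ $SPDK_TEST_NVME   -eq 1 ]] && run_test nvme      test/nvme/nvme.sh      # skipped: flag is 0
    [[ $SPDK_TEST_FUZZER -eq 1 ]] && run_test llvm_fuzz test/fuzz/llvm.sh      # taken: flag is 1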
00:07:55.176 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:55.176 10:54:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:55.176 10:54:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:55.176 10:54:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:55.176 10:54:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:55.176 10:54:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:55.176 10:54:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:55.176 10:54:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:55.176 10:54:53 -- scripts/common.sh@335 -- # IFS=.-: 00:07:55.176 10:54:53 -- scripts/common.sh@335 -- # read -ra ver1 00:07:55.176 10:54:53 -- scripts/common.sh@336 -- # IFS=.-: 00:07:55.176 10:54:53 -- scripts/common.sh@336 -- # read -ra ver2 00:07:55.176 10:54:53 -- scripts/common.sh@337 -- # local 'op=<' 00:07:55.176 10:54:53 -- scripts/common.sh@339 -- # ver1_l=2 00:07:55.176 10:54:53 -- scripts/common.sh@340 -- # ver2_l=1 00:07:55.176 10:54:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:55.176 10:54:53 -- scripts/common.sh@343 -- # case "$op" in 00:07:55.176 10:54:53 -- scripts/common.sh@344 -- # : 1 00:07:55.176 10:54:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:55.176 10:54:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:55.176 10:54:53 -- scripts/common.sh@364 -- # decimal 1 00:07:55.176 10:54:53 -- scripts/common.sh@352 -- # local d=1 00:07:55.176 10:54:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:55.176 10:54:53 -- scripts/common.sh@354 -- # echo 1 00:07:55.176 10:54:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:55.176 10:54:53 -- scripts/common.sh@365 -- # decimal 2 00:07:55.176 10:54:53 -- scripts/common.sh@352 -- # local d=2 00:07:55.176 10:54:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:55.176 10:54:53 -- scripts/common.sh@354 -- # echo 2 00:07:55.176 10:54:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:55.176 10:54:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:55.176 10:54:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:55.176 10:54:53 -- scripts/common.sh@367 -- # return 0 00:07:55.176 10:54:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:55.176 10:54:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:55.176 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.176 --rc genhtml_branch_coverage=1 00:07:55.176 --rc genhtml_function_coverage=1 00:07:55.176 --rc genhtml_legend=1 00:07:55.176 --rc geninfo_all_blocks=1 00:07:55.176 --rc geninfo_unexecuted_blocks=1 00:07:55.176 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.176 ' 00:07:55.176 10:54:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:55.176 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.176 --rc genhtml_branch_coverage=1 00:07:55.176 --rc genhtml_function_coverage=1 00:07:55.176 --rc genhtml_legend=1 00:07:55.176 --rc geninfo_all_blocks=1 00:07:55.176 --rc geninfo_unexecuted_blocks=1 00:07:55.176 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.176 ' 00:07:55.176 10:54:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:55.176 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.176 --rc genhtml_branch_coverage=1 00:07:55.176 
--rc genhtml_function_coverage=1 00:07:55.176 --rc genhtml_legend=1 00:07:55.176 --rc geninfo_all_blocks=1 00:07:55.177 --rc geninfo_unexecuted_blocks=1 00:07:55.177 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.177 ' 00:07:55.177 10:54:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:55.177 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.177 --rc genhtml_branch_coverage=1 00:07:55.177 --rc genhtml_function_coverage=1 00:07:55.177 --rc genhtml_legend=1 00:07:55.177 --rc geninfo_all_blocks=1 00:07:55.177 --rc geninfo_unexecuted_blocks=1 00:07:55.177 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.177 ' 00:07:55.177 10:54:53 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:55.177 10:54:53 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:55.177 10:54:53 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:55.177 10:54:53 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:55.177 10:54:53 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:55.177 10:54:53 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:55.177 10:54:53 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:55.177 10:54:53 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:55.177 10:54:53 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:55.177 10:54:53 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:55.177 10:54:53 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:55.177 10:54:53 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:55.177 10:54:53 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:55.177 10:54:53 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:55.177 10:54:53 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:55.177 10:54:53 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:55.177 10:54:53 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:55.177 10:54:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:55.177 10:54:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:55.177 10:54:53 -- common/autotest_common.sh@10 -- # set +x 00:07:55.177 ************************************ 00:07:55.177 START TEST nvmf_fuzz 00:07:55.177 ************************************ 00:07:55.177 10:54:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:55.438 * Looking for test storage... 
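llvm.sh's target discovery, traced above, is a directory listing with the helper scripts filtered out: everything else under test/fuzz/llvm/ is treated as a fuzzer suite with its own run.sh. Condensed (rootdir is the SPDK checkout; run_test comes from autotest_common.sh):

    # Condensed from the get_fuzzer_targets trace above.
    fuzzers=("$rootdir"/test/fuzz/llvm/*)   # here: common.sh llvm-gcov.sh nvmf vfio
    fuzzers=("${fuzzers[@]##*/}")           # strip directories, keep basenames
    for fuzzer in "${fuzzers[@]}"; do
        case "$fuzzer" in
            common.sh | llvm-gcov.sh) ;;    # shared helpers, not suites
            *) run_test "${fuzzer}_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
        esac
    done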
00:07:55.438 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.438 10:54:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:55.438 10:54:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:55.438 10:54:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:55.438 10:54:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:55.438 10:54:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:55.438 10:54:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:55.438 10:54:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:55.438 10:54:53 -- scripts/common.sh@335 -- # IFS=.-: 00:07:55.438 10:54:53 -- scripts/common.sh@335 -- # read -ra ver1 00:07:55.438 10:54:53 -- scripts/common.sh@336 -- # IFS=.-: 00:07:55.438 10:54:53 -- scripts/common.sh@336 -- # read -ra ver2 00:07:55.438 10:54:53 -- scripts/common.sh@337 -- # local 'op=<' 00:07:55.438 10:54:53 -- scripts/common.sh@339 -- # ver1_l=2 00:07:55.438 10:54:53 -- scripts/common.sh@340 -- # ver2_l=1 00:07:55.438 10:54:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:55.438 10:54:53 -- scripts/common.sh@343 -- # case "$op" in 00:07:55.438 10:54:53 -- scripts/common.sh@344 -- # : 1 00:07:55.438 10:54:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:55.438 10:54:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:55.438 10:54:53 -- scripts/common.sh@364 -- # decimal 1 00:07:55.438 10:54:53 -- scripts/common.sh@352 -- # local d=1 00:07:55.438 10:54:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:55.438 10:54:53 -- scripts/common.sh@354 -- # echo 1 00:07:55.438 10:54:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:55.438 10:54:53 -- scripts/common.sh@365 -- # decimal 2 00:07:55.439 10:54:53 -- scripts/common.sh@352 -- # local d=2 00:07:55.439 10:54:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:55.439 10:54:53 -- scripts/common.sh@354 -- # echo 2 00:07:55.439 10:54:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:55.439 10:54:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:55.439 10:54:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:55.439 10:54:53 -- scripts/common.sh@367 -- # return 0 00:07:55.439 10:54:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:55.439 10:54:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:55.439 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.439 --rc genhtml_branch_coverage=1 00:07:55.439 --rc genhtml_function_coverage=1 00:07:55.439 --rc genhtml_legend=1 00:07:55.439 --rc geninfo_all_blocks=1 00:07:55.439 --rc geninfo_unexecuted_blocks=1 00:07:55.439 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.439 ' 00:07:55.439 10:54:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:55.439 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.439 --rc genhtml_branch_coverage=1 00:07:55.439 --rc genhtml_function_coverage=1 00:07:55.439 --rc genhtml_legend=1 00:07:55.439 --rc geninfo_all_blocks=1 00:07:55.439 --rc geninfo_unexecuted_blocks=1 00:07:55.439 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.439 ' 00:07:55.439 10:54:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:55.439 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.439 --rc genhtml_branch_coverage=1 
00:07:55.439 --rc genhtml_function_coverage=1 00:07:55.439 --rc genhtml_legend=1 00:07:55.439 --rc geninfo_all_blocks=1 00:07:55.439 --rc geninfo_unexecuted_blocks=1 00:07:55.439 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.439 ' 00:07:55.439 10:54:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:55.439 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.439 --rc genhtml_branch_coverage=1 00:07:55.439 --rc genhtml_function_coverage=1 00:07:55.439 --rc genhtml_legend=1 00:07:55.439 --rc geninfo_all_blocks=1 00:07:55.439 --rc geninfo_unexecuted_blocks=1 00:07:55.439 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.439 ' 00:07:55.439 10:54:53 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:55.439 10:54:53 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:55.439 10:54:53 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:55.439 10:54:53 -- common/autotest_common.sh@34 -- # set -e 00:07:55.439 10:54:53 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:55.439 10:54:53 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:55.439 10:54:53 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:55.439 10:54:53 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:55.439 10:54:53 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:55.439 10:54:53 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:55.439 10:54:53 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:55.439 10:54:53 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:55.439 10:54:53 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:55.439 10:54:53 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:55.439 10:54:53 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:55.439 10:54:53 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:55.439 10:54:53 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:55.439 10:54:53 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:55.439 10:54:53 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:55.439 10:54:53 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:55.439 10:54:53 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:55.439 10:54:53 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:55.439 10:54:53 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:55.439 10:54:53 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:55.439 10:54:53 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:55.439 10:54:53 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:55.439 10:54:53 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:55.439 10:54:53 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:55.439 10:54:53 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:55.439 10:54:53 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:55.439 10:54:53 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:55.439 10:54:53 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:55.439 10:54:53 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:55.439 
10:54:53 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:55.439 10:54:53 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:55.439 10:54:53 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:55.439 10:54:53 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:55.439 10:54:53 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:55.439 10:54:53 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:55.439 10:54:53 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:55.439 10:54:53 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:55.439 10:54:53 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:55.439 10:54:53 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:55.439 10:54:53 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:55.439 10:54:53 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:55.439 10:54:53 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:55.439 10:54:53 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:55.439 10:54:53 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:55.439 10:54:53 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:55.439 10:54:53 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:55.439 10:54:53 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:55.439 10:54:53 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:55.439 10:54:53 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:55.439 10:54:53 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:55.439 10:54:53 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:55.439 10:54:53 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:55.439 10:54:53 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:55.439 10:54:53 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:55.439 10:54:53 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:55.439 10:54:53 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:55.439 10:54:53 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:55.439 10:54:53 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:55.439 10:54:53 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:55.439 10:54:53 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:55.439 10:54:53 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:55.439 10:54:53 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:55.439 10:54:53 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:55.439 10:54:53 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:55.439 10:54:53 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:55.439 10:54:53 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:55.439 10:54:53 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:55.439 10:54:53 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:55.439 10:54:53 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:55.439 10:54:53 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:55.439 10:54:53 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:55.439 10:54:53 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:55.439 10:54:53 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:55.439 10:54:53 -- 
common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:55.439 10:54:53 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:55.439 10:54:53 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:55.439 10:54:53 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:55.439 10:54:53 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:55.439 10:54:53 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:55.439 10:54:53 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:55.439 10:54:53 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:55.439 10:54:53 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:55.439 10:54:53 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:55.439 10:54:53 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:55.439 10:54:53 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:55.439 10:54:53 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:55.439 10:54:53 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:55.439 10:54:53 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:55.439 10:54:53 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:55.439 10:54:53 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:55.439 10:54:53 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:55.439 10:54:53 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:55.439 10:54:53 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:55.439 10:54:53 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:55.439 10:54:53 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:55.439 10:54:53 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:55.439 10:54:53 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:55.439 10:54:53 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:55.439 10:54:53 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:55.439 #define SPDK_CONFIG_H 00:07:55.439 #define SPDK_CONFIG_APPS 1 00:07:55.440 #define SPDK_CONFIG_ARCH native 00:07:55.440 #undef SPDK_CONFIG_ASAN 00:07:55.440 #undef SPDK_CONFIG_AVAHI 00:07:55.440 #undef SPDK_CONFIG_CET 00:07:55.440 #define SPDK_CONFIG_COVERAGE 1 00:07:55.440 #define SPDK_CONFIG_CROSS_PREFIX 00:07:55.440 #undef SPDK_CONFIG_CRYPTO 00:07:55.440 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:55.440 #undef SPDK_CONFIG_CUSTOMOCF 00:07:55.440 #undef SPDK_CONFIG_DAOS 00:07:55.440 #define SPDK_CONFIG_DAOS_DIR 00:07:55.440 #define SPDK_CONFIG_DEBUG 1 00:07:55.440 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:55.440 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:55.440 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:55.440 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:55.440 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:55.440 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:55.440 #define SPDK_CONFIG_EXAMPLES 1 00:07:55.440 #undef SPDK_CONFIG_FC 00:07:55.440 #define SPDK_CONFIG_FC_PATH 00:07:55.440 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:55.440 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:55.440 #undef SPDK_CONFIG_FUSE 00:07:55.440 #define SPDK_CONFIG_FUZZER 1 00:07:55.440 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:55.440 #undef SPDK_CONFIG_GOLANG 00:07:55.440 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:55.440 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:55.440 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:55.440 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:55.440 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:55.440 #define SPDK_CONFIG_IDXD 1 00:07:55.440 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:55.440 #undef SPDK_CONFIG_IPSEC_MB 00:07:55.440 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:55.440 #define SPDK_CONFIG_ISAL 1 00:07:55.440 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:55.440 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:55.440 #define SPDK_CONFIG_LIBDIR 00:07:55.440 #undef SPDK_CONFIG_LTO 00:07:55.440 #define SPDK_CONFIG_MAX_LCORES 00:07:55.440 #define SPDK_CONFIG_NVME_CUSE 1 00:07:55.440 #undef SPDK_CONFIG_OCF 00:07:55.440 #define SPDK_CONFIG_OCF_PATH 00:07:55.440 #define SPDK_CONFIG_OPENSSL_PATH 00:07:55.440 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:55.440 #undef SPDK_CONFIG_PGO_USE 00:07:55.440 #define SPDK_CONFIG_PREFIX /usr/local 00:07:55.440 #undef SPDK_CONFIG_RAID5F 00:07:55.440 #undef SPDK_CONFIG_RBD 00:07:55.440 #define SPDK_CONFIG_RDMA 1 00:07:55.440 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:55.440 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:55.440 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:55.440 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:55.440 #undef SPDK_CONFIG_SHARED 00:07:55.440 #undef SPDK_CONFIG_SMA 00:07:55.440 #define SPDK_CONFIG_TESTS 1 00:07:55.440 #undef SPDK_CONFIG_TSAN 00:07:55.440 #define SPDK_CONFIG_UBLK 1 00:07:55.440 #define SPDK_CONFIG_UBSAN 1 00:07:55.440 #undef SPDK_CONFIG_UNIT_TESTS 00:07:55.440 #undef SPDK_CONFIG_URING 00:07:55.440 #define SPDK_CONFIG_URING_PATH 00:07:55.440 #undef SPDK_CONFIG_URING_ZNS 00:07:55.440 #undef SPDK_CONFIG_USDT 00:07:55.440 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:55.440 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:55.440 #define SPDK_CONFIG_VFIO_USER 1 00:07:55.440 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:55.440 #define SPDK_CONFIG_VHOST 1 00:07:55.440 #define SPDK_CONFIG_VIRTIO 1 00:07:55.440 #undef SPDK_CONFIG_VTUNE 00:07:55.440 #define SPDK_CONFIG_VTUNE_DIR 00:07:55.440 #define SPDK_CONFIG_WERROR 1 00:07:55.440 #define SPDK_CONFIG_WPDK_DIR 00:07:55.440 #undef SPDK_CONFIG_XNVME 00:07:55.440 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:55.440 10:54:53 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:55.440 10:54:53 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:55.440 10:54:53 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:55.440 10:54:53 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:55.440 10:54:53 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:55.440 10:54:53 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:55.440 10:54:53 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:55.440 10:54:53 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:55.440 10:54:53 -- paths/export.sh@5 -- # export PATH 00:07:55.440 10:54:53 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:55.440 10:54:53 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:55.440 10:54:53 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:55.440 10:54:53 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:55.440 10:54:53 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:55.440 10:54:53 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:55.440 10:54:53 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:55.440 10:54:53 -- pm/common@16 -- # TEST_TAG=N/A 00:07:55.440 10:54:53 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:55.440 10:54:53 -- common/autotest_common.sh@52 -- # : 1 00:07:55.440 10:54:53 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:55.440 10:54:53 -- common/autotest_common.sh@56 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:55.440 10:54:53 -- common/autotest_common.sh@58 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:55.440 10:54:53 -- common/autotest_common.sh@60 -- # : 1 00:07:55.440 10:54:53 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:55.440 10:54:53 -- common/autotest_common.sh@62 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:55.440 10:54:53 -- common/autotest_common.sh@64 -- # : 00:07:55.440 10:54:53 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:55.440 10:54:53 -- common/autotest_common.sh@66 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:55.440 10:54:53 -- common/autotest_common.sh@68 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:55.440 10:54:53 -- common/autotest_common.sh@70 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:55.440 10:54:53 -- common/autotest_common.sh@72 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:55.440 10:54:53 -- common/autotest_common.sh@74 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:55.440 10:54:53 -- common/autotest_common.sh@76 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:55.440 10:54:53 -- common/autotest_common.sh@78 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:55.440 10:54:53 -- common/autotest_common.sh@80 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:55.440 10:54:53 -- common/autotest_common.sh@82 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:55.440 10:54:53 -- common/autotest_common.sh@84 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:55.440 10:54:53 -- common/autotest_common.sh@86 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:55.440 10:54:53 -- common/autotest_common.sh@88 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:55.440 10:54:53 -- common/autotest_common.sh@90 -- # : 0 00:07:55.440 10:54:53 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:55.440 10:54:53 -- common/autotest_common.sh@92 -- # : 1 00:07:55.440 10:54:53 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:55.440 10:54:53 -- common/autotest_common.sh@94 -- # : 1 00:07:55.440 10:54:54 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:55.440 10:54:54 -- common/autotest_common.sh@96 -- # : rdma 00:07:55.440 10:54:54 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:55.440 10:54:54 -- common/autotest_common.sh@98 -- # : 0 00:07:55.440 10:54:54 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:55.441 10:54:54 -- common/autotest_common.sh@100 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:55.441 10:54:54 -- common/autotest_common.sh@102 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:55.441 10:54:54 -- common/autotest_common.sh@104 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:55.441 10:54:54 -- common/autotest_common.sh@106 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:55.441 10:54:54 -- common/autotest_common.sh@108 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:55.441 10:54:54 -- common/autotest_common.sh@110 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:55.441 10:54:54 -- common/autotest_common.sh@112 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:55.441 10:54:54 -- common/autotest_common.sh@114 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:55.441 10:54:54 -- common/autotest_common.sh@116 -- # : 1 00:07:55.441 10:54:54 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:55.441 10:54:54 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:55.441 10:54:54 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:55.441 10:54:54 -- common/autotest_common.sh@120 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:55.441 10:54:54 -- common/autotest_common.sh@122 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:55.441 10:54:54 -- common/autotest_common.sh@124 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:55.441 10:54:54 -- common/autotest_common.sh@126 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:55.441 10:54:54 -- common/autotest_common.sh@128 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:55.441 10:54:54 -- common/autotest_common.sh@130 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:55.441 10:54:54 -- common/autotest_common.sh@132 -- # : v23.11 00:07:55.441 10:54:54 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:55.441 10:54:54 -- common/autotest_common.sh@134 -- # : true 00:07:55.441 10:54:54 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:55.441 10:54:54 -- common/autotest_common.sh@136 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:55.441 10:54:54 -- common/autotest_common.sh@138 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:55.441 10:54:54 -- common/autotest_common.sh@140 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:55.441 10:54:54 -- common/autotest_common.sh@142 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:55.441 10:54:54 -- common/autotest_common.sh@144 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:55.441 10:54:54 -- common/autotest_common.sh@146 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:55.441 10:54:54 -- common/autotest_common.sh@148 -- # : 00:07:55.441 10:54:54 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:55.441 10:54:54 -- common/autotest_common.sh@150 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:55.441 10:54:54 -- common/autotest_common.sh@152 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:55.441 10:54:54 -- common/autotest_common.sh@154 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:55.441 10:54:54 -- 
common/autotest_common.sh@156 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:55.441 10:54:54 -- common/autotest_common.sh@158 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:55.441 10:54:54 -- common/autotest_common.sh@160 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:55.441 10:54:54 -- common/autotest_common.sh@163 -- # : 00:07:55.441 10:54:54 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:55.441 10:54:54 -- common/autotest_common.sh@165 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:55.441 10:54:54 -- common/autotest_common.sh@167 -- # : 0 00:07:55.441 10:54:54 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:55.441 10:54:54 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:55.441 10:54:54 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:55.441 10:54:54 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:55.441 10:54:54 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:55.441 10:54:54 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:55.441 10:54:54 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:55.441 10:54:54 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:55.441 10:54:54 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:55.441 10:54:54 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:55.441 10:54:54 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:55.441 10:54:54 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:55.441 10:54:54 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:55.441 10:54:54 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:55.441 10:54:54 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:55.441 10:54:54 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:55.441 10:54:54 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:55.441 10:54:54 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:55.441 10:54:54 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:55.441 10:54:54 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:55.441 10:54:54 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:55.441 10:54:54 -- common/autotest_common.sh@196 -- # cat 00:07:55.441 10:54:54 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:55.441 10:54:54 -- common/autotest_common.sh@224 -- # export 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:55.441 10:54:54 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:55.441 10:54:54 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:55.441 10:54:54 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:55.441 10:54:54 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:55.441 10:54:54 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:55.441 10:54:54 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:55.441 10:54:54 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:55.441 10:54:54 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:55.441 10:54:54 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:55.441 10:54:54 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:55.441 10:54:54 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:55.441 10:54:54 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:55.441 10:54:54 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:55.441 10:54:54 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:55.441 10:54:54 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:55.442 10:54:54 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:55.442 10:54:54 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:55.442 10:54:54 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:55.442 10:54:54 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:55.442 10:54:54 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:55.442 10:54:54 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:55.442 10:54:54 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:55.442 10:54:54 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:55.442 10:54:54 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:55.442 10:54:54 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:55.442 10:54:54 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:55.442 10:54:54 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:55.442 10:54:54 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:55.442 10:54:54 -- common/autotest_common.sh@259 -- # valgrind= 00:07:55.442 10:54:54 -- common/autotest_common.sh@265 -- # uname -s 00:07:55.442 10:54:54 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:55.442 10:54:54 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:55.442 10:54:54 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:55.442 10:54:54 -- 
common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:55.442 10:54:54 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:55.442 10:54:54 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:55.442 10:54:54 -- common/autotest_common.sh@275 -- # MAKE=make 00:07:55.442 10:54:54 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:55.442 10:54:54 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:55.442 10:54:54 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:55.442 10:54:54 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:55.442 10:54:54 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:55.442 10:54:54 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:55.442 10:54:54 -- common/autotest_common.sh@319 -- # [[ -z 650974 ]] 00:07:55.442 10:54:54 -- common/autotest_common.sh@319 -- # kill -0 650974 00:07:55.442 10:54:54 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:55.442 10:54:54 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:55.442 10:54:54 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:55.442 10:54:54 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:55.442 10:54:54 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:55.442 10:54:54 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:55.442 10:54:54 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:55.442 10:54:54 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:55.442 10:54:54 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.f6F4dJ 00:07:55.442 10:54:54 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:55.442 10:54:54 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:55.442 10:54:54 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:55.442 10:54:54 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.f6F4dJ/tests/nvmf /tmp/spdk.f6F4dJ 00:07:55.442 10:54:54 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:55.442 10:54:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:55.442 10:54:54 -- common/autotest_common.sh@328 -- # df -T 00:07:55.442 10:54:54 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:55.702 10:54:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:55.702 10:54:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=785162240 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:55.702 10:54:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=4499267584 00:07:55.702 10:54:54 -- 
common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=52802232320 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730586624 00:07:55.702 10:54:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=8928354304 00:07:55.702 10:54:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864035840 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865293312 00:07:55.702 10:54:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:07:55.702 10:54:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:07:55.702 10:54:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:07:55.702 10:54:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=30865104896 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865293312 00:07:55.702 10:54:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=188416 00:07:55.702 10:54:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:55.702 10:54:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:55.702 10:54:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:55.702 10:54:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:55.702 10:54:54 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:55.702 * Looking for test storage... 
00:07:55.702 10:54:54 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:55.703 10:54:54 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:55.703 10:54:54 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.703 10:54:54 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:55.703 10:54:54 -- common/autotest_common.sh@373 -- # mount=/ 00:07:55.703 10:54:54 -- common/autotest_common.sh@375 -- # target_space=52802232320 00:07:55.703 10:54:54 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:55.703 10:54:54 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:55.703 10:54:54 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:55.703 10:54:54 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:55.703 10:54:54 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:55.703 10:54:54 -- common/autotest_common.sh@382 -- # new_size=11142946816 00:07:55.703 10:54:54 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:55.703 10:54:54 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.703 10:54:54 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.703 10:54:54 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.703 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.703 10:54:54 -- common/autotest_common.sh@390 -- # return 0 00:07:55.703 10:54:54 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:55.703 10:54:54 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:55.703 10:54:54 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:55.703 10:54:54 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:55.703 10:54:54 -- common/autotest_common.sh@1682 -- # true 00:07:55.703 10:54:54 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:55.703 10:54:54 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:55.703 10:54:54 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:55.703 10:54:54 -- common/autotest_common.sh@27 -- # exec 00:07:55.703 10:54:54 -- common/autotest_common.sh@29 -- # exec 00:07:55.703 10:54:54 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:55.703 10:54:54 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:55.703 10:54:54 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:55.703 10:54:54 -- common/autotest_common.sh@18 -- # set -x 00:07:55.703 10:54:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:55.703 10:54:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:55.703 10:54:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:55.703 10:54:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:55.703 10:54:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:55.703 10:54:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:55.703 10:54:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:55.703 10:54:54 -- scripts/common.sh@335 -- # IFS=.-: 00:07:55.703 10:54:54 -- scripts/common.sh@335 -- # read -ra ver1 00:07:55.703 10:54:54 -- scripts/common.sh@336 -- # IFS=.-: 00:07:55.703 10:54:54 -- scripts/common.sh@336 -- # read -ra ver2 00:07:55.703 10:54:54 -- scripts/common.sh@337 -- # local 'op=<' 00:07:55.703 10:54:54 -- scripts/common.sh@339 -- # ver1_l=2 00:07:55.703 10:54:54 -- scripts/common.sh@340 -- # ver2_l=1 00:07:55.703 10:54:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:55.703 10:54:54 -- scripts/common.sh@343 -- # case "$op" in 00:07:55.703 10:54:54 -- scripts/common.sh@344 -- # : 1 00:07:55.703 10:54:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:55.703 10:54:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:55.703 10:54:54 -- scripts/common.sh@364 -- # decimal 1 00:07:55.703 10:54:54 -- scripts/common.sh@352 -- # local d=1 00:07:55.703 10:54:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:55.703 10:54:54 -- scripts/common.sh@354 -- # echo 1 00:07:55.703 10:54:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:55.703 10:54:54 -- scripts/common.sh@365 -- # decimal 2 00:07:55.703 10:54:54 -- scripts/common.sh@352 -- # local d=2 00:07:55.703 10:54:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:55.703 10:54:54 -- scripts/common.sh@354 -- # echo 2 00:07:55.703 10:54:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:55.703 10:54:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:55.703 10:54:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:55.703 10:54:54 -- scripts/common.sh@367 -- # return 0 00:07:55.703 10:54:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:55.703 10:54:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:55.703 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.703 --rc genhtml_branch_coverage=1 00:07:55.703 --rc genhtml_function_coverage=1 00:07:55.703 --rc genhtml_legend=1 00:07:55.703 --rc geninfo_all_blocks=1 00:07:55.703 --rc geninfo_unexecuted_blocks=1 00:07:55.703 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.703 ' 00:07:55.703 10:54:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:55.703 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.703 --rc genhtml_branch_coverage=1 00:07:55.703 --rc genhtml_function_coverage=1 00:07:55.703 --rc genhtml_legend=1 00:07:55.703 --rc geninfo_all_blocks=1 00:07:55.703 --rc geninfo_unexecuted_blocks=1 00:07:55.703 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.703 ' 00:07:55.703 10:54:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:55.703 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:55.703 --rc genhtml_branch_coverage=1 00:07:55.703 --rc genhtml_function_coverage=1 00:07:55.703 --rc genhtml_legend=1 00:07:55.703 --rc geninfo_all_blocks=1 00:07:55.703 --rc geninfo_unexecuted_blocks=1 00:07:55.703 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.703 ' 00:07:55.703 10:54:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:55.703 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.703 --rc genhtml_branch_coverage=1 00:07:55.703 --rc genhtml_function_coverage=1 00:07:55.703 --rc genhtml_legend=1 00:07:55.703 --rc geninfo_all_blocks=1 00:07:55.703 --rc geninfo_unexecuted_blocks=1 00:07:55.703 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.703 ' 00:07:55.703 10:54:54 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:55.703 10:54:54 -- ../common.sh@8 -- # pids=() 00:07:55.703 10:54:54 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:55.703 10:54:54 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:55.703 10:54:54 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:55.703 10:54:54 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:55.703 10:54:54 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:55.703 10:54:54 -- nvmf/run.sh@61 -- # mem_size=512 00:07:55.703 10:54:54 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:55.703 10:54:54 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:55.703 10:54:54 -- ../common.sh@69 -- # local fuzz_num=25 00:07:55.703 10:54:54 -- ../common.sh@70 -- # local time=1 00:07:55.703 10:54:54 -- ../common.sh@72 -- # (( i = 0 )) 00:07:55.703 10:54:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.703 10:54:54 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:55.703 10:54:54 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:55.703 10:54:54 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.703 10:54:54 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.703 10:54:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:55.703 10:54:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:55.703 10:54:54 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:55.703 10:54:54 -- nvmf/run.sh@29 -- # port=4400 00:07:55.703 10:54:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:55.703 10:54:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:55.703 10:54:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.703 10:54:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:55.703 [2024-12-16 10:54:54.233819] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 
initialization... 00:07:55.703 [2024-12-16 10:54:54.233906] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid651076 ] 00:07:55.703 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.963 [2024-12-16 10:54:54.424416] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.963 [2024-12-16 10:54:54.443526] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.963 [2024-12-16 10:54:54.443670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.963 [2024-12-16 10:54:54.494922] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.963 [2024-12-16 10:54:54.511253] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:55.963 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.963 INFO: Seed: 1929940532 00:07:55.963 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:07:55.963 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:07:55.963 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:55.963 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.963 #2 INITED exec/s: 0 rss: 59Mb 00:07:55.963 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.963 This may also happen if the target rejected all inputs we tried so far 00:07:55.963 [2024-12-16 10:54:54.556351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:55.963 [2024-12-16 10:54:54.556379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.532 NEW_FUNC[1/671]: 0x4582b8 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:56.532 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:56.532 #3 NEW cov: 11558 ft: 11559 corp: 2/65b lim: 320 exec/s: 0 rss: 66Mb L: 64/64 MS: 1 InsertRepeatedBytes- 00:07:56.532 [2024-12-16 10:54:54.877128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.532 [2024-12-16 10:54:54.877158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.532 #4 NEW cov: 11671 ft: 12151 corp: 3/129b lim: 320 exec/s: 0 rss: 66Mb L: 64/64 MS: 1 ChangeBinInt- 00:07:56.532 [2024-12-16 10:54:54.917195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.532 [2024-12-16 10:54:54.917220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.532 #5 NEW cov: 11677 ft: 12263 corp: 4/193b lim: 320 exec/s: 0 rss: 66Mb L: 64/64 MS: 1 ShuffleBytes- 00:07:56.532 [2024-12-16 10:54:54.957290] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.532 [2024-12-16 10:54:54.957315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.532 #6 NEW cov: 11762 ft: 12614 corp: 5/257b lim: 320 exec/s: 0 rss: 66Mb L: 64/64 MS: 1 ShuffleBytes- 00:07:56.532 [2024-12-16 10:54:54.997422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.532 [2024-12-16 10:54:54.997447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.532 #7 NEW cov: 11762 ft: 12700 corp: 6/321b lim: 320 exec/s: 0 rss: 66Mb L: 64/64 MS: 1 ShuffleBytes- 00:07:56.532 [2024-12-16 10:54:55.037523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.532 [2024-12-16 10:54:55.037548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.532 #8 NEW cov: 11762 ft: 12856 corp: 7/385b lim: 320 exec/s: 0 rss: 66Mb L: 64/64 MS: 1 ChangeByte- 00:07:56.532 [2024-12-16 10:54:55.077755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:4 nsid:f1f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.532 [2024-12-16 10:54:55.077780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.532 NEW_FUNC[1/1]: 0x16e1ab8 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:56.532 #10 NEW cov: 11794 ft: 13219 corp: 8/472b lim: 320 exec/s: 0 rss: 66Mb L: 87/87 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:56.532 [2024-12-16 10:54:55.117770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.532 [2024-12-16 10:54:55.117795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.532 #11 NEW cov: 11794 ft: 13225 corp: 9/536b lim: 320 exec/s: 0 rss: 66Mb L: 64/87 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:56.532 [2024-12-16 10:54:55.147891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.532 [2024-12-16 10:54:55.147915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.791 #12 NEW cov: 11794 ft: 13263 corp: 10/608b lim: 320 exec/s: 0 rss: 66Mb L: 72/87 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:56.791 [2024-12-16 10:54:55.187982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.791 [2024-12-16 10:54:55.188006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.791 #13 NEW cov: 11794 ft: 13347 corp: 11/673b lim: 320 exec/s: 0 rss: 66Mb L: 65/87 MS: 1 CrossOver- 00:07:56.791 [2024-12-16 10:54:55.218083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffff cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff 00:07:56.791 [2024-12-16 10:54:55.218108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.791 #14 NEW cov: 11794 ft: 13387 corp: 12/762b lim: 320 exec/s: 0 rss: 67Mb L: 89/89 MS: 1 InsertRepeatedBytes- 00:07:56.791 [2024-12-16 10:54:55.258200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.791 [2024-12-16 10:54:55.258225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.791 #15 NEW cov: 11794 ft: 13409 corp: 13/826b lim: 320 exec/s: 0 rss: 67Mb L: 64/89 MS: 1 ShuffleBytes- 00:07:56.791 [2024-12-16 10:54:55.298333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.791 [2024-12-16 10:54:55.298361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.791 #16 NEW cov: 11794 ft: 13423 corp: 14/890b lim: 320 exec/s: 0 rss: 67Mb L: 64/89 MS: 1 ChangeBit- 00:07:56.791 [2024-12-16 10:54:55.338413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffff0a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.791 [2024-12-16 10:54:55.338437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.791 #22 NEW cov: 11794 ft: 13533 corp: 15/956b lim: 320 exec/s: 0 rss: 67Mb L: 66/89 MS: 1 InsertByte- 00:07:56.791 [2024-12-16 10:54:55.378542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.791 [2024-12-16 10:54:55.378566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.791 #23 NEW cov: 11794 ft: 13566 corp: 16/1021b lim: 320 exec/s: 0 rss: 67Mb L: 65/89 MS: 1 InsertByte- 00:07:57.050 [2024-12-16 10:54:55.418679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffaa 00:07:57.050 [2024-12-16 10:54:55.418704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.050 #24 NEW cov: 11794 ft: 13569 corp: 17/1086b lim: 320 exec/s: 0 rss: 67Mb L: 65/89 MS: 1 InsertByte- 00:07:57.050 [2024-12-16 10:54:55.448776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:29290801 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff0affffffffff 00:07:57.050 [2024-12-16 10:54:55.448801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:57.050 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.050 #26 NEW cov: 11817 ft: 13653 corp: 18/1160b lim: 320 exec/s: 0 rss: 67Mb L: 74/89 MS: 2 CMP-CrossOver- DE: "\377\377\377\377\001\010))"- 00:07:57.050 [2024-12-16 10:54:55.488863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffff cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff 00:07:57.050 [2024-12-16 10:54:55.488887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.050 #27 NEW cov: 11817 ft: 13679 corp: 19/1249b lim: 320 exec/s: 0 rss: 67Mb L: 89/89 MS: 1 ShuffleBytes- 00:07:57.050 [2024-12-16 10:54:55.528988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:7fffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffaa 00:07:57.050 [2024-12-16 10:54:55.529012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.050 #33 NEW cov: 11817 ft: 13698 corp: 20/1314b lim: 320 exec/s: 33 rss: 67Mb L: 65/89 MS: 1 ChangeBit- 00:07:57.050 [2024-12-16 10:54:55.569126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.050 [2024-12-16 10:54:55.569151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.050 #34 NEW cov: 11817 ft: 13708 corp: 21/1433b lim: 320 exec/s: 34 rss: 67Mb L: 119/119 MS: 1 CopyPart- 00:07:57.050 [2024-12-16 10:54:55.609223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:1ffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.050 [2024-12-16 10:54:55.609247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.050 #35 NEW cov: 11817 ft: 13735 corp: 22/1530b lim: 320 exec/s: 35 rss: 67Mb L: 97/119 MS: 1 PersAutoDict- DE: "\377\377\377\377\001\010))"- 00:07:57.050 [2024-12-16 10:54:55.649354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.050 [2024-12-16 10:54:55.649378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.050 #36 NEW cov: 11817 ft: 13749 corp: 23/1602b lim: 320 exec/s: 36 rss: 67Mb L: 72/119 MS: 1 ChangeBinInt- 00:07:57.310 [2024-12-16 10:54:55.689481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.310 [2024-12-16 10:54:55.689507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.310 #37 NEW cov: 11817 ft: 13771 corp: 24/1686b lim: 320 exec/s: 37 rss: 67Mb L: 84/119 MS: 1 EraseBytes- 00:07:57.310 [2024-12-16 10:54:55.729598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:29290801 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0xffff0affffffffff 00:07:57.310 [2024-12-16 10:54:55.729627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.310 #38 NEW cov: 11817 ft: 13832 corp: 25/1760b lim: 320 exec/s: 38 rss: 67Mb L: 74/119 MS: 1 ChangeBinInt- 00:07:57.310 [2024-12-16 10:54:55.769734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.310 [2024-12-16 10:54:55.769757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.310 #39 NEW cov: 11817 ft: 13850 corp: 26/1867b lim: 320 exec/s: 39 rss: 67Mb L: 107/119 MS: 1 CopyPart- 00:07:57.310 [2024-12-16 10:54:55.809820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:29290801 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff0affffffffff 00:07:57.310 [2024-12-16 10:54:55.809845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.310 #40 NEW cov: 11817 ft: 13887 corp: 27/1941b lim: 320 exec/s: 40 rss: 67Mb L: 74/119 MS: 1 PersAutoDict- DE: "\377\377\377\377\001\010))"- 00:07:57.310 [2024-12-16 10:54:55.849942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.310 [2024-12-16 10:54:55.849969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.310 #41 NEW cov: 11817 ft: 13893 corp: 28/2005b lim: 320 exec/s: 41 rss: 67Mb L: 64/119 MS: 1 ChangeBit- 00:07:57.310 [2024-12-16 10:54:55.880025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:29290801 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff0affffffffff 00:07:57.310 [2024-12-16 10:54:55.880050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.310 #42 NEW cov: 11817 ft: 13906 corp: 29/2079b lim: 320 exec/s: 42 rss: 68Mb L: 74/119 MS: 1 ChangeBit- 00:07:57.310 [2024-12-16 10:54:55.920143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffff cdw10:00000000 cdw11:ff2a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff 00:07:57.310 [2024-12-16 10:54:55.920167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.570 #43 NEW cov: 11817 ft: 13959 corp: 30/2168b lim: 320 exec/s: 43 rss: 68Mb L: 89/119 MS: 1 ChangeByte- 00:07:57.570 [2024-12-16 10:54:55.960414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:88888888 cdw11:88888888 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.570 [2024-12-16 10:54:55.960442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.570 [2024-12-16 10:54:55.960503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (88) qid:0 cid:5 nsid:88888888 cdw10:88888888 cdw11:88888888 00:07:57.570 [2024-12-16 10:54:55.960517] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.570 #44 NEW cov: 11818 ft: 14109 corp: 31/2359b lim: 320 exec/s: 44 rss: 68Mb L: 191/191 MS: 1 InsertRepeatedBytes- 00:07:57.570 [2024-12-16 10:54:56.000380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff29290801ffff 00:07:57.570 [2024-12-16 10:54:56.000405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.570 #45 NEW cov: 11818 ft: 14123 corp: 32/2431b lim: 320 exec/s: 45 rss: 68Mb L: 72/191 MS: 1 PersAutoDict- DE: "\377\377\377\377\001\010))"- 00:07:57.570 [2024-12-16 10:54:56.040597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:4 nsid:f1f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.570 [2024-12-16 10:54:56.040626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.570 #46 NEW cov: 11818 ft: 14129 corp: 33/2518b lim: 320 exec/s: 46 rss: 68Mb L: 87/191 MS: 1 ChangeByte- 00:07:57.570 [2024-12-16 10:54:56.080623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.570 [2024-12-16 10:54:56.080647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.570 #47 NEW cov: 11818 ft: 14194 corp: 34/2613b lim: 320 exec/s: 47 rss: 68Mb L: 95/191 MS: 1 CopyPart- 00:07:57.570 [2024-12-16 10:54:56.120741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.570 [2024-12-16 10:54:56.120765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.570 #48 NEW cov: 11818 ft: 14202 corp: 35/2732b lim: 320 exec/s: 48 rss: 68Mb L: 119/191 MS: 1 CopyPart- 00:07:57.570 [2024-12-16 10:54:56.160878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffff0a cdw11:ff39ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.570 [2024-12-16 10:54:56.160902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.570 #49 NEW cov: 11818 ft: 14206 corp: 36/2799b lim: 320 exec/s: 49 rss: 68Mb L: 67/191 MS: 1 InsertByte- 00:07:57.830 [2024-12-16 10:54:56.200999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.830 [2024-12-16 10:54:56.201023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.830 #50 NEW cov: 11818 ft: 14212 corp: 37/2863b lim: 320 exec/s: 50 rss: 68Mb L: 64/191 MS: 1 ChangeBinInt- 00:07:57.830 [2024-12-16 10:54:56.231092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.830 [2024-12-16 10:54:56.231116] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.830 #51 NEW cov: 11818 ft: 14252 corp: 38/2927b lim: 320 exec/s: 51 rss: 68Mb L: 64/191 MS: 1 ShuffleBytes- 00:07:57.830 [2024-12-16 10:54:56.261183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:29290801 cdw10:ffffffff cdw11:fffff7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff0affffffffff 00:07:57.830 [2024-12-16 10:54:56.261210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.830 #52 NEW cov: 11818 ft: 14264 corp: 39/3001b lim: 320 exec/s: 52 rss: 68Mb L: 74/191 MS: 1 ChangeBit- 00:07:57.830 [2024-12-16 10:54:56.301300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.830 [2024-12-16 10:54:56.301325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.830 #53 NEW cov: 11818 ft: 14270 corp: 40/3096b lim: 320 exec/s: 53 rss: 68Mb L: 95/191 MS: 1 CopyPart- 00:07:57.830 [2024-12-16 10:54:56.341420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:fffffbff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.830 [2024-12-16 10:54:56.341445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.830 #54 NEW cov: 11818 ft: 14272 corp: 41/3203b lim: 320 exec/s: 54 rss: 68Mb L: 107/191 MS: 1 CrossOver- 00:07:57.830 [2024-12-16 10:54:56.381646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.830 [2024-12-16 10:54:56.381669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.830 [2024-12-16 10:54:56.381730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.830 [2024-12-16 10:54:56.381751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.830 #55 NEW cov: 11818 ft: 14335 corp: 42/3341b lim: 320 exec/s: 55 rss: 68Mb L: 138/191 MS: 1 CrossOver- 00:07:57.830 [2024-12-16 10:54:56.421905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffff cdw10:71717171 cdw11:71717171 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff 00:07:57.830 [2024-12-16 10:54:56.421928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.830 [2024-12-16 10:54:56.422003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:5 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.830 [2024-12-16 10:54:56.422017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.830 [2024-12-16 10:54:56.422076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:6 
nsid:71717171 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.830 [2024-12-16 10:54:56.422089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.830 #56 NEW cov: 11818 ft: 15005 corp: 43/3540b lim: 320 exec/s: 56 rss: 68Mb L: 199/199 MS: 1 InsertRepeatedBytes- 00:07:58.090 [2024-12-16 10:54:56.461749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.090 [2024-12-16 10:54:56.461774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.090 #57 NEW cov: 11818 ft: 15038 corp: 44/3624b lim: 320 exec/s: 57 rss: 68Mb L: 84/199 MS: 1 EraseBytes- 00:07:58.090 [2024-12-16 10:54:56.501850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:58.090 [2024-12-16 10:54:56.501876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.090 #58 NEW cov: 11818 ft: 15073 corp: 45/3696b lim: 320 exec/s: 58 rss: 68Mb L: 72/199 MS: 1 PersAutoDict- DE: "\377\377\377\377\001\010))"- 00:07:58.090 [2024-12-16 10:54:56.541978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:58.090 [2024-12-16 10:54:56.542003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.090 #59 NEW cov: 11818 ft: 15091 corp: 46/3768b lim: 320 exec/s: 29 rss: 68Mb L: 72/199 MS: 1 ShuffleBytes- 00:07:58.090 #59 DONE cov: 11818 ft: 15091 corp: 46/3768b lim: 320 exec/s: 29 rss: 68Mb 00:07:58.090 ###### Recommended dictionary. ###### 00:07:58.090 "\000\000\000\000\000\000\000\000" # Uses: 1 00:07:58.090 "\377\377\377\377\001\010))" # Uses: 5 00:07:58.090 ###### End of recommended dictionary. 
###### 00:07:58.090 Done 59 runs in 2 second(s) 00:07:58.090 10:54:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:58.090 10:54:56 -- ../common.sh@72 -- # (( i++ )) 00:07:58.090 10:54:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.090 10:54:56 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:58.090 10:54:56 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:58.090 10:54:56 -- nvmf/run.sh@24 -- # local timen=1 00:07:58.090 10:54:56 -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.090 10:54:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:58.090 10:54:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:58.090 10:54:56 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:58.090 10:54:56 -- nvmf/run.sh@29 -- # port=4401 00:07:58.090 10:54:56 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:58.090 10:54:56 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:58.090 10:54:56 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.090 10:54:56 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:58.350 [2024-12-16 10:54:56.726379] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:58.350 [2024-12-16 10:54:56.726469] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid651456 ] 00:07:58.350 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.350 [2024-12-16 10:54:56.911941] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.350 [2024-12-16 10:54:56.931328] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.350 [2024-12-16 10:54:56.931470] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.610 [2024-12-16 10:54:56.982893] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.610 [2024-12-16 10:54:56.999221] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:58.610 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.610 INFO: Seed: 121985587 00:07:58.610 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:07:58.610 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:07:58.610 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:58.610 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.610 #2 INITED exec/s: 0 rss: 59Mb 00:07:58.610 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
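[Aside: the transition above, where run 0 finishes with its dictionary summary, run.sh removes /tmp/fuzz_json_0.conf, and run 1 comes up listening on TCP port 4401, is one iteration of the per-handler loop that the trace keeps replaying. A minimal standalone sketch of that per-round bookkeeping follows; the paths and variable names are illustrative stand-ins, not the exact SPDK run.sh helpers:

#!/usr/bin/env bash
# One short libFuzzer round per NVMe command handler, as in the trace:
# fuzz_num comes from counting '.fn =' entries in llvm_nvme_fuzz.c (25 here),
# and each round gets its own port, config, corpus dir, and RPC socket.
fuzz_num=25
time_per_round=1   # seconds, matching '-t 1' in the log
for ((i = 0; i < fuzz_num; i++)); do
  port=$(printf '44%02d' "$i")   # 4400, 4401, ... one listener per round
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Rewrite the default trsvcid in the JSON config so rounds do not collide.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" fuzz_json.conf \
    > "/tmp/fuzz_json_$i.conf"
  mkdir -p "corpus/llvm_nvmf_$i"
  ./llvm_nvme_fuzz -m 0x1 -s 512 -F "$trid" -c "/tmp/fuzz_json_$i.conf" \
    -t "$time_per_round" -D "corpus/llvm_nvmf_$i" -Z "$i" \
    -r "/var/tmp/spdk$i.sock"
  rm -rf "/tmp/fuzz_json_$i.conf"
done

Because every round has a distinct trsvcid and a distinct RPC socket (-r /var/tmp/spdkN.sock), a target left over from one round cannot interfere with the next; -Z selects which of the 25 fuzz handlers libFuzzer exercises during its one-second window, which is why run 0 lands in fuzz_admin_command and run 1 in fuzz_admin_get_log_page_command. End of aside.]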
00:07:58.610 This may also happen if the target rejected all inputs we tried so far 00:07:58.610 [2024-12-16 10:54:57.065047] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.610 [2024-12-16 10:54:57.065226] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.610 [2024-12-16 10:54:57.065384] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.610 [2024-12-16 10:54:57.065529] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000a8b 00:07:58.610 [2024-12-16 10:54:57.065862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a81a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.610 [2024-12-16 10:54:57.065901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.610 [2024-12-16 10:54:57.066035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.610 [2024-12-16 10:54:57.066054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.610 [2024-12-16 10:54:57.066171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.610 [2024-12-16 10:54:57.066189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.610 [2024-12-16 10:54:57.066311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.610 [2024-12-16 10:54:57.066330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.869 NEW_FUNC[1/671]: 0x458bb8 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:58.869 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.869 #7 NEW cov: 11609 ft: 11623 corp: 2/25b lim: 30 exec/s: 0 rss: 66Mb L: 24/24 MS: 5 InsertByte-CrossOver-InsertByte-CopyPart-InsertRepeatedBytes- 00:07:58.869 [2024-12-16 10:54:57.385946] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.869 [2024-12-16 10:54:57.386115] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.869 [2024-12-16 10:54:57.386268] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.869 [2024-12-16 10:54:57.386422] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000a3f 00:07:58.869 [2024-12-16 10:54:57.386790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a81a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.869 [2024-12-16 10:54:57.386833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.869 [2024-12-16 10:54:57.386948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 
cid:5 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.869 [2024-12-16 10:54:57.386967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.869 [2024-12-16 10:54:57.387091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.869 [2024-12-16 10:54:57.387111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.870 [2024-12-16 10:54:57.387239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.870 [2024-12-16 10:54:57.387258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.870 #13 NEW cov: 11735 ft: 12156 corp: 3/50b lim: 30 exec/s: 0 rss: 66Mb L: 25/25 MS: 1 InsertByte- 00:07:58.870 [2024-12-16 10:54:57.435952] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:58.870 [2024-12-16 10:54:57.436128] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:58.870 [2024-12-16 10:54:57.436276] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:58.870 [2024-12-16 10:54:57.436425] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:58.870 [2024-12-16 10:54:57.436756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.870 [2024-12-16 10:54:57.436786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.870 [2024-12-16 10:54:57.436901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.870 [2024-12-16 10:54:57.436918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.870 [2024-12-16 10:54:57.437035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.870 [2024-12-16 10:54:57.437052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.870 [2024-12-16 10:54:57.437163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.870 [2024-12-16 10:54:57.437181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.870 #16 NEW cov: 11741 ft: 12492 corp: 4/77b lim: 30 exec/s: 0 rss: 66Mb L: 27/27 MS: 3 ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:07:58.870 [2024-12-16 10:54:57.475941] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.870 [2024-12-16 10:54:57.476119] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.870 [2024-12-16 10:54:57.476281] ctrlr.c:2504:nvmf_ctrlr_get_log_page: 
*ERROR*: Invalid log page offset 0x10000a9a9 00:07:58.870 [2024-12-16 10:54:57.476604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a81a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.870 [2024-12-16 10:54:57.476637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.870 [2024-12-16 10:54:57.476758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.870 [2024-12-16 10:54:57.476773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.870 [2024-12-16 10:54:57.476889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.870 [2024-12-16 10:54:57.476910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.129 #17 NEW cov: 11826 ft: 13221 corp: 5/97b lim: 30 exec/s: 0 rss: 66Mb L: 20/27 MS: 1 EraseBytes- 00:07:59.129 [2024-12-16 10:54:57.516149] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.129 [2024-12-16 10:54:57.516319] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.516457] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.516618] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.516977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.517009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.517131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.517148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.517265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.517282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.517406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.517425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.130 #18 NEW cov: 11826 ft: 13361 corp: 6/125b lim: 30 exec/s: 0 rss: 66Mb L: 28/28 MS: 1 InsertByte- 00:07:59.130 [2024-12-16 10:54:57.556283] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.556445] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.556592] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.556752] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.557093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.557122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.557241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.557258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.557374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.557394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.557516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e2e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.557533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.130 #19 NEW cov: 11826 ft: 13466 corp: 7/153b lim: 30 exec/s: 0 rss: 66Mb L: 28/28 MS: 1 ShuffleBytes- 00:07:59.130 [2024-12-16 10:54:57.596282] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:07:59.130 [2024-12-16 10:54:57.596443] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:07:59.130 [2024-12-16 10:54:57.596785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.596813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.596935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.596951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.130 #22 NEW cov: 11849 ft: 13831 corp: 8/168b lim: 30 exec/s: 0 rss: 66Mb L: 15/28 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:59.130 [2024-12-16 10:54:57.636476] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.636672] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.636851] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.637004] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid 
log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.637342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.637370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.637490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.637509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.637632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.637649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.637766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.637783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.130 #23 NEW cov: 11849 ft: 13936 corp: 9/196b lim: 30 exec/s: 0 rss: 66Mb L: 28/28 MS: 1 ShuffleBytes- 00:07:59.130 [2024-12-16 10:54:57.676513] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:07:59.130 [2024-12-16 10:54:57.676685] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:07:59.130 [2024-12-16 10:54:57.677028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.677057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.677172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.677190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.130 #24 NEW cov: 11849 ft: 13992 corp: 10/211b lim: 30 exec/s: 0 rss: 66Mb L: 15/28 MS: 1 ShuffleBytes- 00:07:59.130 [2024-12-16 10:54:57.716741] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.716905] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.717049] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.717193] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.130 [2024-12-16 10:54:57.717521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.717549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.717681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e02d2 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.717703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.717821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.717840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.130 [2024-12-16 10:54:57.717961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.130 [2024-12-16 10:54:57.717978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.130 #25 NEW cov: 11849 ft: 14101 corp: 11/238b lim: 30 exec/s: 0 rss: 66Mb L: 27/28 MS: 1 ChangeByte- 00:07:59.390 [2024-12-16 10:54:57.756488] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.390 [2024-12-16 10:54:57.756649] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.390 [2024-12-16 10:54:57.756794] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.390 [2024-12-16 10:54:57.756942] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.390 [2024-12-16 10:54:57.757277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a81a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.757306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.390 [2024-12-16 10:54:57.757430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.757448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.390 [2024-12-16 10:54:57.757567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.757583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.390 [2024-12-16 10:54:57.757709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.757726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.390 #26 NEW cov: 11849 ft: 14136 corp: 12/267b lim: 30 exec/s: 0 rss: 66Mb L: 29/29 MS: 1 CopyPart- 00:07:59.390 [2024-12-16 10:54:57.807050] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.390 [2024-12-16 10:54:57.807208] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.390 [2024-12-16 10:54:57.807359] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.390 [2024-12-16 10:54:57.807518] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.390 [2024-12-16 10:54:57.807857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.807885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.390 [2024-12-16 10:54:57.808006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.808025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.390 [2024-12-16 10:54:57.808145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.808163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.390 [2024-12-16 10:54:57.808275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e2e020f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.808290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.390 #27 NEW cov: 11849 ft: 14150 corp: 13/295b lim: 30 exec/s: 0 rss: 67Mb L: 28/29 MS: 1 ChangeBit- 00:07:59.390 [2024-12-16 10:54:57.846964] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:07:59.390 [2024-12-16 10:54:57.847144] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:07:59.390 [2024-12-16 10:54:57.847497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.847526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.390 [2024-12-16 10:54:57.847637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.390 [2024-12-16 10:54:57.847655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.390 #28 NEW cov: 11849 ft: 14216 corp: 14/311b lim: 30 exec/s: 0 rss: 67Mb L: 16/29 MS: 1 InsertByte- 00:07:59.390 [2024-12-16 10:54:57.887354] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10284) > buf size (4096) 00:07:59.391 [2024-12-16 10:54:57.887536] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.391 [2024-12-16 10:54:57.887702] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.391 [2024-12-16 10:54:57.887858] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x10000a9a9 00:07:59.391 [2024-12-16 10:54:57.888021] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000a8b 00:07:59.391 [2024-12-16 10:54:57.888357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a0004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.888386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.888506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:040481a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.888524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.888644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.888662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.888784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.888800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.888928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.888945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.391 #29 NEW cov: 11849 ft: 14314 corp: 15/341b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:59.391 [2024-12-16 10:54:57.927449] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.391 [2024-12-16 10:54:57.927630] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.391 [2024-12-16 10:54:57.927778] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.391 [2024-12-16 10:54:57.927937] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.391 [2024-12-16 10:54:57.928272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.928300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.928420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.928437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.928555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.928574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.928693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e2e020f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.928711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.391 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:59.391 #30 NEW cov: 11872 ft: 14383 corp: 16/369b lim: 30 exec/s: 0 rss: 67Mb L: 28/30 MS: 1 ChangeBit- 00:07:59.391 [2024-12-16 10:54:57.977562] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.391 [2024-12-16 10:54:57.977724] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.391 [2024-12-16 10:54:57.977877] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.391 [2024-12-16 10:54:57.978027] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000f1f1 00:07:59.391 [2024-12-16 10:54:57.978343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.978371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.978494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.978512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.978627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.978652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.391 [2024-12-16 10:54:57.978774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e2e81f7 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.391 [2024-12-16 10:54:57.978790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.391 #31 NEW cov: 11872 ft: 14397 corp: 17/397b lim: 30 exec/s: 0 rss: 67Mb L: 28/30 MS: 1 ChangeBinInt- 00:07:59.651 [2024-12-16 10:54:58.017539] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.651 [2024-12-16 10:54:58.017713] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.651 [2024-12-16 10:54:58.017864] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.651 [2024-12-16 10:54:58.018015] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.651 [2024-12-16 10:54:58.018377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a81a9 cdw11:00000001 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.018407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.018516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.018535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.018664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.018686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.018804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.018821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.651 #32 NEW cov: 11872 ft: 14419 corp: 18/426b lim: 30 exec/s: 32 rss: 67Mb L: 29/30 MS: 1 CopyPart- 00:07:59.651 [2024-12-16 10:54:58.067477] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.067656] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.067809] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.067955] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000f0e 00:07:59.651 [2024-12-16 10:54:58.068099] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003b0a 00:07:59.651 [2024-12-16 10:54:58.068436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.068466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.068584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.068606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.068730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.068748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.068828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.068844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.651 
[2024-12-16 10:54:58.068964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.068980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.651 #33 NEW cov: 11872 ft: 14456 corp: 19/456b lim: 30 exec/s: 33 rss: 67Mb L: 30/30 MS: 1 CrossOver- 00:07:59.651 [2024-12-16 10:54:58.128010] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.128177] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.128339] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.128492] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.128871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.128901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.129017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.129034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.129159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.129180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.129305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e2e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.129324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.651 #34 NEW cov: 11872 ft: 14581 corp: 20/484b lim: 30 exec/s: 34 rss: 67Mb L: 28/30 MS: 1 CrossOver- 00:07:59.651 [2024-12-16 10:54:58.167916] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.168079] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002e0e 00:07:59.651 [2024-12-16 10:54:58.168225] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e3b 00:07:59.651 [2024-12-16 10:54:58.168572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.168601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.168728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 
10:54:58.168748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.168865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.168883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.651 #35 NEW cov: 11872 ft: 14595 corp: 21/503b lim: 30 exec/s: 35 rss: 67Mb L: 19/30 MS: 1 EraseBytes- 00:07:59.651 [2024-12-16 10:54:58.208323] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.651 [2024-12-16 10:54:58.208504] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.651 [2024-12-16 10:54:58.208670] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.651 [2024-12-16 10:54:58.208822] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000ea9 00:07:59.651 [2024-12-16 10:54:58.208969] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003f8b 00:07:59.651 [2024-12-16 10:54:58.209354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a81a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.209384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.209507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.209529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.209658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.209678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.209805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.209825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.209949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:a9a902a9 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.209969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.651 #36 NEW cov: 11872 ft: 14625 corp: 22/533b lim: 30 exec/s: 36 rss: 67Mb L: 30/30 MS: 1 CrossOver- 00:07:59.651 [2024-12-16 10:54:58.257941] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.258102] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.651 [2024-12-16 10:54:58.258448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.258478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.651 [2024-12-16 10:54:58.258597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.651 [2024-12-16 10:54:58.258617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.911 #37 NEW cov: 11872 ft: 14643 corp: 23/545b lim: 30 exec/s: 37 rss: 67Mb L: 12/30 MS: 1 CrossOver- 00:07:59.911 [2024-12-16 10:54:58.298247] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.298424] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.298789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.911 [2024-12-16 10:54:58.298820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.911 [2024-12-16 10:54:58.298947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.911 [2024-12-16 10:54:58.298967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.911 #38 NEW cov: 11872 ft: 14701 corp: 24/559b lim: 30 exec/s: 38 rss: 67Mb L: 14/30 MS: 1 EraseBytes- 00:07:59.911 [2024-12-16 10:54:58.348695] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.348859] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.349019] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.349173] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.349528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.911 [2024-12-16 10:54:58.349560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.911 [2024-12-16 10:54:58.349681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.911 [2024-12-16 10:54:58.349700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.911 [2024-12-16 10:54:58.349817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.911 [2024-12-16 10:54:58.349833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.911 [2024-12-16 10:54:58.349959] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e2e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.911 [2024-12-16 10:54:58.349976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.911 #39 NEW cov: 11872 ft: 14712 corp: 25/588b lim: 30 exec/s: 39 rss: 67Mb L: 29/30 MS: 1 InsertByte- 00:07:59.911 [2024-12-16 10:54:58.388799] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.388963] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000e0e 00:07:59.911 [2024-12-16 10:54:58.389130] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.389291] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.911 [2024-12-16 10:54:58.389630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.911 [2024-12-16 10:54:58.389659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 10:54:58.389786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e830e cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.389808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 10:54:58.389955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.389973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 10:54:58.390096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e022e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.390115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.912 #40 NEW cov: 11872 ft: 14742 corp: 26/617b lim: 30 exec/s: 40 rss: 67Mb L: 29/30 MS: 1 InsertByte- 00:07:59.912 [2024-12-16 10:54:58.449055] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.912 [2024-12-16 10:54:58.449234] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.912 [2024-12-16 10:54:58.449388] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.912 [2024-12-16 10:54:58.449540] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:07:59.912 [2024-12-16 10:54:58.449703] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003f8b 00:07:59.912 [2024-12-16 10:54:58.450099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a81a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.450129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 
10:54:58.450249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.450267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 10:54:58.450389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.450408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 10:54:58.450530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.450548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 10:54:58.450687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:a9a902b8 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.450708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.912 #41 NEW cov: 11872 ft: 14775 corp: 27/647b lim: 30 exec/s: 41 rss: 67Mb L: 30/30 MS: 1 InsertByte- 00:07:59.912 [2024-12-16 10:54:58.499074] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:59.912 [2024-12-16 10:54:58.499225] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:59.912 [2024-12-16 10:54:58.499372] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:07:59.912 [2024-12-16 10:54:58.499733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8e0e830e cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.499763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 10:54:58.499888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.499906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.912 [2024-12-16 10:54:58.500029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.912 [2024-12-16 10:54:58.500047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.912 #42 NEW cov: 11872 ft: 14777 corp: 28/669b lim: 30 exec/s: 42 rss: 67Mb L: 22/30 MS: 1 InsertRepeatedBytes- 00:08:00.172 [2024-12-16 10:54:58.559316] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:08:00.172 [2024-12-16 10:54:58.559489] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 00:08:00.172 [2024-12-16 10:54:58.559656] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a9a9 
00:08:00.172 [2024-12-16 10:54:58.559806] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a90a 00:08:00.172 [2024-12-16 10:54:58.560180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:260a810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.560212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.560339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.560357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.560478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.560496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.560617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a9a981a9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.560635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.172 #43 NEW cov: 11872 ft: 14794 corp: 29/694b lim: 30 exec/s: 43 rss: 67Mb L: 25/30 MS: 1 InsertByte- 00:08:00.172 [2024-12-16 10:54:58.599413] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.172 [2024-12-16 10:54:58.599577] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.599737] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.599887] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.600263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.600292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.600373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0300020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.600391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.600513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.600530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.600657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 
[2024-12-16 10:54:58.600674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.172 #44 NEW cov: 11872 ft: 14809 corp: 30/721b lim: 30 exec/s: 44 rss: 67Mb L: 27/30 MS: 1 CMP- DE: "\377\377\377\377\377\377\003\000"- 00:08:00.172 [2024-12-16 10:54:58.639544] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.172 [2024-12-16 10:54:58.639731] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.639884] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.640034] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.640397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.640426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.640551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0300020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.640572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.640659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.640688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.640809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.640826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.172 #45 NEW cov: 11872 ft: 14825 corp: 31/748b lim: 30 exec/s: 45 rss: 68Mb L: 27/30 MS: 1 ChangeBit- 00:08:00.172 [2024-12-16 10:54:58.689032] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:08:00.172 [2024-12-16 10:54:58.689386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.689413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 #46 NEW cov: 11872 ft: 15231 corp: 32/758b lim: 30 exec/s: 46 rss: 68Mb L: 10/30 MS: 1 EraseBytes- 00:08:00.172 [2024-12-16 10:54:58.729867] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.730052] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.730202] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.730368] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.730730] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.730760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.730887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e02d2 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.730904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.731043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.731061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.731195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.731215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.172 #47 NEW cov: 11872 ft: 15239 corp: 33/785b lim: 30 exec/s: 47 rss: 68Mb L: 27/30 MS: 1 ShuffleBytes- 00:08:00.172 [2024-12-16 10:54:58.769384] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.172 [2024-12-16 10:54:58.769551] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.172 [2024-12-16 10:54:58.769725] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.172 [2024-12-16 10:54:58.770064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8e0e830e cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.770096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.770223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:001683ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.770242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.172 [2024-12-16 10:54:58.770362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.172 [2024-12-16 10:54:58.770381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.432 #48 NEW cov: 11872 ft: 15253 corp: 34/807b lim: 30 exec/s: 48 rss: 68Mb L: 22/30 MS: 1 ChangeBinInt- 00:08:00.432 [2024-12-16 10:54:58.820041] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.432 [2024-12-16 10:54:58.820215] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.432 [2024-12-16 10:54:58.820378] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003b0a 00:08:00.432 
[2024-12-16 10:54:58.820727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.432 [2024-12-16 10:54:58.820757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.432 [2024-12-16 10:54:58.820874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.432 [2024-12-16 10:54:58.820891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.432 [2024-12-16 10:54:58.821004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.432 [2024-12-16 10:54:58.821020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.432 #49 NEW cov: 11872 ft: 15273 corp: 35/825b lim: 30 exec/s: 49 rss: 68Mb L: 18/30 MS: 1 EraseBytes- 00:08:00.432 [2024-12-16 10:54:58.859978] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.432 [2024-12-16 10:54:58.860139] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.432 [2024-12-16 10:54:58.860476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.432 [2024-12-16 10:54:58.860504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:58.860620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.860639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.433 #50 NEW cov: 11872 ft: 15306 corp: 36/841b lim: 30 exec/s: 50 rss: 68Mb L: 16/30 MS: 1 EraseBytes- 00:08:00.433 [2024-12-16 10:54:58.900298] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.433 [2024-12-16 10:54:58.900468] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.433 [2024-12-16 10:54:58.900627] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.433 [2024-12-16 10:54:58.900781] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.433 [2024-12-16 10:54:58.901122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.901162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:58.901286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.901303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:08:00.433 [2024-12-16 10:54:58.901422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.901441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:58.901556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.901572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.433 #51 NEW cov: 11872 ft: 15310 corp: 37/868b lim: 30 exec/s: 51 rss: 68Mb L: 27/30 MS: 1 ShuffleBytes- 00:08:00.433 [2024-12-16 10:54:58.940348] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.433 [2024-12-16 10:54:58.940496] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.433 [2024-12-16 10:54:58.940668] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.433 [2024-12-16 10:54:58.940812] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003d3b 00:08:00.433 [2024-12-16 10:54:58.941186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.941215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:58.941331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.941349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:58.941471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.941490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:58.941608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.941629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.433 #52 NEW cov: 11872 ft: 15314 corp: 38/897b lim: 30 exec/s: 52 rss: 68Mb L: 29/30 MS: 1 CopyPart- 00:08:00.433 [2024-12-16 10:54:58.980412] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:08:00.433 [2024-12-16 10:54:58.980590] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x745b 00:08:00.433 [2024-12-16 10:54:58.980742] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x5be2 00:08:00.433 [2024-12-16 10:54:58.981070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 
[2024-12-16 10:54:58.981099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:58.981215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.981236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:58.981351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:58.981368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.433 #53 NEW cov: 11872 ft: 15344 corp: 39/915b lim: 30 exec/s: 53 rss: 68Mb L: 18/30 MS: 1 CopyPart- 00:08:00.433 [2024-12-16 10:54:59.020432] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:08:00.433 [2024-12-16 10:54:59.020594] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119252) > buf size (4096) 00:08:00.433 [2024-12-16 10:54:59.020956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:59.020985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.433 [2024-12-16 10:54:59.021100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:74740074 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.433 [2024-12-16 10:54:59.021116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.433 #54 NEW cov: 11872 ft: 15350 corp: 40/930b lim: 30 exec/s: 54 rss: 68Mb L: 15/30 MS: 1 CrossOver- 00:08:00.693 [2024-12-16 10:54:59.060675] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:08:00.693 [2024-12-16 10:54:59.060830] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.693 [2024-12-16 10:54:59.060975] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:08:00.693 [2024-12-16 10:54:59.061305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8e8e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.693 [2024-12-16 10:54:59.061336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.693 [2024-12-16 10:54:59.061446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.693 [2024-12-16 10:54:59.061463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.693 [2024-12-16 10:54:59.061582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.693 [2024-12-16 10:54:59.061599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.693 #55 NEW cov: 11872 ft: 15362 corp: 41/953b lim: 30 exec/s: 27 rss: 68Mb L: 23/30 MS: 1 CrossOver- 00:08:00.693 #55 DONE cov: 11872 ft: 15362 corp: 41/953b lim: 30 exec/s: 27 rss: 68Mb 00:08:00.693 ###### Recommended dictionary. ###### 00:08:00.693 "\377\377\377\377\377\377\003\000" # Uses: 0 00:08:00.693 ###### End of recommended dictionary. ###### 00:08:00.693 Done 55 runs in 2 second(s) 00:08:00.693 10:54:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:08:00.693 10:54:59 -- ../common.sh@72 -- # (( i++ )) 00:08:00.693 10:54:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.693 10:54:59 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:00.693 10:54:59 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:08:00.693 10:54:59 -- nvmf/run.sh@24 -- # local timen=1 00:08:00.693 10:54:59 -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.693 10:54:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:00.693 10:54:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:08:00.693 10:54:59 -- nvmf/run.sh@29 -- # printf %02d 2 00:08:00.693 10:54:59 -- nvmf/run.sh@29 -- # port=4402 00:08:00.693 10:54:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:00.693 10:54:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:08:00.693 10:54:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.693 10:54:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:08:00.693 [2024-12-16 10:54:59.244710] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:00.693 [2024-12-16 10:54:59.244795] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid651910 ] 00:08:00.693 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.953 [2024-12-16 10:54:59.417935] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.953 [2024-12-16 10:54:59.436893] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.953 [2024-12-16 10:54:59.437029] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.953 [2024-12-16 10:54:59.488269] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.953 [2024-12-16 10:54:59.504585] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:08:00.953 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:00.953 INFO: Seed: 2627991426 00:08:00.953 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:00.953 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:00.953 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:00.953 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.953 #2 INITED exec/s: 0 rss: 60Mb 00:08:00.953 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.953 This may also happen if the target rejected all inputs we tried so far 00:08:00.953 [2024-12-16 10:54:59.559951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.953 [2024-12-16 10:54:59.559978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.471 NEW_FUNC[1/670]: 0x45b5d8 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:08:01.471 NEW_FUNC[2/670]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.471 #13 NEW cov: 11580 ft: 11548 corp: 2/10b lim: 35 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:01.471 [2024-12-16 10:54:59.860427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.471 [2024-12-16 10:54:59.860458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.471 #14 NEW cov: 11693 ft: 12013 corp: 3/17b lim: 35 exec/s: 0 rss: 66Mb L: 7/9 MS: 1 EraseBytes- 00:08:01.471 [2024-12-16 10:54:59.900531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f4 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.471 [2024-12-16 10:54:59.900556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.471 #18 NEW cov: 11699 ft: 12342 corp: 4/24b lim: 35 exec/s: 0 rss: 66Mb L: 7/9 MS: 4 CrossOver-ChangeBinInt-CMP-CopyPart- DE: "\364\377\377\377"- 00:08:01.471 [2024-12-16 10:54:59.940582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fb00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.471 [2024-12-16 10:54:59.940615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.471 #19 NEW cov: 11784 ft: 12576 corp: 5/31b lim: 35 exec/s: 0 rss: 66Mb L: 7/9 MS: 1 ChangeBinInt- 00:08:01.471 [2024-12-16 10:54:59.980718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fb000013 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.471 [2024-12-16 10:54:59.980743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.471 #20 NEW cov: 11784 ft: 12698 corp: 6/38b lim: 35 exec/s: 0 rss: 66Mb L: 7/9 MS: 1 ChangeBinInt- 00:08:01.471 [2024-12-16 10:55:00.020830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 
cdw10:fb00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.471 [2024-12-16 10:55:00.020856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.471 #21 NEW cov: 11784 ft: 12766 corp: 7/45b lim: 35 exec/s: 0 rss: 66Mb L: 7/9 MS: 1 CrossOver- 00:08:01.471 [2024-12-16 10:55:00.060962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f3ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.471 [2024-12-16 10:55:00.060990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.471 #22 NEW cov: 11784 ft: 12842 corp: 8/52b lim: 35 exec/s: 0 rss: 67Mb L: 7/9 MS: 1 ShuffleBytes- 00:08:01.730 [2024-12-16 10:55:00.101039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ea000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.730 [2024-12-16 10:55:00.101064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.730 #23 NEW cov: 11784 ft: 12857 corp: 9/60b lim: 35 exec/s: 0 rss: 67Mb L: 8/9 MS: 1 InsertByte- 00:08:01.730 [2024-12-16 10:55:00.141148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00f4000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.731 [2024-12-16 10:55:00.141173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.731 #24 NEW cov: 11784 ft: 12935 corp: 10/67b lim: 35 exec/s: 0 rss: 67Mb L: 7/9 MS: 1 PersAutoDict- DE: "\364\377\377\377"- 00:08:01.731 [2024-12-16 10:55:00.171209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ea000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.731 [2024-12-16 10:55:00.171233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.731 #25 NEW cov: 11784 ft: 12984 corp: 11/76b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 InsertByte- 00:08:01.731 [2024-12-16 10:55:00.211383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fb00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.731 [2024-12-16 10:55:00.211407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.731 #26 NEW cov: 11784 ft: 13023 corp: 12/83b lim: 35 exec/s: 0 rss: 67Mb L: 7/9 MS: 1 ShuffleBytes- 00:08:01.731 [2024-12-16 10:55:00.241449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0100000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.731 [2024-12-16 10:55:00.241473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.731 #27 NEW cov: 11784 ft: 13095 corp: 13/90b lim: 35 exec/s: 0 rss: 67Mb L: 7/9 MS: 1 ChangeBinInt- 00:08:01.731 [2024-12-16 10:55:00.271536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fb00000a cdw11:0000002b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.731 [2024-12-16 10:55:00.271561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.731 #28 NEW cov: 11784 ft: 13105 corp: 14/98b lim: 35 exec/s: 0 rss: 67Mb L: 8/9 MS: 1 InsertByte- 00:08:01.731 [2024-12-16 10:55:00.311639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f4 cdw11:ff002fff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.731 [2024-12-16 10:55:00.311663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.731 #29 NEW cov: 11784 ft: 13118 corp: 15/105b lim: 35 exec/s: 0 rss: 67Mb L: 7/9 MS: 1 ChangeByte- 00:08:01.731 [2024-12-16 10:55:00.351768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fb00000a cdw11:0000002b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.731 [2024-12-16 10:55:00.351793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.990 #30 NEW cov: 11784 ft: 13240 corp: 16/113b lim: 35 exec/s: 0 rss: 67Mb L: 8/9 MS: 1 ChangeBit- 00:08:01.990 [2024-12-16 10:55:00.391855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4ff000a cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.391879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.990 #36 NEW cov: 11784 ft: 13260 corp: 17/126b lim: 35 exec/s: 0 rss: 67Mb L: 13/13 MS: 1 PersAutoDict- DE: "\364\377\377\377"- 00:08:01.990 [2024-12-16 10:55:00.432232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:88880088 cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.432257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.990 [2024-12-16 10:55:00.432309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:88880088 cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.432323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.990 [2024-12-16 10:55:00.432376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:88880088 cdw11:0a008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.432389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.990 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.990 #37 NEW cov: 11807 ft: 13677 corp: 18/153b lim: 35 exec/s: 0 rss: 67Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:08:01.990 [2024-12-16 10:55:00.472195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000500f4 cdw11:0500531b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.472221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.990 [2024-12-16 10:55:00.472275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1eff0094 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 
10:55:00.472289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.990 #38 NEW cov: 11807 ft: 13899 corp: 19/168b lim: 35 exec/s: 0 rss: 67Mb L: 15/27 MS: 1 CMP- DE: "\000\005S\033\005\272\224\036"- 00:08:01.990 [2024-12-16 10:55:00.512456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.512480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.990 [2024-12-16 10:55:00.512533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.512550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.990 [2024-12-16 10:55:00.512601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:68680068 cdw11:ea000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.512618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.990 #39 NEW cov: 11807 ft: 13903 corp: 20/193b lim: 35 exec/s: 0 rss: 67Mb L: 25/27 MS: 1 InsertRepeatedBytes- 00:08:01.990 [2024-12-16 10:55:00.552325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0100000a cdw11:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.552350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.990 #40 NEW cov: 11807 ft: 13919 corp: 21/200b lim: 35 exec/s: 40 rss: 67Mb L: 7/27 MS: 1 ChangeBit- 00:08:01.990 [2024-12-16 10:55:00.592435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f3ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.990 [2024-12-16 10:55:00.592459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.250 #41 NEW cov: 11807 ft: 13942 corp: 22/211b lim: 35 exec/s: 41 rss: 67Mb L: 11/27 MS: 1 CopyPart- 00:08:02.250 [2024-12-16 10:55:00.632545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fb00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.632569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.250 #42 NEW cov: 11807 ft: 13953 corp: 23/219b lim: 35 exec/s: 42 rss: 67Mb L: 8/27 MS: 1 InsertByte- 00:08:02.250 [2024-12-16 10:55:00.662638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.662662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.250 #43 NEW cov: 11807 ft: 13961 corp: 24/231b lim: 35 exec/s: 43 rss: 67Mb L: 12/27 MS: 1 CrossOver- 00:08:02.250 [2024-12-16 10:55:00.692662] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:02.250 
[2024-12-16 10:55:00.692890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fb00000a cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.692915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.250 [2024-12-16 10:55:00.692968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:17000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.692983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.250 #44 NEW cov: 11816 ft: 14040 corp: 25/245b lim: 35 exec/s: 44 rss: 68Mb L: 14/27 MS: 1 CopyPart- 00:08:02.250 [2024-12-16 10:55:00.732851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00f4000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.732875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.250 #45 NEW cov: 11816 ft: 14057 corp: 26/256b lim: 35 exec/s: 45 rss: 68Mb L: 11/27 MS: 1 PersAutoDict- DE: "\364\377\377\377"- 00:08:02.250 [2024-12-16 10:55:00.773189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.773213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.250 [2024-12-16 10:55:00.773287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:686a0068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.773300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.250 [2024-12-16 10:55:00.773353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:68680068 cdw11:ea000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.773366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.250 #46 NEW cov: 11816 ft: 14062 corp: 27/281b lim: 35 exec/s: 46 rss: 68Mb L: 25/27 MS: 1 ChangeBit- 00:08:02.250 [2024-12-16 10:55:00.813091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f4 cdw11:ff002fff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.813115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.250 #47 NEW cov: 11816 ft: 14071 corp: 28/288b lim: 35 exec/s: 47 rss: 68Mb L: 7/27 MS: 1 ChangeBinInt- 00:08:02.250 [2024-12-16 10:55:00.853170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f4 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.250 [2024-12-16 10:55:00.853195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.250 #48 NEW cov: 11816 ft: 14126 corp: 29/298b lim: 35 exec/s: 48 rss: 68Mb L: 10/27 MS: 1 CopyPart- 00:08:02.510 [2024-12-16 
10:55:00.883394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00050075 cdw11:0500531b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:00.883419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:00.883472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1eff0094 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:00.883487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.510 #49 NEW cov: 11816 ft: 14170 corp: 30/313b lim: 35 exec/s: 49 rss: 68Mb L: 15/27 MS: 1 ChangeByte- 00:08:02.510 [2024-12-16 10:55:00.923372] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:02.510 [2024-12-16 10:55:00.923697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ea000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:00.923723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:00.923777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:5c5c0000 cdw11:5c005c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:00.923793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:00.923844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:5c5c005c cdw11:5c005c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:00.923858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.510 #50 NEW cov: 11816 ft: 14177 corp: 31/337b lim: 35 exec/s: 50 rss: 68Mb L: 24/27 MS: 1 InsertRepeatedBytes- 00:08:02.510 [2024-12-16 10:55:00.963765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fb00000a cdw11:1c00001c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:00.963789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:00.963871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1c1c001c cdw11:1c001c1c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:00.963885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:00.963935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1c1c001c cdw11:00001c1c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:00.963948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.510 NEW_FUNC[1/1]: 0x11389f8 in nvmf_ctrlr_identify_iocs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3051 00:08:02.510 #51 NEW cov: 11835 ft: 14208 corp: 32/359b lim: 35 exec/s: 51 rss: 68Mb L: 22/27 MS: 1 InsertRepeatedBytes- 
00:08:02.510 [2024-12-16 10:55:01.004018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68680068 cdw11:d2006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:01.004043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:01.004097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d2d200d2 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:01.004110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:01.004162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:01.004176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:01.004226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0a000068 cdw11:0000ea00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:01.004239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.510 #52 NEW cov: 11835 ft: 14714 corp: 33/389b lim: 35 exec/s: 52 rss: 68Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:02.510 [2024-12-16 10:55:01.043971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:01.043995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:01.044048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:01.044062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.510 [2024-12-16 10:55:01.044113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:68680068 cdw11:ea000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:01.044126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.510 #53 NEW cov: 11835 ft: 14723 corp: 34/414b lim: 35 exec/s: 53 rss: 68Mb L: 25/30 MS: 1 ShuffleBytes- 00:08:02.510 [2024-12-16 10:55:01.083858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0000e4 cdw11:0000e40a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 10:55:01.083883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.510 #57 NEW cov: 11835 ft: 14737 corp: 35/424b lim: 35 exec/s: 57 rss: 68Mb L: 10/30 MS: 4 EraseBytes-ShuffleBytes-InsertByte-CopyPart- 00:08:02.510 [2024-12-16 10:55:01.113951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a00fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.510 [2024-12-16 
10:55:01.113980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.770 #58 NEW cov: 11835 ft: 14749 corp: 36/432b lim: 35 exec/s: 58 rss: 68Mb L: 8/30 MS: 1 ShuffleBytes- 00:08:02.770 [2024-12-16 10:55:01.154051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00f4000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.154076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.770 #59 NEW cov: 11835 ft: 14779 corp: 37/440b lim: 35 exec/s: 59 rss: 68Mb L: 8/30 MS: 1 InsertByte- 00:08:02.770 [2024-12-16 10:55:01.194422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.194447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.770 [2024-12-16 10:55:01.194501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:68680068 cdw11:6a006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.194514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.770 [2024-12-16 10:55:01.194567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:68680068 cdw11:ea000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.194597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.770 #60 NEW cov: 11835 ft: 14783 corp: 38/465b lim: 35 exec/s: 60 rss: 68Mb L: 25/30 MS: 1 ShuffleBytes- 00:08:02.770 [2024-12-16 10:55:01.234545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.234570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.770 [2024-12-16 10:55:01.234625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:68680068 cdw11:68006868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.234639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.770 [2024-12-16 10:55:01.234690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:68680068 cdw11:ea000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.234703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.770 #61 NEW cov: 11835 ft: 14793 corp: 39/490b lim: 35 exec/s: 61 rss: 68Mb L: 25/30 MS: 1 ChangeBit- 00:08:02.770 [2024-12-16 10:55:01.274418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f4 cdw11:ff00f4ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.274442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.770 #62 NEW cov: 11835 
ft: 14831 corp: 40/497b lim: 35 exec/s: 62 rss: 68Mb L: 7/30 MS: 1 PersAutoDict- DE: "\364\377\377\377"- 00:08:02.770 [2024-12-16 10:55:01.314519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fbfb000a cdw11:2b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.314544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.770 #63 NEW cov: 11835 ft: 14851 corp: 41/505b lim: 35 exec/s: 63 rss: 68Mb L: 8/30 MS: 1 CopyPart- 00:08:02.770 [2024-12-16 10:55:01.354627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fbfb000a cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.770 [2024-12-16 10:55:01.354655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.770 #64 NEW cov: 11835 ft: 14861 corp: 42/517b lim: 35 exec/s: 64 rss: 68Mb L: 12/30 MS: 1 CMP- DE: "\001\000\000\227"- 00:08:03.030 [2024-12-16 10:55:01.394747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff00f4 cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.030 [2024-12-16 10:55:01.394772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.030 #65 NEW cov: 11835 ft: 14865 corp: 43/524b lim: 35 exec/s: 65 rss: 68Mb L: 7/30 MS: 1 ShuffleBytes- 00:08:03.030 [2024-12-16 10:55:01.424846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ea000a cdw11:00008300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.030 [2024-12-16 10:55:01.424870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.030 #66 NEW cov: 11835 ft: 14869 corp: 44/532b lim: 35 exec/s: 66 rss: 68Mb L: 8/30 MS: 1 ChangeByte- 00:08:03.030 [2024-12-16 10:55:01.454938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff1300f4 cdw11:0000fb00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.030 [2024-12-16 10:55:01.454963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.030 #67 NEW cov: 11835 ft: 14873 corp: 45/545b lim: 35 exec/s: 67 rss: 68Mb L: 13/30 MS: 1 CrossOver- 00:08:03.030 [2024-12-16 10:55:01.485051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00005b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.030 [2024-12-16 10:55:01.485076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.030 #68 NEW cov: 11835 ft: 14876 corp: 46/553b lim: 35 exec/s: 68 rss: 68Mb L: 8/30 MS: 1 InsertByte- 00:08:03.030 [2024-12-16 10:55:01.515134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4ff000a cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.030 [2024-12-16 10:55:01.515158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.030 #69 NEW cov: 11835 ft: 14889 corp: 47/566b lim: 35 exec/s: 34 rss: 69Mb L: 13/30 MS: 1 ChangeBit- 00:08:03.030 #69 DONE cov: 11835 ft: 14889 corp: 47/566b lim: 35 exec/s: 34 rss: 69Mb 
00:08:03.030 ###### Recommended dictionary. ###### 00:08:03.030 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:03.030 "\364\377\377\377" # Uses: 4 00:08:03.030 "\000\005S\033\005\272\224\036" # Uses: 0 00:08:03.030 "\001\000\000\227" # Uses: 0 00:08:03.030 ###### End of recommended dictionary. ###### 00:08:03.030 Done 69 runs in 2 second(s) 00:08:03.030 10:55:01 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:08:03.290 10:55:01 -- ../common.sh@72 -- # (( i++ )) 00:08:03.290 10:55:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.290 10:55:01 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:03.290 10:55:01 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:08:03.290 10:55:01 -- nvmf/run.sh@24 -- # local timen=1 00:08:03.290 10:55:01 -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.290 10:55:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:03.290 10:55:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:08:03.290 10:55:01 -- nvmf/run.sh@29 -- # printf %02d 3 00:08:03.290 10:55:01 -- nvmf/run.sh@29 -- # port=4403 00:08:03.290 10:55:01 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:03.290 10:55:01 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:08:03.290 10:55:01 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.290 10:55:01 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:08:03.290 [2024-12-16 10:55:01.696032] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:03.290 [2024-12-16 10:55:01.696098] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid652451 ] 00:08:03.290 EAL: No free 2048 kB hugepages reported on node 1 00:08:03.290 [2024-12-16 10:55:01.871719] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.290 [2024-12-16 10:55:01.891171] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:03.290 [2024-12-16 10:55:01.891296] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.549 [2024-12-16 10:55:01.942821] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.549 [2024-12-16 10:55:01.959144] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:08:03.549 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:03.549 INFO: Seed: 788018726 00:08:03.549 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:03.549 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:03.549 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:03.549 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.549 #2 INITED exec/s: 0 rss: 59Mb 00:08:03.549 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:03.549 This may also happen if the target rejected all inputs we tried so far 00:08:03.809 NEW_FUNC[1/658]: 0x45d2b8 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:08:03.809 NEW_FUNC[2/658]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.809 #17 NEW cov: 11471 ft: 11462 corp: 2/6b lim: 20 exec/s: 0 rss: 66Mb L: 5/5 MS: 5 CopyPart-ChangeByte-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:03.809 [2024-12-16 10:55:02.315337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:03.809 [2024-12-16 10:55:02.315373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.809 NEW_FUNC[1/21]: 0x113bbe8 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:08:03.809 NEW_FUNC[2/21]: 0x113c768 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:08:03.809 #19 NEW cov: 11931 ft: 12707 corp: 3/21b lim: 20 exec/s: 0 rss: 66Mb L: 15/15 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:03.809 #20 NEW cov: 11937 ft: 12987 corp: 4/34b lim: 20 exec/s: 0 rss: 66Mb L: 13/15 MS: 1 CMP- DE: "\000\005S\034\036\251\277*"- 00:08:03.809 #21 NEW cov: 12022 ft: 13320 corp: 5/48b lim: 20 exec/s: 0 rss: 66Mb L: 14/15 MS: 1 InsertByte- 00:08:04.069 #22 NEW cov: 12022 ft: 13395 corp: 6/61b lim: 20 exec/s: 0 rss: 66Mb L: 13/15 MS: 1 CrossOver- 00:08:04.069 [2024-12-16 10:55:02.485723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.069 [2024-12-16 10:55:02.485752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.069 #23 NEW cov: 12022 ft: 13592 corp: 7/76b lim: 20 exec/s: 0 rss: 66Mb L: 15/15 MS: 1 ChangeBinInt- 00:08:04.069 #24 NEW cov: 12022 ft: 13664 corp: 8/81b lim: 20 exec/s: 0 rss: 66Mb L: 5/15 MS: 1 ChangeBit- 00:08:04.069 #27 NEW cov: 12038 ft: 13862 corp: 9/101b lim: 20 exec/s: 0 rss: 66Mb L: 20/20 MS: 3 ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:04.069 #28 NEW cov: 12038 ft: 13960 corp: 10/115b lim: 20 exec/s: 0 rss: 66Mb L: 14/20 MS: 1 ChangeBinInt- 00:08:04.069 #29 NEW cov: 12038 ft: 14009 corp: 11/135b lim: 20 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 PersAutoDict- DE: "\000\005S\034\036\251\277*"- 00:08:04.329 #30 NEW cov: 12038 ft: 14031 corp: 12/140b lim: 20 exec/s: 0 rss: 67Mb L: 5/20 MS: 1 ShuffleBytes- 00:08:04.329 [2024-12-16 10:55:02.726494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.329 [2024-12-16 10:55:02.726522] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.329 #31 NEW cov: 12038 ft: 14072 corp: 13/155b lim: 20 exec/s: 0 rss: 67Mb L: 15/20 MS: 1 CopyPart- 00:08:04.329 #32 NEW cov: 12038 ft: 14171 corp: 14/170b lim: 20 exec/s: 0 rss: 67Mb L: 15/20 MS: 1 InsertByte- 00:08:04.329 [2024-12-16 10:55:02.806897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.329 [2024-12-16 10:55:02.806924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.329 #33 NEW cov: 12042 ft: 14269 corp: 15/186b lim: 20 exec/s: 0 rss: 67Mb L: 16/20 MS: 1 CopyPart- 00:08:04.329 #34 NEW cov: 12042 ft: 14309 corp: 16/206b lim: 20 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 CopyPart- 00:08:04.329 [2024-12-16 10:55:02.886940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.329 [2024-12-16 10:55:02.886966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.329 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.329 #35 NEW cov: 12065 ft: 14369 corp: 17/218b lim: 20 exec/s: 0 rss: 67Mb L: 12/20 MS: 1 EraseBytes- 00:08:04.329 #36 NEW cov: 12065 ft: 14382 corp: 18/236b lim: 20 exec/s: 0 rss: 67Mb L: 18/20 MS: 1 EraseBytes- 00:08:04.589 #37 NEW cov: 12065 ft: 14404 corp: 19/248b lim: 20 exec/s: 0 rss: 67Mb L: 12/20 MS: 1 EraseBytes- 00:08:04.589 #38 NEW cov: 12066 ft: 14622 corp: 20/259b lim: 20 exec/s: 38 rss: 67Mb L: 11/20 MS: 1 EraseBytes- 00:08:04.589 #39 NEW cov: 12066 ft: 14647 corp: 21/271b lim: 20 exec/s: 39 rss: 67Mb L: 12/20 MS: 1 InsertByte- 00:08:04.589 #40 NEW cov: 12066 ft: 14660 corp: 22/285b lim: 20 exec/s: 40 rss: 67Mb L: 14/20 MS: 1 InsertByte- 00:08:04.589 [2024-12-16 10:55:03.127660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.589 [2024-12-16 10:55:03.127687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.589 #41 NEW cov: 12066 ft: 14673 corp: 23/297b lim: 20 exec/s: 41 rss: 67Mb L: 12/20 MS: 1 ShuffleBytes- 00:08:04.589 #42 NEW cov: 12066 ft: 14689 corp: 24/315b lim: 20 exec/s: 42 rss: 67Mb L: 18/20 MS: 1 ChangeBinInt- 00:08:04.848 #43 NEW cov: 12066 ft: 14698 corp: 25/330b lim: 20 exec/s: 43 rss: 67Mb L: 15/20 MS: 1 InsertByte- 00:08:04.848 #44 NEW cov: 12066 ft: 14723 corp: 26/337b lim: 20 exec/s: 44 rss: 67Mb L: 7/20 MS: 1 EraseBytes- 00:08:04.848 #45 NEW cov: 12066 ft: 14731 corp: 27/357b lim: 20 exec/s: 45 rss: 67Mb L: 20/20 MS: 1 CrossOver- 00:08:04.848 #46 NEW cov: 12066 ft: 14810 corp: 28/369b lim: 20 exec/s: 46 rss: 67Mb L: 12/20 MS: 1 EraseBytes- 00:08:04.848 #47 NEW cov: 12066 ft: 14853 corp: 29/375b lim: 20 exec/s: 47 rss: 67Mb L: 6/20 MS: 1 InsertByte- 00:08:04.848 [2024-12-16 10:55:03.408641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.848 [2024-12-16 10:55:03.408668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.848 #48 NEW cov: 12066 ft: 
14864 corp: 30/391b lim: 20 exec/s: 48 rss: 68Mb L: 16/20 MS: 1 ShuffleBytes- 00:08:05.107 #49 NEW cov: 12066 ft: 14884 corp: 31/402b lim: 20 exec/s: 49 rss: 68Mb L: 11/20 MS: 1 EraseBytes- 00:08:05.107 #50 NEW cov: 12066 ft: 14941 corp: 32/414b lim: 20 exec/s: 50 rss: 68Mb L: 12/20 MS: 1 ChangeBinInt- 00:08:05.107 #51 NEW cov: 12066 ft: 14942 corp: 33/432b lim: 20 exec/s: 51 rss: 68Mb L: 18/20 MS: 1 CrossOver- 00:08:05.107 #52 NEW cov: 12066 ft: 14949 corp: 34/451b lim: 20 exec/s: 52 rss: 68Mb L: 19/20 MS: 1 InsertByte- 00:08:05.107 #53 NEW cov: 12066 ft: 14951 corp: 35/465b lim: 20 exec/s: 53 rss: 68Mb L: 14/20 MS: 1 InsertByte- 00:08:05.107 #54 NEW cov: 12066 ft: 14966 corp: 36/485b lim: 20 exec/s: 54 rss: 68Mb L: 20/20 MS: 1 CopyPart- 00:08:05.107 #55 NEW cov: 12066 ft: 14981 corp: 37/504b lim: 20 exec/s: 55 rss: 68Mb L: 19/20 MS: 1 ChangeByte- 00:08:05.367 #56 NEW cov: 12066 ft: 14984 corp: 38/524b lim: 20 exec/s: 56 rss: 68Mb L: 20/20 MS: 1 CrossOver- 00:08:05.367 [2024-12-16 10:55:03.779730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.367 [2024-12-16 10:55:03.779757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.367 #57 NEW cov: 12066 ft: 15026 corp: 39/540b lim: 20 exec/s: 57 rss: 68Mb L: 16/20 MS: 1 ChangeByte- 00:08:05.367 #58 NEW cov: 12066 ft: 15032 corp: 40/558b lim: 20 exec/s: 58 rss: 68Mb L: 18/20 MS: 1 ShuffleBytes- 00:08:05.367 #59 NEW cov: 12066 ft: 15043 corp: 41/578b lim: 20 exec/s: 59 rss: 68Mb L: 20/20 MS: 1 ChangeByte- 00:08:05.367 #60 NEW cov: 12066 ft: 15058 corp: 42/585b lim: 20 exec/s: 60 rss: 68Mb L: 7/20 MS: 1 ShuffleBytes- 00:08:05.367 #61 NEW cov: 12066 ft: 15094 corp: 43/605b lim: 20 exec/s: 61 rss: 68Mb L: 20/20 MS: 1 PersAutoDict- DE: "\000\005S\034\036\251\277*"- 00:08:05.626 #62 NEW cov: 12066 ft: 15098 corp: 44/623b lim: 20 exec/s: 31 rss: 68Mb L: 18/20 MS: 1 CrossOver- 00:08:05.627 #62 DONE cov: 12066 ft: 15098 corp: 44/623b lim: 20 exec/s: 31 rss: 68Mb 00:08:05.627 ###### Recommended dictionary. ###### 00:08:05.627 "\000\005S\034\036\251\277*" # Uses: 2 00:08:05.627 ###### End of recommended dictionary. 
###### 00:08:05.627 Done 62 runs in 2 second(s) 00:08:05.627 10:55:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:08:05.627 10:55:04 -- ../common.sh@72 -- # (( i++ )) 00:08:05.627 10:55:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.627 10:55:04 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:05.627 10:55:04 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:05.627 10:55:04 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.627 10:55:04 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.627 10:55:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:05.627 10:55:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:05.627 10:55:04 -- nvmf/run.sh@29 -- # printf %02d 4 00:08:05.627 10:55:04 -- nvmf/run.sh@29 -- # port=4404 00:08:05.627 10:55:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:05.627 10:55:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:05.627 10:55:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.627 10:55:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:08:05.627 [2024-12-16 10:55:04.154928] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:05.627 [2024-12-16 10:55:04.154994] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid652746 ] 00:08:05.627 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.885 [2024-12-16 10:55:04.338499] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.885 [2024-12-16 10:55:04.357964] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.885 [2024-12-16 10:55:04.358103] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.885 [2024-12-16 10:55:04.409810] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.885 [2024-12-16 10:55:04.426159] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:05.885 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.885 INFO: Seed: 3255006516 00:08:05.885 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:05.885 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:05.885 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:05.885 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.885 #2 INITED exec/s: 0 rss: 59Mb 00:08:05.885 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.885 This may also happen if the target rejected all inputs we tried so far 00:08:05.885 [2024-12-16 10:55:04.492693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.885 [2024-12-16 10:55:04.492727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.885 [2024-12-16 10:55:04.492845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.886 [2024-12-16 10:55:04.492864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.886 [2024-12-16 10:55:04.492981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.886 [2024-12-16 10:55:04.492999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.404 NEW_FUNC[1/671]: 0x45e3b8 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:06.404 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.404 #3 NEW cov: 11601 ft: 11602 corp: 2/24b lim: 35 exec/s: 0 rss: 66Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:06.404 [2024-12-16 10:55:04.813835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.404 [2024-12-16 10:55:04.813882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.404 [2024-12-16 10:55:04.814022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.404 [2024-12-16 10:55:04.814044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.404 [2024-12-16 10:55:04.814176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.404 [2024-12-16 10:55:04.814197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.404 [2024-12-16 10:55:04.814320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.404 [2024-12-16 10:55:04.814342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.404 #6 NEW cov: 11714 ft: 12460 corp: 3/56b lim: 35 exec/s: 0 rss: 66Mb L: 32/32 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:06.404 [2024-12-16 10:55:04.853020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c0a0a cdw11:68910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.404 [2024-12-16 10:55:04.853048] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.404 #8 NEW cov: 11720 ft: 13519 corp: 4/66b lim: 35 exec/s: 0 rss: 66Mb L: 10/32 MS: 2 CopyPart-CMP- DE: "_Lh\221\035S\005\000"- 00:08:06.405 [2024-12-16 10:55:04.893803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.893835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.405 [2024-12-16 10:55:04.893959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.893975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.405 [2024-12-16 10:55:04.894093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.894109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.405 [2024-12-16 10:55:04.894223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.894239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.405 #11 NEW cov: 11805 ft: 13750 corp: 5/96b lim: 35 exec/s: 0 rss: 66Mb L: 30/32 MS: 3 ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:06.405 [2024-12-16 10:55:04.933957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.933985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.405 [2024-12-16 10:55:04.934105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.934123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.405 [2024-12-16 10:55:04.934232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.934249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.405 [2024-12-16 10:55:04.934361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.934377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.405 #12 NEW cov: 11805 ft: 13819 corp: 6/126b lim: 35 exec/s: 0 rss: 66Mb L: 30/32 MS: 1 ChangeBit- 00:08:06.405 [2024-12-16 10:55:04.983431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a5f0a0a cdw11:4c910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.405 [2024-12-16 10:55:04.983460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.405 #13 NEW cov: 11805 ft: 13905 corp: 7/136b lim: 35 exec/s: 0 rss: 67Mb L: 10/32 MS: 1 CopyPart- 00:08:06.663 [2024-12-16 10:55:05.034078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.034105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.034221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.034238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.034346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.034364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.663 #14 NEW cov: 11805 ft: 13938 corp: 8/159b lim: 35 exec/s: 0 rss: 67Mb L: 23/32 MS: 1 ShuffleBytes- 00:08:06.663 [2024-12-16 10:55:05.084473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.084517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.084637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.084653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.084771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.084786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.084900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.084916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.663 #15 NEW cov: 11805 ft: 14027 corp: 9/187b lim: 35 exec/s: 0 rss: 67Mb L: 28/32 MS: 1 EraseBytes- 00:08:06.663 [2024-12-16 10:55:05.124044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.124071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.124189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.124206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.663 #16 NEW cov: 11805 ft: 14260 corp: 10/205b lim: 35 exec/s: 0 rss: 67Mb L: 18/32 MS: 1 CrossOver- 00:08:06.663 [2024-12-16 10:55:05.164465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.164492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.164622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.164641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.164755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.164774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.663 #17 NEW cov: 11805 ft: 14290 corp: 11/230b lim: 35 exec/s: 0 rss: 67Mb L: 25/32 MS: 1 CopyPart- 00:08:06.663 [2024-12-16 10:55:05.204577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.204606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.204747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:911d4c68 cdw11:53050000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.204766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.204879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.204896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.663 #18 NEW cov: 11805 ft: 14312 corp: 12/253b lim: 35 exec/s: 0 rss: 67Mb L: 23/32 MS: 1 PersAutoDict- DE: "_Lh\221\035S\005\000"- 00:08:06.663 [2024-12-16 10:55:05.244740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5e5e5e31 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.244766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.244884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 
[2024-12-16 10:55:05.244901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.245018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.245034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.663 #19 NEW cov: 11805 ft: 14355 corp: 13/277b lim: 35 exec/s: 0 rss: 67Mb L: 24/32 MS: 1 InsertByte- 00:08:06.663 [2024-12-16 10:55:05.284850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:7e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.284877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.284993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:911d4c68 cdw11:53050000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.285011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.663 [2024-12-16 10:55:05.285128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.663 [2024-12-16 10:55:05.285145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.922 #20 NEW cov: 11805 ft: 14379 corp: 14/300b lim: 35 exec/s: 0 rss: 67Mb L: 23/32 MS: 1 ChangeBit- 00:08:06.922 [2024-12-16 10:55:05.325011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4c685e5f cdw11:911d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.325038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.325148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5e5e0500 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.325163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.325277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.325293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.922 #21 NEW cov: 11805 ft: 14455 corp: 15/324b lim: 35 exec/s: 0 rss: 67Mb L: 24/32 MS: 1 PersAutoDict- DE: "_Lh\221\035S\005\000"- 00:08:06.922 [2024-12-16 10:55:05.365071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c2c2c cdw11:2cd40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.365098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.365211] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3db0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.365237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.365350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.365367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.922 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.922 #22 NEW cov: 11828 ft: 14533 corp: 16/349b lim: 35 exec/s: 0 rss: 67Mb L: 25/32 MS: 1 ChangeBinInt- 00:08:06.922 [2024-12-16 10:55:05.414950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c2c2c cdw11:2e2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.414976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.415083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.415098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.922 #23 NEW cov: 11828 ft: 14544 corp: 17/367b lim: 35 exec/s: 0 rss: 67Mb L: 18/32 MS: 1 ChangeByte- 00:08:06.922 [2024-12-16 10:55:05.455659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c5e5e cdw11:68910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.455685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.455804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e5305 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.455821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.455933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:68915f4c cdw11:1d530000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.455950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.456063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:5e5e005e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.456079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.922 #24 NEW cov: 11828 ft: 14555 corp: 18/398b lim: 35 exec/s: 24 rss: 67Mb L: 31/32 MS: 1 PersAutoDict- DE: "_Lh\221\035S\005\000"- 00:08:06.922 [2024-12-16 10:55:05.495790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c5e5e 
cdw11:68910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.495815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.495930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e5305 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.495948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.496064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:68915f4c cdw11:1d530000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.496080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.496185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:4c68005f cdw11:911d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.496201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.922 #25 NEW cov: 11828 ft: 14578 corp: 19/429b lim: 35 exec/s: 25 rss: 68Mb L: 31/32 MS: 1 PersAutoDict- DE: "_Lh\221\035S\005\000"- 00:08:06.922 [2024-12-16 10:55:05.535864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c5e5e cdw11:68910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.535890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.536002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e5305 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.536018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.536131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:68915f4c cdw11:1d530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.536147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.922 [2024-12-16 10:55:05.536264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:5e00055e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.922 [2024-12-16 10:55:05.536281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.182 #26 NEW cov: 11828 ft: 14631 corp: 20/460b lim: 35 exec/s: 26 rss: 68Mb L: 31/32 MS: 1 ShuffleBytes- 00:08:07.182 [2024-12-16 10:55:05.576041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.576068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.576184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.576200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.576319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:282c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.576334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.576446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.576461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.182 #27 NEW cov: 11828 ft: 14699 corp: 21/490b lim: 35 exec/s: 27 rss: 68Mb L: 30/32 MS: 1 ChangeBit- 00:08:07.182 [2024-12-16 10:55:05.616126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.616153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.616266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.616292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.616405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.616419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.616541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.616558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.182 #28 NEW cov: 11828 ft: 14718 corp: 22/521b lim: 35 exec/s: 28 rss: 68Mb L: 31/32 MS: 1 CopyPart- 00:08:07.182 [2024-12-16 10:55:05.656237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c5e5e cdw11:68910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.656267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.656352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e5305 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.656369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.656482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:6 nsid:0 cdw10:68915f4c cdw11:1d530003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.656497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.656613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:5e05ffff cdw11:5e5e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.656629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.182 #29 NEW cov: 11828 ft: 14726 corp: 23/555b lim: 35 exec/s: 29 rss: 68Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:07.182 [2024-12-16 10:55:05.695864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a5f0a0a cdw11:4c910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.695891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.696018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:4c68535f cdw11:911d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.696036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.182 #30 NEW cov: 11828 ft: 14779 corp: 24/573b lim: 35 exec/s: 30 rss: 68Mb L: 18/34 MS: 1 PersAutoDict- DE: "_Lh\221\035S\005\000"- 00:08:07.182 [2024-12-16 10:55:05.745797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4c685e5f cdw11:911d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.745826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.182 #31 NEW cov: 11828 ft: 14782 corp: 25/580b lim: 35 exec/s: 31 rss: 68Mb L: 7/34 MS: 1 CrossOver- 00:08:07.182 [2024-12-16 10:55:05.796684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.796714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.796857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.796874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.796992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00040000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.797009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.182 [2024-12-16 10:55:05.797078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.182 [2024-12-16 10:55:05.797093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.441 #32 NEW cov: 11828 ft: 14798 corp: 26/612b lim: 35 exec/s: 32 rss: 68Mb L: 32/34 MS: 1 ChangeBinInt- 00:08:07.441 [2024-12-16 10:55:05.846215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5e5e5e31 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.441 [2024-12-16 10:55:05.846243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.441 [2024-12-16 10:55:05.846361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.441 [2024-12-16 10:55:05.846377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.441 #33 NEW cov: 11828 ft: 14807 corp: 27/630b lim: 35 exec/s: 33 rss: 68Mb L: 18/34 MS: 1 EraseBytes- 00:08:07.441 [2024-12-16 10:55:05.886325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c5e5e cdw11:68910003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.441 [2024-12-16 10:55:05.886352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.441 [2024-12-16 10:55:05.886468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5e05ffff cdw11:5e5e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:05.886484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.442 #34 NEW cov: 11828 ft: 14817 corp: 28/650b lim: 35 exec/s: 34 rss: 68Mb L: 20/34 MS: 1 EraseBytes- 00:08:07.442 [2024-12-16 10:55:05.936763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:05.936791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.442 [2024-12-16 10:55:05.936906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:05.936923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.442 [2024-12-16 10:55:05.937040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:05.937060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.442 #35 NEW cov: 11828 ft: 14821 corp: 29/676b lim: 35 exec/s: 35 rss: 68Mb L: 26/34 MS: 1 InsertByte- 00:08:07.442 [2024-12-16 10:55:05.977131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:05.977159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.442 [2024-12-16 10:55:05.977282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO 
CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:05.977299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.442 [2024-12-16 10:55:05.977415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:05.977433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.442 [2024-12-16 10:55:05.977551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:41000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:05.977569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.442 #36 NEW cov: 11828 ft: 14831 corp: 30/708b lim: 35 exec/s: 36 rss: 68Mb L: 32/34 MS: 1 ChangeByte- 00:08:07.442 [2024-12-16 10:55:06.016832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c5e5e cdw11:68910003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:06.016861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.442 [2024-12-16 10:55:06.016979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5e5effff cdw11:005e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.442 [2024-12-16 10:55:06.016996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.442 #37 NEW cov: 11828 ft: 14845 corp: 31/726b lim: 35 exec/s: 37 rss: 68Mb L: 18/34 MS: 1 EraseBytes- 00:08:07.701 [2024-12-16 10:55:06.066643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.066672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.701 #38 NEW cov: 11828 ft: 14856 corp: 32/734b lim: 35 exec/s: 38 rss: 68Mb L: 8/34 MS: 1 InsertRepeatedBytes- 00:08:07.701 [2024-12-16 10:55:06.106817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a4c5f0a cdw11:0a910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.106844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.701 #39 NEW cov: 11828 ft: 14947 corp: 33/744b lim: 35 exec/s: 39 rss: 68Mb L: 10/34 MS: 1 ShuffleBytes- 00:08:07.701 [2024-12-16 10:55:06.147710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c5e5e cdw11:68910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.147740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.701 [2024-12-16 10:55:06.147862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e5305 cdw11:5e300002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.147878] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.701 [2024-12-16 10:55:06.148000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:68915f4c cdw11:1d530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.148017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.701 [2024-12-16 10:55:06.148083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:5e00055e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.148100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.701 #40 NEW cov: 11828 ft: 14967 corp: 34/775b lim: 35 exec/s: 40 rss: 68Mb L: 31/34 MS: 1 ChangeByte- 00:08:07.701 [2024-12-16 10:55:06.187255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.187283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.701 [2024-12-16 10:55:06.187397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:911d4c68 cdw11:53050000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.187416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.701 #41 NEW cov: 11828 ft: 14985 corp: 35/791b lim: 35 exec/s: 41 rss: 68Mb L: 16/34 MS: 1 PersAutoDict- DE: "_Lh\221\035S\005\000"- 00:08:07.701 [2024-12-16 10:55:06.237970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5f4c5e5e cdw11:68910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.701 [2024-12-16 10:55:06.237998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.701 [2024-12-16 10:55:06.238108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e5305 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.702 [2024-12-16 10:55:06.238125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.702 [2024-12-16 10:55:06.238239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:68915f4c cdw11:1d530003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.702 [2024-12-16 10:55:06.238255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.702 [2024-12-16 10:55:06.238372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:5e05ffff cdw11:5e5e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.702 [2024-12-16 10:55:06.238390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.702 #42 NEW cov: 11828 ft: 14995 corp: 36/825b lim: 35 exec/s: 42 rss: 68Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:07.702 [2024-12-16 10:55:06.277276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:911d4c68 cdw11:53050000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.702 [2024-12-16 10:55:06.277304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.702 #43 NEW cov: 11828 ft: 15027 corp: 37/832b lim: 35 exec/s: 43 rss: 68Mb L: 7/34 MS: 1 EraseBytes- 00:08:07.702 [2024-12-16 10:55:06.318142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.702 [2024-12-16 10:55:06.318169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.702 [2024-12-16 10:55:06.318291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.702 [2024-12-16 10:55:06.318310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.702 [2024-12-16 10:55:06.318421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:282c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.702 [2024-12-16 10:55:06.318439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.702 [2024-12-16 10:55:06.318550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.702 [2024-12-16 10:55:06.318568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.961 #44 NEW cov: 11828 ft: 15036 corp: 38/862b lim: 35 exec/s: 44 rss: 69Mb L: 30/34 MS: 1 ShuffleBytes- 00:08:07.961 [2024-12-16 10:55:06.368420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00005e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.368447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.962 [2024-12-16 10:55:06.368557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e0000 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.368574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.962 [2024-12-16 10:55:06.368691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:4c685e5f cdw11:911d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.368707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.962 [2024-12-16 10:55:06.368826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:5e5e0500 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.368843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.962 #45 NEW cov: 11828 ft: 15047 corp: 39/894b lim: 35 exec/s: 45 rss: 69Mb L: 32/34 MS: 1 
InsertRepeatedBytes- 00:08:07.962 [2024-12-16 10:55:06.407979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a5f0a0a cdw11:4c910000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.408007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.962 [2024-12-16 10:55:06.408131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:4c68535f cdw11:911d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.408150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.962 #46 NEW cov: 11828 ft: 15055 corp: 40/912b lim: 35 exec/s: 46 rss: 69Mb L: 18/34 MS: 1 ChangeBit- 00:08:07.962 [2024-12-16 10:55:06.448394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00005e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.448421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.962 [2024-12-16 10:55:06.448542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e0000 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.448558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.962 [2024-12-16 10:55:06.448681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:4c685e5f cdw11:911d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.962 [2024-12-16 10:55:06.448700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.962 #47 NEW cov: 11828 ft: 15072 corp: 41/937b lim: 35 exec/s: 23 rss: 69Mb L: 25/34 MS: 1 EraseBytes- 00:08:07.962 #47 DONE cov: 11828 ft: 15072 corp: 41/937b lim: 35 exec/s: 23 rss: 69Mb 00:08:07.962 ###### Recommended dictionary. ###### 00:08:07.962 "_Lh\221\035S\005\000" # Uses: 6 00:08:07.962 ###### End of recommended dictionary. 
###### 00:08:07.962 Done 47 runs in 2 second(s) 00:08:07.962 10:55:06 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:08:08.221 10:55:06 -- ../common.sh@72 -- # (( i++ )) 00:08:08.221 10:55:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.221 10:55:06 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:08.221 10:55:06 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:08.221 10:55:06 -- nvmf/run.sh@24 -- # local timen=1 00:08:08.221 10:55:06 -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.221 10:55:06 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:08.221 10:55:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:08.221 10:55:06 -- nvmf/run.sh@29 -- # printf %02d 5 00:08:08.221 10:55:06 -- nvmf/run.sh@29 -- # port=4405 00:08:08.221 10:55:06 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:08.221 10:55:06 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:08.221 10:55:06 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.221 10:55:06 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:08:08.221 [2024-12-16 10:55:06.627156] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:08.221 [2024-12-16 10:55:06.627242] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid653283 ] 00:08:08.221 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.221 [2024-12-16 10:55:06.806750] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.221 [2024-12-16 10:55:06.826004] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.221 [2024-12-16 10:55:06.826127] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.481 [2024-12-16 10:55:06.877650] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.481 [2024-12-16 10:55:06.893943] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:08.481 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.481 INFO: Seed: 1428035685 00:08:08.481 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:08.481 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:08.481 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:08.481 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.481 #2 INITED exec/s: 0 rss: 59Mb 00:08:08.481 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.481 This may also happen if the target rejected all inputs we tried so far 00:08:08.481 [2024-12-16 10:55:06.949512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.481 [2024-12-16 10:55:06.949540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.481 [2024-12-16 10:55:06.949593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.481 [2024-12-16 10:55:06.949606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.481 [2024-12-16 10:55:06.949667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.481 [2024-12-16 10:55:06.949681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.740 NEW_FUNC[1/671]: 0x460558 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:08.740 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.740 #10 NEW cov: 11612 ft: 11613 corp: 2/35b lim: 45 exec/s: 0 rss: 66Mb L: 34/34 MS: 3 CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:08.740 [2024-12-16 10:55:07.260252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.740 [2024-12-16 10:55:07.260283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.740 [2024-12-16 10:55:07.260339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.740 [2024-12-16 10:55:07.260354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.740 [2024-12-16 10:55:07.260410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.740 [2024-12-16 10:55:07.260424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.740 #14 NEW cov: 11725 ft: 12055 corp: 3/69b lim: 45 exec/s: 0 rss: 66Mb L: 34/34 MS: 4 CrossOver-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:08.740 [2024-12-16 10:55:07.300161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.740 [2024-12-16 10:55:07.300187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.740 [2024-12-16 10:55:07.300244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.740 [2024-12-16 
10:55:07.300257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.740 #15 NEW cov: 11731 ft: 12613 corp: 4/95b lim: 45 exec/s: 0 rss: 66Mb L: 26/34 MS: 1 EraseBytes- 00:08:08.740 [2024-12-16 10:55:07.340049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffddff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.740 [2024-12-16 10:55:07.340076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.000 #16 NEW cov: 11816 ft: 13635 corp: 5/112b lim: 45 exec/s: 0 rss: 66Mb L: 17/34 MS: 1 EraseBytes- 00:08:09.000 [2024-12-16 10:55:07.390536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.390562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.390673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90908e90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.390689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.390743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.390759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.000 #17 NEW cov: 11816 ft: 13733 corp: 6/146b lim: 45 exec/s: 0 rss: 66Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:09.000 [2024-12-16 10:55:07.430623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.430648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.430701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90908e90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.430715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.430767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.430780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.000 #18 NEW cov: 11816 ft: 13870 corp: 7/180b lim: 45 exec/s: 0 rss: 66Mb L: 34/34 MS: 1 ChangeByte- 00:08:09.000 [2024-12-16 10:55:07.470906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.470931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.000 
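Reading the trace pairs above: each fuzzed command is printed by nvme_qpair.c as the admin opcode (01 = Create I/O Submission Queue) plus its raw dwords, followed by the completion the target returned, here (00/01), i.e. generic status / Invalid Command Opcode. For this opcode the NVMe spec packs the queue ID into CDW10 bits 15:0 and the zero-based queue size into bits 31:16, which is why values such as cdw10:ffffdd0a are rejected. A small, hedged decode of one value taken from a trace line above:

    # Split CDW10 of a Create I/O SQ command per the NVMe spec layout:
    # bits 15:0 = QID, bits 31:16 = QSIZE (0's based).
    cdw10=0xffffdd0a                                   # sample value from the log
    printf 'QID=0x%04x QSIZE=0x%04x\n' \
        $(( cdw10 & 0xffff )) $(( (cdw10 >> 16) & 0xffff ))
    # -> QID=0xdd0a QSIZE=0xffff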
[2024-12-16 10:55:07.470986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90908e90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.471000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.471052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.471065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.471117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:90909090 cdw11:902b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.471130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.000 #19 NEW cov: 11816 ft: 14218 corp: 8/222b lim: 45 exec/s: 0 rss: 66Mb L: 42/42 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:09.000 [2024-12-16 10:55:07.521024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.521050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.521107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.521120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.521173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.521203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.521257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.521273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.000 #20 NEW cov: 11816 ft: 14271 corp: 9/262b lim: 45 exec/s: 0 rss: 66Mb L: 40/42 MS: 1 CopyPart- 00:08:09.000 [2024-12-16 10:55:07.561146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.561172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.561228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.561241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.561293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.561306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.561360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.561373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.000 #21 NEW cov: 11816 ft: 14306 corp: 10/302b lim: 45 exec/s: 0 rss: 67Mb L: 40/42 MS: 1 ChangeBit- 00:08:09.000 [2024-12-16 10:55:07.601113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.601138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.601194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.601207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.000 [2024-12-16 10:55:07.601261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.000 [2024-12-16 10:55:07.601274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.000 #22 NEW cov: 11816 ft: 14326 corp: 11/336b lim: 45 exec/s: 0 rss: 67Mb L: 34/42 MS: 1 ShuffleBytes- 00:08:09.260 [2024-12-16 10:55:07.641132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.641157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.641213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.641227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.260 #23 NEW cov: 11816 ft: 14368 corp: 12/362b lim: 45 exec/s: 0 rss: 67Mb L: 26/42 MS: 1 ChangeByte- 00:08:09.260 [2024-12-16 10:55:07.681215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.681240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.681295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 
10:55:07.681309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.260 #24 NEW cov: 11816 ft: 14451 corp: 13/387b lim: 45 exec/s: 0 rss: 67Mb L: 25/42 MS: 1 EraseBytes- 00:08:09.260 [2024-12-16 10:55:07.721465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.721489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.721544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.721558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.721616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.721629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.260 #25 NEW cov: 11816 ft: 14482 corp: 14/421b lim: 45 exec/s: 0 rss: 67Mb L: 34/42 MS: 1 ChangeBit- 00:08:09.260 [2024-12-16 10:55:07.761588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.761617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.761670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.761684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.761737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.761751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.260 #26 NEW cov: 11816 ft: 14501 corp: 15/454b lim: 45 exec/s: 0 rss: 67Mb L: 33/42 MS: 1 EraseBytes- 00:08:09.260 [2024-12-16 10:55:07.801727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:900c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.801752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.801806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.801819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.801871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) 
qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.801901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.260 #27 NEW cov: 11816 ft: 14514 corp: 16/488b lim: 45 exec/s: 0 rss: 67Mb L: 34/42 MS: 1 CMP- DE: "\014\000\000\000"- 00:08:09.260 [2024-12-16 10:55:07.841974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000dd00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.842001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.842055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.842069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.842121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.842134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.260 [2024-12-16 10:55:07.842186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.260 [2024-12-16 10:55:07.842199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.260 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.260 #28 NEW cov: 11839 ft: 14575 corp: 17/527b lim: 45 exec/s: 0 rss: 67Mb L: 39/42 MS: 1 InsertRepeatedBytes- 00:08:09.520 [2024-12-16 10:55:07.891975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.892000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:07.892055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.892069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:07.892122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0200ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.892135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.520 #29 NEW cov: 11839 ft: 14616 corp: 18/560b lim: 45 exec/s: 0 rss: 67Mb L: 33/42 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:09.520 [2024-12-16 10:55:07.932223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.932247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:07.932303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.932316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:07.932370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.932383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:07.932437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.932450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.520 #30 NEW cov: 11839 ft: 14703 corp: 19/596b lim: 45 exec/s: 30 rss: 67Mb L: 36/42 MS: 1 EraseBytes- 00:08:09.520 [2024-12-16 10:55:07.972183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.972210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:07.972280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90908e90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.972294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:07.972348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:90905b90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:07.972361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.520 #31 NEW cov: 11839 ft: 14718 corp: 20/630b lim: 45 exec/s: 31 rss: 67Mb L: 34/42 MS: 1 ChangeByte- 00:08:09.520 [2024-12-16 10:55:08.012132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:08.012156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:08.012211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:08.012225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.520 #32 NEW cov: 11839 ft: 14756 corp: 21/654b lim: 45 exec/s: 32 rss: 67Mb L: 24/42 MS: 1 EraseBytes- 00:08:09.520 [2024-12-16 10:55:08.052386] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffddff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:08.052413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:08.052467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:08.052481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.520 [2024-12-16 10:55:08.052535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.520 [2024-12-16 10:55:08.052565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.521 #33 NEW cov: 11839 ft: 14783 corp: 22/682b lim: 45 exec/s: 33 rss: 67Mb L: 28/42 MS: 1 CrossOver- 00:08:09.521 [2024-12-16 10:55:08.092197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90900a90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.521 [2024-12-16 10:55:08.092221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.521 #34 NEW cov: 11839 ft: 14803 corp: 23/698b lim: 45 exec/s: 34 rss: 67Mb L: 16/42 MS: 1 CrossOver- 00:08:09.521 [2024-12-16 10:55:08.132601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffddff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.521 [2024-12-16 10:55:08.132628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.521 [2024-12-16 10:55:08.132701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.521 [2024-12-16 10:55:08.132715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.521 [2024-12-16 10:55:08.132772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.521 [2024-12-16 10:55:08.132786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.781 #35 NEW cov: 11839 ft: 14833 corp: 24/726b lim: 45 exec/s: 35 rss: 67Mb L: 28/42 MS: 1 ShuffleBytes- 00:08:09.781 [2024-12-16 10:55:08.172898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.172922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.172976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.172989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.173042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.173055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.173109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.173121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.781 #36 NEW cov: 11839 ft: 14845 corp: 25/764b lim: 45 exec/s: 36 rss: 67Mb L: 38/42 MS: 1 CMP- DE: "\016\000\000\000"- 00:08:09.781 [2024-12-16 10:55:08.213004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:900c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.213028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.213085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90909090 cdw11:90660007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.213098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.213151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:05001f53 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.213164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.213217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.213230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.781 #37 NEW cov: 11839 ft: 14883 corp: 26/806b lim: 45 exec/s: 37 rss: 67Mb L: 42/42 MS: 1 CMP- DE: "f\367\343\342\037S\005\000"- 00:08:09.781 [2024-12-16 10:55:08.253162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.253186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.253241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.253255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.253311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9090900a cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.253324] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.253376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.253390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.781 #38 NEW cov: 11839 ft: 14888 corp: 27/843b lim: 45 exec/s: 38 rss: 67Mb L: 37/42 MS: 1 CrossOver- 00:08:09.781 [2024-12-16 10:55:08.293144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.293168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.293222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90908e90 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.293236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.293289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00900000 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.293302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.781 #39 NEW cov: 11839 ft: 14916 corp: 28/877b lim: 45 exec/s: 39 rss: 67Mb L: 34/42 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:09.781 [2024-12-16 10:55:08.333217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.333241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.333298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.333311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.333384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.333398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.781 #40 NEW cov: 11839 ft: 14924 corp: 29/911b lim: 45 exec/s: 40 rss: 67Mb L: 34/42 MS: 1 ShuffleBytes- 00:08:09.781 [2024-12-16 10:55:08.373160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.373185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.781 [2024-12-16 10:55:08.373240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.781 [2024-12-16 10:55:08.373253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.781 #41 NEW cov: 11839 ft: 14978 corp: 30/936b lim: 45 exec/s: 41 rss: 68Mb L: 25/42 MS: 1 EraseBytes- 00:08:10.041 [2024-12-16 10:55:08.413601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.413632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.413687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:c1c1c1c1 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.413701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.413756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fdffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.413769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.413822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.413836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.041 #42 NEW cov: 11839 ft: 15011 corp: 31/974b lim: 45 exec/s: 42 rss: 68Mb L: 38/42 MS: 1 InsertRepeatedBytes- 00:08:10.041 [2024-12-16 10:55:08.453408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.453434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.453489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.453502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.041 #43 NEW cov: 11839 ft: 15037 corp: 32/993b lim: 45 exec/s: 43 rss: 68Mb L: 19/42 MS: 1 InsertRepeatedBytes- 00:08:10.041 [2024-12-16 10:55:08.493854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.493878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.493933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.493946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.494000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.494013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.494068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.494081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.041 #44 NEW cov: 11839 ft: 15050 corp: 33/1031b lim: 45 exec/s: 44 rss: 68Mb L: 38/42 MS: 1 PersAutoDict- DE: "\014\000\000\000"- 00:08:10.041 [2024-12-16 10:55:08.534156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.534182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.534236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90908e90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.534263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.534317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.534330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.534385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.534398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.534452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.534465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.041 #45 NEW cov: 11839 ft: 15105 corp: 34/1076b lim: 45 exec/s: 45 rss: 68Mb L: 45/45 MS: 1 CopyPart- 00:08:10.041 [2024-12-16 10:55:08.573933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.573958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.041 [2024-12-16 10:55:08.574012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.041 [2024-12-16 10:55:08.574025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.042 [2024-12-16 10:55:08.574081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.574094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.042 #46 NEW cov: 11839 ft: 15142 corp: 35/1110b lim: 45 exec/s: 46 rss: 68Mb L: 34/45 MS: 1 ShuffleBytes- 00:08:10.042 [2024-12-16 10:55:08.614169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.614193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.042 [2024-12-16 10:55:08.614263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.614277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.042 [2024-12-16 10:55:08.614331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0553ff00 cdw11:1fe20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.614344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.042 [2024-12-16 10:55:08.614396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.614409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.042 #47 NEW cov: 11839 ft: 15156 corp: 36/1154b lim: 45 exec/s: 47 rss: 68Mb L: 44/45 MS: 1 CMP- DE: "\000\005S\037\342\343\367f"- 00:08:10.042 [2024-12-16 10:55:08.654318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.654346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.042 [2024-12-16 10:55:08.654403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90900090 cdw11:90660007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.654417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.042 [2024-12-16 10:55:08.654467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:05001f53 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.654481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.042 [2024-12-16 10:55:08.654535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.042 [2024-12-16 10:55:08.654548] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.301 #48 NEW cov: 11839 ft: 15170 corp: 37/1196b lim: 45 exec/s: 48 rss: 68Mb L: 42/45 MS: 1 ShuffleBytes- 00:08:10.301 [2024-12-16 10:55:08.694478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.301 [2024-12-16 10:55:08.694502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.694558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90908e90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.694572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.694630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.694643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.694696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:90999090 cdw11:90900001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.694709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.302 #49 NEW cov: 11839 ft: 15172 corp: 38/1239b lim: 45 exec/s: 49 rss: 68Mb L: 43/45 MS: 1 InsertByte- 00:08:10.302 [2024-12-16 10:55:08.734078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffddff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.734102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.302 #50 NEW cov: 11839 ft: 15179 corp: 39/1256b lim: 45 exec/s: 50 rss: 68Mb L: 17/45 MS: 1 ChangeBit- 00:08:10.302 [2024-12-16 10:55:08.774624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90909090 cdw11:90000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.774650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.774705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90900090 cdw11:90be0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.774718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.774770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:05001f53 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.774787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.774841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 
cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.774853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.302 #51 NEW cov: 11839 ft: 15181 corp: 40/1298b lim: 45 exec/s: 51 rss: 68Mb L: 42/45 MS: 1 ChangeByte- 00:08:10.302 [2024-12-16 10:55:08.814678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.814703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.814757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffc1c1 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.814771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.814822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.814835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.302 #52 NEW cov: 11839 ft: 15205 corp: 41/1325b lim: 45 exec/s: 52 rss: 68Mb L: 27/45 MS: 1 EraseBytes- 00:08:10.302 [2024-12-16 10:55:08.854961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.854986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.855039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.855051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.855103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.855116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.855167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff02ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.855180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.302 #53 NEW cov: 11839 ft: 15219 corp: 42/1365b lim: 45 exec/s: 53 rss: 68Mb L: 40/45 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:10.302 [2024-12-16 10:55:08.895195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffdd0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.895220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.895288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.895302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.895358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.895371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.895421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.895435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.302 [2024-12-16 10:55:08.895489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.302 [2024-12-16 10:55:08.895502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.302 #54 NEW cov: 11839 ft: 15228 corp: 43/1410b lim: 45 exec/s: 54 rss: 68Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:10.562 [2024-12-16 10:55:08.934924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:90609090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.562 [2024-12-16 10:55:08.934948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.562 [2024-12-16 10:55:08.935000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:90908e90 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.562 [2024-12-16 10:55:08.935014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.562 [2024-12-16 10:55:08.935066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:90909090 cdw11:90900004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.562 [2024-12-16 10:55:08.935079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.562 #55 NEW cov: 11839 ft: 15229 corp: 44/1444b lim: 45 exec/s: 27 rss: 68Mb L: 34/45 MS: 1 ChangeByte- 00:08:10.562 #55 DONE cov: 11839 ft: 15229 corp: 44/1444b lim: 45 exec/s: 27 rss: 68Mb 00:08:10.562 ###### Recommended dictionary. ###### 00:08:10.562 "\002\000\000\000\000\000\000\000" # Uses: 3 00:08:10.562 "\014\000\000\000" # Uses: 1 00:08:10.562 "\016\000\000\000" # Uses: 0 00:08:10.562 "f\367\343\342\037S\005\000" # Uses: 0 00:08:10.562 "\000\005S\037\342\343\367f" # Uses: 0 00:08:10.562 ###### End of recommended dictionary. 
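The "Recommended dictionary" block below is standard libFuzzer output: octal-escaped byte strings (mostly little-endian constants picked up as CMP/PersAutoDict entries) that repeatedly produced new coverage. They can be carried into later runs as a dictionary file; a sketch, assuming the SPDK wrapper forwards unrecognized flags to libFuzzer (not shown in this log):

    # libFuzzer/AFL dictionary syntax: one quoted token per line, \xNN escapes.
    # These mirror the octal strings in the block below ("\002\000..." == \x02\x00...).
    cat > nvmf.dict <<'EOF'
    "\x02\x00\x00\x00\x00\x00\x00\x00"
    "\x0c\x00\x00\x00"
    "\x0e\x00\x00\x00"
    EOF
    # A later invocation could then append -dict=nvmf.dict to the fuzzer command.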
###### 00:08:10.562 Done 55 runs in 2 second(s) 00:08:10.562 10:55:09 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:08:10.562 10:55:09 -- ../common.sh@72 -- # (( i++ )) 00:08:10.562 10:55:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.562 10:55:09 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:10.562 10:55:09 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:10.562 10:55:09 -- nvmf/run.sh@24 -- # local timen=1 00:08:10.562 10:55:09 -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.562 10:55:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:10.562 10:55:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:10.562 10:55:09 -- nvmf/run.sh@29 -- # printf %02d 6 00:08:10.562 10:55:09 -- nvmf/run.sh@29 -- # port=4406 00:08:10.562 10:55:09 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:10.562 10:55:09 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:10.562 10:55:09 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.562 10:55:09 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:08:10.562 [2024-12-16 10:55:09.103946] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:10.562 [2024-12-16 10:55:09.104034] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid653707 ] 00:08:10.562 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.821 [2024-12-16 10:55:09.281920] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.821 [2024-12-16 10:55:09.302009] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.821 [2024-12-16 10:55:09.302151] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.821 [2024-12-16 10:55:09.353482] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.821 [2024-12-16 10:55:09.369837] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:10.821 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.821 INFO: Seed: 3902052335 00:08:10.821 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:10.821 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:10.821 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:10.821 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.821 #2 INITED exec/s: 0 rss: 58Mb 00:08:10.822 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
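Each run closes with a "Done N runs in S second(s)" line, from which per-run throughput follows directly (55/2 = 27.5 exec/s for the run just completed above). A hedged extraction over a hypothetical capture of this console output, fuzz.log:

    # Coverage growth: every "#N NEW cov:" line marks an input that hit new edges.
    grep -Eo '#[0-9]+ NEW cov: [0-9]+ ft: [0-9]+' fuzz.log
    # Throughput per completed run, e.g. "55 runs / 2 s = 27.5 exec/s".
    grep -Eo 'Done [0-9]+ runs in [0-9]+ second' fuzz.log |
      awk '{ printf "%d runs / %d s = %.1f exec/s\n", $2, $5, $2 / $5 }'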
00:08:10.822 This may also happen if the target rejected all inputs we tried so far 00:08:10.822 [2024-12-16 10:55:09.415511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:10.822 [2024-12-16 10:55:09.415538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.822 [2024-12-16 10:55:09.415595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:10.822 [2024-12-16 10:55:09.415612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.822 [2024-12-16 10:55:09.415665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:10.822 [2024-12-16 10:55:09.415678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.822 [2024-12-16 10:55:09.415733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:10.822 [2024-12-16 10:55:09.415746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.822 [2024-12-16 10:55:09.415800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:10.822 [2024-12-16 10:55:09.415813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.391 NEW_FUNC[1/668]: 0x462d68 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:11.391 NEW_FUNC[2/668]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.391 #3 NEW cov: 11512 ft: 11529 corp: 2/11b lim: 10 exec/s: 0 rss: 65Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:11.391 [2024-12-16 10:55:09.725962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.725993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.726061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.726075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.726131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.726144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.391 NEW_FUNC[1/1]: 0xed84a8 in spdk_ring_dequeue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:415 00:08:11.391 #5 NEW cov: 11642 ft: 12235 corp: 3/18b lim: 10 exec/s: 0 rss: 65Mb L: 7/10 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:11.391 [2024-12-16 10:55:09.766175] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.766200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.766253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005f87 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.766266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.766318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.766331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.766385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000870a cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.766398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.391 #6 NEW cov: 11648 ft: 12495 corp: 4/26b lim: 10 exec/s: 0 rss: 65Mb L: 8/10 MS: 1 InsertByte- 00:08:11.391 [2024-12-16 10:55:09.805988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.806012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.806065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000870a cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.806078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.391 #7 NEW cov: 11733 ft: 13034 corp: 5/30b lim: 10 exec/s: 0 rss: 65Mb L: 4/10 MS: 1 EraseBytes- 00:08:11.391 [2024-12-16 10:55:09.846360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.846384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.846438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005f83 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.846452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.846503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.846516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.846568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000870a cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.846581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.391 #8 NEW cov: 11733 ft: 13119 corp: 6/38b lim: 10 
exec/s: 0 rss: 65Mb L: 8/10 MS: 1 ChangeBinInt- 00:08:11.391 [2024-12-16 10:55:09.886095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.886122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.391 #9 NEW cov: 11733 ft: 13461 corp: 7/40b lim: 10 exec/s: 0 rss: 65Mb L: 2/10 MS: 1 CopyPart- 00:08:11.391 [2024-12-16 10:55:09.926186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.926210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.391 #10 NEW cov: 11733 ft: 13647 corp: 8/42b lim: 10 exec/s: 0 rss: 65Mb L: 2/10 MS: 1 CrossOver- 00:08:11.391 [2024-12-16 10:55:09.966686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.966710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.966763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.966776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.966829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.966842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:09.966891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a2d cdw11:00000000 00:08:11.391 [2024-12-16 10:55:09.966904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.391 #11 NEW cov: 11733 ft: 13739 corp: 9/50b lim: 10 exec/s: 0 rss: 65Mb L: 8/10 MS: 1 InsertByte- 00:08:11.391 [2024-12-16 10:55:10.006709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:10.006733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:10.006786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.391 [2024-12-16 10:55:10.006799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.391 [2024-12-16 10:55:10.006850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000877a cdw11:00000000 00:08:11.391 [2024-12-16 10:55:10.006863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.650 #12 NEW cov: 11733 ft: 13820 corp: 10/57b lim: 10 exec/s: 0 rss: 65Mb L: 7/10 MS: 1 ChangeBinInt- 00:08:11.650 [2024-12-16 10:55:10.046788] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006969 cdw11:00000000 00:08:11.650 [2024-12-16 10:55:10.046813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.650 [2024-12-16 10:55:10.046868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006969 cdw11:00000000 00:08:11.650 [2024-12-16 10:55:10.046882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.650 [2024-12-16 10:55:10.046936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000690a cdw11:00000000 00:08:11.650 [2024-12-16 10:55:10.046950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.650 #13 NEW cov: 11733 ft: 13885 corp: 11/64b lim: 10 exec/s: 0 rss: 65Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:08:11.650 [2024-12-16 10:55:10.087165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:11.650 [2024-12-16 10:55:10.087189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.650 [2024-12-16 10:55:10.087244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.650 [2024-12-16 10:55:10.087258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.650 [2024-12-16 10:55:10.087309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.650 [2024-12-16 10:55:10.087321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.650 [2024-12-16 10:55:10.087372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.650 [2024-12-16 10:55:10.087385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.650 [2024-12-16 10:55:10.087437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.650 [2024-12-16 10:55:10.087450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.650 #14 NEW cov: 11733 ft: 13921 corp: 12/74b lim: 10 exec/s: 0 rss: 65Mb L: 10/10 MS: 1 ShuffleBytes- 00:08:11.651 [2024-12-16 10:55:10.126934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004ac2 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.126959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.127028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.127042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.651 #17 NEW cov: 11733 ft: 13958 corp: 
13/79b lim: 10 exec/s: 0 rss: 65Mb L: 5/10 MS: 3 CopyPart-ChangeBit-CrossOver- 00:08:11.651 [2024-12-16 10:55:10.167393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.167418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.167469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.167483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.167536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.167550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.167601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.167618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.167671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.167684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.651 #18 NEW cov: 11733 ft: 13968 corp: 14/89b lim: 10 exec/s: 0 rss: 65Mb L: 10/10 MS: 1 CrossOver- 00:08:11.651 [2024-12-16 10:55:10.207272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.207300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.207352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c24a cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.207366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.207418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.207431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.651 #19 NEW cov: 11733 ft: 13994 corp: 15/96b lim: 10 exec/s: 0 rss: 65Mb L: 7/10 MS: 1 CrossOver- 00:08:11.651 [2024-12-16 10:55:10.247633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.247657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.247709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.247722] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.247775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.247789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.247840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.247854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.651 [2024-12-16 10:55:10.247906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e40a cdw11:00000000 00:08:11.651 [2024-12-16 10:55:10.247919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.651 #20 NEW cov: 11733 ft: 14003 corp: 16/106b lim: 10 exec/s: 0 rss: 65Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:11.911 [2024-12-16 10:55:10.277595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.277628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.277682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005f87 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.277696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.277747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000878f cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.277760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.277811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000870a cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.277825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.911 #21 NEW cov: 11733 ft: 14024 corp: 17/114b lim: 10 exec/s: 0 rss: 65Mb L: 8/10 MS: 1 ChangeBit- 00:08:11.911 [2024-12-16 10:55:10.317855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.317883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.317935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e4a9 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.317949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.318001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.318014] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.318065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.318078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.318129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e40a cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.318142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.911 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.911 #22 NEW cov: 11756 ft: 14072 corp: 18/124b lim: 10 exec/s: 0 rss: 66Mb L: 10/10 MS: 1 ChangeByte- 00:08:11.911 [2024-12-16 10:55:10.367985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.368010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.368079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.368093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.368148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.368161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.368214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002fc2 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.368228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.368283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.368296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.911 #23 NEW cov: 11756 ft: 14088 corp: 19/134b lim: 10 exec/s: 0 rss: 66Mb L: 10/10 MS: 1 ChangeByte- 00:08:11.911 [2024-12-16 10:55:10.407730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.407755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.407807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.407821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.407873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008d87 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.407886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.911 #24 NEW cov: 11756 ft: 14170 corp: 20/141b lim: 10 exec/s: 24 rss: 66Mb L: 7/10 MS: 1 ChangeBinInt- 00:08:11.911 [2024-12-16 10:55:10.448055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.448080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.448133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005f87 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.448146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.448195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008783 cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.448208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.911 [2024-12-16 10:55:10.448257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000870a cdw11:00000000 00:08:11.911 [2024-12-16 10:55:10.448270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.911 #25 NEW cov: 11756 ft: 14198 corp: 21/149b lim: 10 exec/s: 25 rss: 66Mb L: 8/10 MS: 1 ChangeBit- 00:08:11.912 [2024-12-16 10:55:10.488060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.912 [2024-12-16 10:55:10.488084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.912 [2024-12-16 10:55:10.488137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:11.912 [2024-12-16 10:55:10.488150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.912 [2024-12-16 10:55:10.488203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a2d cdw11:00000000 00:08:11.912 [2024-12-16 10:55:10.488215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.912 #26 NEW cov: 11756 ft: 14207 corp: 22/155b lim: 10 exec/s: 26 rss: 66Mb L: 6/10 MS: 1 EraseBytes- 00:08:11.912 [2024-12-16 10:55:10.528420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e47e cdw11:00000000 00:08:11.912 [2024-12-16 10:55:10.528445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.912 [2024-12-16 10:55:10.528515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e4a9 cdw11:00000000 00:08:11.912 [2024-12-16 10:55:10.528529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:11.912 [2024-12-16 10:55:10.528581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.912 [2024-12-16 10:55:10.528595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.912 [2024-12-16 10:55:10.528648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:11.912 [2024-12-16 10:55:10.528661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.912 [2024-12-16 10:55:10.528712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e40a cdw11:00000000 00:08:11.912 [2024-12-16 10:55:10.528725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.172 #27 NEW cov: 11756 ft: 14217 corp: 23/165b lim: 10 exec/s: 27 rss: 66Mb L: 10/10 MS: 1 ChangeByte- 00:08:12.172 [2024-12-16 10:55:10.568521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.568545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.568598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.568615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.568683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.568697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.568748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.568761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.568813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c6c2 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.568826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.172 #28 NEW cov: 11756 ft: 14238 corp: 24/175b lim: 10 exec/s: 28 rss: 66Mb L: 10/10 MS: 1 ChangeBit- 00:08:12.172 [2024-12-16 10:55:10.608695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e47e cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.608720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.608789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e4a9 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.608803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.608855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.608869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.608921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e4ee cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.608935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.608987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e40a cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.609000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.172 #29 NEW cov: 11756 ft: 14301 corp: 25/185b lim: 10 exec/s: 29 rss: 66Mb L: 10/10 MS: 1 ChangeBinInt- 00:08:12.172 [2024-12-16 10:55:10.648555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.648579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.648653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.648667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.648718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000877a cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.648734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.172 #30 NEW cov: 11756 ft: 14308 corp: 26/192b lim: 10 exec/s: 30 rss: 66Mb L: 7/10 MS: 1 ChangeByte- 00:08:12.172 [2024-12-16 10:55:10.688796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.688821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.688890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005f83 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.688903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.688956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.688969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.689021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000870a cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.689034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:08:12.172 #31 NEW cov: 11756 ft: 14324 corp: 27/201b lim: 10 exec/s: 31 rss: 66Mb L: 9/10 MS: 1 InsertByte- 00:08:12.172 [2024-12-16 10:55:10.728775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.728800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.728850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2ca cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.728864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.728916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.728930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.172 #32 NEW cov: 11756 ft: 14371 corp: 28/208b lim: 10 exec/s: 32 rss: 66Mb L: 7/10 MS: 1 ChangeBit- 00:08:12.172 [2024-12-16 10:55:10.768902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005f87 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.768928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.768982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.768995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.172 [2024-12-16 10:55:10.769049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000870a cdw11:00000000 00:08:12.172 [2024-12-16 10:55:10.769062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.172 #33 NEW cov: 11756 ft: 14380 corp: 29/214b lim: 10 exec/s: 33 rss: 66Mb L: 6/10 MS: 1 EraseBytes- 00:08:12.432 [2024-12-16 10:55:10.808924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.808951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.432 [2024-12-16 10:55:10.809005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004ac2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.809022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.432 #34 NEW cov: 11756 ft: 14392 corp: 30/219b lim: 10 exec/s: 34 rss: 66Mb L: 5/10 MS: 1 ShuffleBytes- 00:08:12.432 [2024-12-16 10:55:10.849285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.849309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.432 [2024-12-16 10:55:10.849364] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.849378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.432 [2024-12-16 10:55:10.849429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000cac2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.849442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.432 [2024-12-16 10:55:10.849493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.849506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.432 #35 NEW cov: 11756 ft: 14397 corp: 31/227b lim: 10 exec/s: 35 rss: 66Mb L: 8/10 MS: 1 CopyPart- 00:08:12.432 [2024-12-16 10:55:10.889468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.889493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.432 [2024-12-16 10:55:10.889545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c6c2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.889558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.432 [2024-12-16 10:55:10.889607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.889623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.432 [2024-12-16 10:55:10.889691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.432 [2024-12-16 10:55:10.889705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.432 [2024-12-16 10:55:10.889759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c6c2 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.889772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.433 #36 NEW cov: 11756 ft: 14441 corp: 32/237b lim: 10 exec/s: 36 rss: 67Mb L: 10/10 MS: 1 CopyPart- 00:08:12.433 [2024-12-16 10:55:10.929595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.929623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:10.929692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.929705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:10.929756] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.929770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:10.929818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.929830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:10.929884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e487 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.929898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.433 #37 NEW cov: 11756 ft: 14446 corp: 33/247b lim: 10 exec/s: 37 rss: 67Mb L: 10/10 MS: 1 CrossOver- 00:08:12.433 [2024-12-16 10:55:10.969465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.969489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:10.969540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2ca cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.969554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:10.969605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:10.969622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.433 #38 NEW cov: 11756 ft: 14458 corp: 34/254b lim: 10 exec/s: 38 rss: 67Mb L: 7/10 MS: 1 ShuffleBytes- 00:08:12.433 [2024-12-16 10:55:11.009737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:11.009761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:11.009829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:11.009843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:11.009895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:11.009908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:11.009961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:11.009975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.433 #39 NEW cov: 11756 ft: 14477 corp: 35/262b 
lim: 10 exec/s: 39 rss: 67Mb L: 8/10 MS: 1 ChangeBinInt- 00:08:12.433 [2024-12-16 10:55:11.049826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008587 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:11.049851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:11.049920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005f87 cdw11:00000000 00:08:12.433 [2024-12-16 10:55:11.049934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:11.049986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000878f cdw11:00000000 00:08:12.433 [2024-12-16 10:55:11.049999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.433 [2024-12-16 10:55:11.050055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000870a cdw11:00000000 00:08:12.433 [2024-12-16 10:55:11.050069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.693 #40 NEW cov: 11756 ft: 14478 corp: 36/270b lim: 10 exec/s: 40 rss: 67Mb L: 8/10 MS: 1 ChangeBit- 00:08:12.693 [2024-12-16 10:55:11.090019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.090043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.090113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005f01 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.090127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.090179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000087 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.090192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.090246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.090258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.090310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000870a cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.090323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.693 #41 NEW cov: 11756 ft: 14497 corp: 37/280b lim: 10 exec/s: 41 rss: 67Mb L: 10/10 MS: 1 CMP- DE: "\001\000"- 00:08:12.693 [2024-12-16 10:55:11.120102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c241 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.120126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.120181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c6c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.120194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.120247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.120260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.120314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.120327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.120379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c6c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.120391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.693 #42 NEW cov: 11756 ft: 14517 corp: 38/290b lim: 10 exec/s: 42 rss: 67Mb L: 10/10 MS: 1 ChangeByte- 00:08:12.693 [2024-12-16 10:55:11.160098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.160123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.160177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.160193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.160245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.160258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.160310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.160323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.693 #43 NEW cov: 11756 ft: 14530 corp: 39/299b lim: 10 exec/s: 43 rss: 67Mb L: 9/10 MS: 1 CopyPart- 00:08:12.693 [2024-12-16 10:55:11.200211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.200235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.200289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.200303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.200353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2ca cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.200366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.200418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.200430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.693 #44 NEW cov: 11756 ft: 14533 corp: 40/308b lim: 10 exec/s: 44 rss: 67Mb L: 9/10 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:12.693 [2024-12-16 10:55:11.240248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.240271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.240325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c2ca cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.240339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.240391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.240404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.693 #45 NEW cov: 11756 ft: 14538 corp: 41/315b lim: 10 exec/s: 45 rss: 67Mb L: 7/10 MS: 1 CopyPart- 00:08:12.693 [2024-12-16 10:55:11.280596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c201 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.280625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.280679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000000c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.280693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.280745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.280758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.280812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.280825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.693 [2024-12-16 10:55:11.280876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c6c2 cdw11:00000000 00:08:12.693 [2024-12-16 10:55:11.280889] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.693 #46 NEW cov: 11756 ft: 14587 corp: 42/325b lim: 10 exec/s: 46 rss: 67Mb L: 10/10 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:12.953 [2024-12-16 10:55:11.320715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.320740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.953 [2024-12-16 10:55:11.320810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.320824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.953 [2024-12-16 10:55:11.320878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.320891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.953 [2024-12-16 10:55:11.320944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.320957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.953 [2024-12-16 10:55:11.321008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e40a cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.321022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.953 #47 NEW cov: 11756 ft: 14592 corp: 43/335b lim: 10 exec/s: 47 rss: 67Mb L: 10/10 MS: 1 CrossOver- 00:08:12.953 [2024-12-16 10:55:11.350328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ac2 cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.350353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.953 #48 NEW cov: 11756 ft: 14622 corp: 44/337b lim: 10 exec/s: 48 rss: 67Mb L: 2/10 MS: 1 CrossOver- 00:08:12.953 [2024-12-16 10:55:11.390807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.390832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.953 [2024-12-16 10:55:11.390884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.390898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.953 [2024-12-16 10:55:11.390950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008787 cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.390964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.953 [2024-12-16 10:55:11.391015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0d cdw11:00000000 00:08:12.953 [2024-12-16 10:55:11.391029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.953 #49 NEW cov: 11756 ft: 14627 corp: 45/345b lim: 10 exec/s: 24 rss: 67Mb L: 8/10 MS: 1 ChangeBit- 00:08:12.953 #49 DONE cov: 11756 ft: 14627 corp: 45/345b lim: 10 exec/s: 24 rss: 67Mb 00:08:12.953 ###### Recommended dictionary. ###### 00:08:12.953 "\001\000" # Uses: 2 00:08:12.953 ###### End of recommended dictionary. ###### 00:08:12.953 Done 49 runs in 2 second(s) 00:08:12.953 10:55:11 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:08:12.953 10:55:11 -- ../common.sh@72 -- # (( i++ )) 00:08:12.953 10:55:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.953 10:55:11 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:12.953 10:55:11 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:12.953 10:55:11 -- nvmf/run.sh@24 -- # local timen=1 00:08:12.953 10:55:11 -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.953 10:55:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:12.953 10:55:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:12.953 10:55:11 -- nvmf/run.sh@29 -- # printf %02d 7 00:08:12.953 10:55:11 -- nvmf/run.sh@29 -- # port=4407 00:08:12.953 10:55:11 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:12.953 10:55:11 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:12.953 10:55:11 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.953 10:55:11 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:08:12.953 [2024-12-16 10:55:11.558382] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:12.953 [2024-12-16 10:55:11.558456] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654115 ] 00:08:13.212 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.212 [2024-12-16 10:55:11.734825] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.212 [2024-12-16 10:55:11.754099] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:13.212 [2024-12-16 10:55:11.754237] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.212 [2024-12-16 10:55:11.805445] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.212 [2024-12-16 10:55:11.821777] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:13.472 INFO: Running with entropic power schedule (0xFF, 100). 
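The run.sh lines above repeat the per-fuzzer setup seen for run 6: the fuzzer index is zero-padded (printf %02d 7 yields 07) and appended to 44 to form listener port 4407, and sed rewrites the default trsvcid 4420 in fuzz_json.conf before llvm_nvme_fuzz is launched. A minimal shell sketch of that derivation, inferred from the logged commands; the variable names and relative paths here are illustrative, not the actual run.sh source:

fuzzer_type=7
port="44$(printf '%02d' "$fuzzer_type")"    # printf yields "07" -> port 4407
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

Each instance thus gets an isolated TCP listener and its own config copy, which is why the numbered runs can execute back to back against the same target without port clashes.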
00:08:13.472 INFO: Seed: 2060079999 00:08:13.472 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:13.472 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:13.472 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:13.472 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.472 #2 INITED exec/s: 0 rss: 59Mb 00:08:13.472 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.472 This may also happen if the target rejected all inputs we tried so far 00:08:13.472 [2024-12-16 10:55:11.866965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.472 [2024-12-16 10:55:11.866995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.731 NEW_FUNC[1/668]: 0x463768 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:13.731 NEW_FUNC[2/668]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.731 #3 NEW cov: 11522 ft: 11530 corp: 2/3b lim: 10 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:08:13.731 [2024-12-16 10:55:12.187709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.731 [2024-12-16 10:55:12.187740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.731 NEW_FUNC[1/1]: 0x1967218 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:864 00:08:13.731 #4 NEW cov: 11642 ft: 12052 corp: 3/5b lim: 10 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CrossOver- 00:08:13.731 [2024-12-16 10:55:12.237950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.731 [2024-12-16 10:55:12.237977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.731 [2024-12-16 10:55:12.238030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.731 [2024-12-16 10:55:12.238043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.731 #5 NEW cov: 11648 ft: 12426 corp: 4/9b lim: 10 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 CrossOver- 00:08:13.731 [2024-12-16 10:55:12.278012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005858 cdw11:00000000 00:08:13.731 [2024-12-16 10:55:12.278038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.731 [2024-12-16 10:55:12.278092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:13.731 [2024-12-16 10:55:12.278106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.731 #6 NEW cov: 11733 ft: 12730 corp: 5/14b lim: 10 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 CopyPart- 00:08:13.731 
[2024-12-16 10:55:12.318131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:13.731 [2024-12-16 10:55:12.318155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.731 [2024-12-16 10:55:12.318207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.731 [2024-12-16 10:55:12.318221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.731 #7 NEW cov: 11733 ft: 12785 corp: 6/18b lim: 10 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:13.991 [2024-12-16 10:55:12.358152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002e0a cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.358177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.991 #8 NEW cov: 11733 ft: 12948 corp: 7/20b lim: 10 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 InsertByte- 00:08:13.991 [2024-12-16 10:55:12.398581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.398606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.398664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.398678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.398730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.398742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.398794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.398811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.991 #9 NEW cov: 11733 ft: 13295 corp: 8/28b lim: 10 exec/s: 0 rss: 66Mb L: 8/8 MS: 1 CopyPart- 00:08:13.991 [2024-12-16 10:55:12.438471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005858 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.438496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.438549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.438562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.991 #10 NEW cov: 11733 ft: 13324 corp: 9/33b lim: 10 exec/s: 0 rss: 66Mb L: 5/8 MS: 1 CrossOver- 00:08:13.991 [2024-12-16 10:55:12.478718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:00005858 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.478742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.478797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.478810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.478862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.478875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.991 #11 NEW cov: 11733 ft: 13496 corp: 10/39b lim: 10 exec/s: 0 rss: 66Mb L: 6/8 MS: 1 CrossOver- 00:08:13.991 [2024-12-16 10:55:12.518950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005858 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.518975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.519041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.519055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.519105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.519118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.519168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.519182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.991 #12 NEW cov: 11733 ft: 13558 corp: 11/47b lim: 10 exec/s: 0 rss: 66Mb L: 8/8 MS: 1 CrossOver- 00:08:13.991 [2024-12-16 10:55:12.559063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.559087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.559140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.559153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.559204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.559223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.559274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 
cdw10:0000ff03 cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.559287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.991 #13 NEW cov: 11733 ft: 13585 corp: 12/55b lim: 10 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\003"- 00:08:13.991 [2024-12-16 10:55:12.598937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.598961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.991 [2024-12-16 10:55:12.599015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000180a cdw11:00000000 00:08:13.991 [2024-12-16 10:55:12.599028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.251 #14 NEW cov: 11733 ft: 13664 corp: 13/59b lim: 10 exec/s: 0 rss: 67Mb L: 4/8 MS: 1 ChangeBit- 00:08:14.251 [2024-12-16 10:55:12.639412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.639436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.639490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.639504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.639555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.639569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.639624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.639653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.639704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff03 cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.639717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.251 #15 NEW cov: 11733 ft: 13719 corp: 14/69b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CopyPart- 00:08:14.251 [2024-12-16 10:55:12.679224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aa8 cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.679247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.679314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ad0a cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.679327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:14.251 #16 NEW cov: 11733 ft: 13729 corp: 15/73b lim: 10 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 ChangeBinInt- 00:08:14.251 [2024-12-16 10:55:12.719554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.719578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.719633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.719650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.719701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.719731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.719783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.719796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.251 #17 NEW cov: 11733 ft: 13747 corp: 16/81b lim: 10 exec/s: 0 rss: 67Mb L: 8/10 MS: 1 ShuffleBytes- 00:08:14.251 [2024-12-16 10:55:12.759477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.759502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.759554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a8ad cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.759568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.251 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.251 #18 NEW cov: 11756 ft: 13797 corp: 17/85b lim: 10 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 CopyPart- 00:08:14.251 [2024-12-16 10:55:12.799712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.799737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.799792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000180a cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.799805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.799855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000180a cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.799868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.251 #19 NEW cov: 11756 ft: 13819 corp: 18/91b lim: 10 exec/s: 0 rss: 67Mb L: 6/10 MS: 1 
CopyPart- 00:08:14.251 [2024-12-16 10:55:12.839707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.839731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.251 [2024-12-16 10:55:12.839785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005858 cdw11:00000000 00:08:14.251 [2024-12-16 10:55:12.839798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.251 #20 NEW cov: 11756 ft: 13855 corp: 19/96b lim: 10 exec/s: 20 rss: 67Mb L: 5/10 MS: 1 ShuffleBytes- 00:08:14.511 [2024-12-16 10:55:12.880198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.511 [2024-12-16 10:55:12.880223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:12.880275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.511 [2024-12-16 10:55:12.880289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:12.880345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.511 [2024-12-16 10:55:12.880358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:12.880409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.511 [2024-12-16 10:55:12.880422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:12.880475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.511 [2024-12-16 10:55:12.880488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.511 #21 NEW cov: 11756 ft: 13857 corp: 20/106b lim: 10 exec/s: 21 rss: 67Mb L: 10/10 MS: 1 CopyPart- 00:08:14.511 [2024-12-16 10:55:12.919946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002758 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:12.919970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:12.920038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:12.920051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.511 #22 NEW cov: 11756 ft: 13868 corp: 21/111b lim: 10 exec/s: 22 rss: 67Mb L: 5/10 MS: 1 ChangeByte- 00:08:14.511 [2024-12-16 10:55:12.959922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:14.511 [2024-12-16 10:55:12.959945] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.511 #23 NEW cov: 11756 ft: 13889 corp: 22/113b lim: 10 exec/s: 23 rss: 67Mb L: 2/10 MS: 1 CrossOver- 00:08:14.511 [2024-12-16 10:55:13.000384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b4b4 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.000409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.000461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b4b4 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.000474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.000527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000b40a cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.000555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.000607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000a8ad cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.000625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.511 #24 NEW cov: 11756 ft: 13931 corp: 23/122b lim: 10 exec/s: 24 rss: 67Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:08:14.511 [2024-12-16 10:55:13.040484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000c58 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.040509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.040562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.040576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.040632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.040647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.040699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.040711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.511 #25 NEW cov: 11756 ft: 13956 corp: 24/130b lim: 10 exec/s: 25 rss: 67Mb L: 8/10 MS: 1 ChangeBinInt- 00:08:14.511 [2024-12-16 10:55:13.080406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a27 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.080431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.080483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005858 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.080497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.511 #26 NEW cov: 11756 ft: 13972 corp: 25/135b lim: 10 exec/s: 26 rss: 67Mb L: 5/10 MS: 1 ShuffleBytes- 00:08:14.511 [2024-12-16 10:55:13.120740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.120764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.120818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a8ad cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.120832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.120884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000b9b9 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.120897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.511 [2024-12-16 10:55:13.120949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b9b9 cdw11:00000000 00:08:14.511 [2024-12-16 10:55:13.120962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.771 #27 NEW cov: 11756 ft: 14053 corp: 26/143b lim: 10 exec/s: 27 rss: 67Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:08:14.771 [2024-12-16 10:55:13.160914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.160938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.160991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.161005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.161059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005858 cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.161072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.161126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.161138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.771 #28 NEW cov: 11756 ft: 14078 corp: 27/152b lim: 10 exec/s: 28 rss: 68Mb L: 9/10 MS: 1 CrossOver- 00:08:14.771 [2024-12-16 10:55:13.201026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.201051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.201104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.201118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.201171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.201184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.201234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff03 cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.201247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.771 #30 NEW cov: 11756 ft: 14101 corp: 28/161b lim: 10 exec/s: 30 rss: 68Mb L: 9/10 MS: 2 EraseBytes-PersAutoDict- DE: "\377\377\377\377\377\377\377\003"- 00:08:14.771 [2024-12-16 10:55:13.240866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000587a cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.240890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.240944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a18 cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.240957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.771 #31 NEW cov: 11756 ft: 14113 corp: 29/166b lim: 10 exec/s: 31 rss: 68Mb L: 5/10 MS: 1 InsertByte- 00:08:14.771 [2024-12-16 10:55:13.281164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.281188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.281258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a82d cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.281272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.281324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000adb9 cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.281338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.281391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b9b9 cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.281405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.771 #32 NEW cov: 11756 ft: 14189 corp: 30/175b lim: 10 exec/s: 32 rss: 68Mb L: 9/10 MS: 1 InsertByte- 00:08:14.771 [2024-12-16 10:55:13.321455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:00002758 cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.321480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.321533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.321546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.321600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.321617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.321667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.321680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.321731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.321744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.771 #33 NEW cov: 11756 ft: 14197 corp: 31/185b lim: 10 exec/s: 33 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:14.771 [2024-12-16 10:55:13.361308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.361332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.361384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:08:14.771 [2024-12-16 10:55:13.361397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.771 [2024-12-16 10:55:13.361449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000580a cdw11:00000000 00:08:14.772 [2024-12-16 10:55:13.361461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.772 #34 NEW cov: 11756 ft: 14201 corp: 32/191b lim: 10 exec/s: 34 rss: 68Mb L: 6/10 MS: 1 CrossOver- 00:08:15.032 [2024-12-16 10:55:13.401544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a05 cdw11:00000000 00:08:15.032 [2024-12-16 10:55:13.401568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.032 [2024-12-16 10:55:13.401623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a82d cdw11:00000000 00:08:15.032 [2024-12-16 10:55:13.401637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.032 [2024-12-16 10:55:13.401688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 
nsid:0 cdw10:0000adb9 cdw11:00000000 00:08:15.032 [2024-12-16 10:55:13.401702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.032 [2024-12-16 10:55:13.401752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b9b9 cdw11:00000000 00:08:15.032 [2024-12-16 10:55:13.401765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.032 #35 NEW cov: 11756 ft: 14206 corp: 33/200b lim: 10 exec/s: 35 rss: 68Mb L: 9/10 MS: 1 ChangeByte- 00:08:15.032 [2024-12-16 10:55:13.441468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:15.032 [2024-12-16 10:55:13.441493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.032 [2024-12-16 10:55:13.441544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:08:15.032 [2024-12-16 10:55:13.441558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.032 #36 NEW cov: 11756 ft: 14208 corp: 34/205b lim: 10 exec/s: 36 rss: 68Mb L: 5/10 MS: 1 CrossOver- 00:08:15.032 [2024-12-16 10:55:13.481821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.481846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.481916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.481930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.481983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.481996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.482046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff58 cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.482059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.033 #37 NEW cov: 11756 ft: 14220 corp: 35/214b lim: 10 exec/s: 37 rss: 68Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:08:15.033 [2024-12-16 10:55:13.521904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000feff cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.521929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.521980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.521994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.033 
[2024-12-16 10:55:13.522045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005858 cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.522058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.522109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.522122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.033 #38 NEW cov: 11756 ft: 14234 corp: 36/223b lim: 10 exec/s: 38 rss: 68Mb L: 9/10 MS: 1 ChangeBit- 00:08:15.033 [2024-12-16 10:55:13.561789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005805 cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.561814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.561868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005858 cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.561882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.033 #39 NEW cov: 11756 ft: 14248 corp: 37/228b lim: 10 exec/s: 39 rss: 68Mb L: 5/10 MS: 1 ChangeBinInt- 00:08:15.033 [2024-12-16 10:55:13.602032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000058ff cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.602057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.602111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.602125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.602177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a58 cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.602193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.033 #40 NEW cov: 11756 ft: 14249 corp: 38/235b lim: 10 exec/s: 40 rss: 68Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:08:15.033 [2024-12-16 10:55:13.632269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fffe cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.632293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.632348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.632361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.632412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.632425] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.632479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.632493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.033 [2024-12-16 10:55:13.632539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff03 cdw11:00000000 00:08:15.033 [2024-12-16 10:55:13.632552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.033 #41 NEW cov: 11756 ft: 14279 corp: 39/245b lim: 10 exec/s: 41 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:08:15.293 [2024-12-16 10:55:13.672207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.672231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.293 [2024-12-16 10:55:13.672284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.672298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.293 [2024-12-16 10:55:13.672349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000540a cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.672362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.293 #42 NEW cov: 11756 ft: 14300 corp: 40/251b lim: 10 exec/s: 42 rss: 68Mb L: 6/10 MS: 1 ChangeBinInt- 00:08:15.293 [2024-12-16 10:55:13.712193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000180a cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.712217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.293 [2024-12-16 10:55:13.712270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.712284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.293 #43 NEW cov: 11756 ft: 14376 corp: 41/256b lim: 10 exec/s: 43 rss: 68Mb L: 5/10 MS: 1 ChangeBit- 00:08:15.293 [2024-12-16 10:55:13.752340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000180a cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.752365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.293 [2024-12-16 10:55:13.752421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.752435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.293 #44 NEW cov: 11756 ft: 14381 corp: 42/261b lim: 10 exec/s: 44 rss: 68Mb L: 5/10 MS: 1 ChangeBit- 
00:08:15.293 [2024-12-16 10:55:13.792693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.792718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.293 [2024-12-16 10:55:13.792772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a8b9 cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.792785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.293 [2024-12-16 10:55:13.792839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000b9ad cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.792852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.293 [2024-12-16 10:55:13.792904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b9b9 cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.792917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.293 #45 NEW cov: 11756 ft: 14407 corp: 43/269b lim: 10 exec/s: 45 rss: 68Mb L: 8/10 MS: 1 ShuffleBytes- 00:08:15.293 [2024-12-16 10:55:13.832539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005858 cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.832563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.293 [2024-12-16 10:55:13.832622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000580a cdw11:00000000 00:08:15.293 [2024-12-16 10:55:13.832636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.293 #46 NEW cov: 11756 ft: 14414 corp: 44/273b lim: 10 exec/s: 23 rss: 69Mb L: 4/10 MS: 1 EraseBytes- 00:08:15.293 #46 DONE cov: 11756 ft: 14414 corp: 44/273b lim: 10 exec/s: 23 rss: 69Mb 00:08:15.293 ###### Recommended dictionary. ###### 00:08:15.293 "\377\377\377\377\377\377\377\003" # Uses: 1 00:08:15.293 ###### End of recommended dictionary. 
######
00:08:15.293 Done 46 runs in 2 second(s)
00:08:15.553 10:55:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf
00:08:15.553 10:55:13 -- ../common.sh@72 -- # (( i++ ))
00:08:15.553 10:55:13 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:15.553 10:55:13 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1
00:08:15.553 10:55:13 -- nvmf/run.sh@23 -- # local fuzzer_type=8
00:08:15.553 10:55:13 -- nvmf/run.sh@24 -- # local timen=1
00:08:15.553 10:55:13 -- nvmf/run.sh@25 -- # local core=0x1
00:08:15.553 10:55:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:08:15.553 10:55:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf
00:08:15.553 10:55:13 -- nvmf/run.sh@29 -- # printf %02d 8
00:08:15.553 10:55:13 -- nvmf/run.sh@29 -- # port=4408
00:08:15.553 10:55:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:08:15.553 10:55:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408'
00:08:15.553 10:55:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:15.553 10:55:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock
00:08:15.812 [2024-12-16 10:55:14.009320] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:08:15.812 [2024-12-16 10:55:14.009410] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654648 ]
00:08:15.812 EAL: No free 2048 kB hugepages reported on node 1
00:08:15.812 [2024-12-16 10:55:14.190003] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:15.812 [2024-12-16 10:55:14.209891] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:15.812 [2024-12-16 10:55:14.210028] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:15.812 [2024-12-16 10:55:14.261587] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:15.812 [2024-12-16 10:55:14.277930] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 ***
00:08:15.812 INFO: Running with entropic power schedule (0xFF, 100).
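[Editor's note] The ../common.sh@72-73 lines in both run transitions (increment, bound check, then start_llvm_fuzz) imply a C-style driver loop over the fuzzer types. A sketch of that loop, under the same caveat as above: it is inferred from the trace, and fuzz_num's value never appears in this log, so it is left abstract.

    # Driver loop implied by the ../common.sh@72-73 trace lines (a sketch,
    # not the verbatim script). fuzz_num is the number of nvmf fuzzer types;
    # its value is not shown anywhere in this log.
    for (( i = 0; i < fuzz_num; i++ )); do
        start_llvm_fuzz "$i" 1 0x1   # args: fuzzer type, run time in seconds, core mask
    done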
00:08:15.812 INFO: Seed: 222112832 00:08:15.812 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:15.812 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:15.812 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:15.812 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.812 [2024-12-16 10:55:14.323026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.812 [2024-12-16 10:55:14.323051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.812 #2 INITED cov: 11525 ft: 11556 corp: 1/1b exec/s: 0 rss: 65Mb 00:08:15.812 [2024-12-16 10:55:14.353005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.812 [2024-12-16 10:55:14.353030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.072 NEW_FUNC[1/3]: 0x1c76ac8 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:797 00:08:16.072 NEW_FUNC[2/3]: 0x1c76c68 in spdk_thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1153 00:08:16.072 #3 NEW cov: 11670 ft: 11973 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:08:16.072 [2024-12-16 10:55:14.664583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.072 [2024-12-16 10:55:14.664674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.072 [2024-12-16 10:55:14.664792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.072 [2024-12-16 10:55:14.664833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.331 #4 NEW cov: 11676 ft: 13225 corp: 3/4b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CrossOver- 00:08:16.331 [2024-12-16 10:55:14.724212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.331 [2024-12-16 10:55:14.724239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.331 [2024-12-16 10:55:14.724293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.331 [2024-12-16 10:55:14.724307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.331 #5 NEW cov: 11761 ft: 13511 corp: 4/6b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:08:16.331 [2024-12-16 10:55:14.764279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:16.331 [2024-12-16 10:55:14.764305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.332 [2024-12-16 10:55:14.764361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.764375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.332 #6 NEW cov: 11761 ft: 13620 corp: 5/8b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:08:16.332 [2024-12-16 10:55:14.804542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.804566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.332 [2024-12-16 10:55:14.804623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.804653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.332 [2024-12-16 10:55:14.804710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.804723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.332 #7 NEW cov: 11761 ft: 13910 corp: 6/11b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 CopyPart- 00:08:16.332 [2024-12-16 10:55:14.844652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.844676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.332 [2024-12-16 10:55:14.844747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.844761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.332 [2024-12-16 10:55:14.844817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.844830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.332 #8 NEW cov: 11761 ft: 13975 corp: 7/14b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 InsertByte- 00:08:16.332 [2024-12-16 10:55:14.884797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.884822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.332 [2024-12-16 10:55:14.884894] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.884908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.332 [2024-12-16 10:55:14.884963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.884976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.332 #9 NEW cov: 11761 ft: 13997 corp: 8/17b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 ShuffleBytes- 00:08:16.332 [2024-12-16 10:55:14.924764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.924789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.332 [2024-12-16 10:55:14.924845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.332 [2024-12-16 10:55:14.924859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.332 #10 NEW cov: 11761 ft: 14017 corp: 9/19b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 InsertByte- 00:08:16.592 [2024-12-16 10:55:14.965063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:14.965088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.592 [2024-12-16 10:55:14.965146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:14.965159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.592 [2024-12-16 10:55:14.965213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:14.965228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.592 #11 NEW cov: 11761 ft: 14068 corp: 10/22b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 ChangeByte- 00:08:16.592 [2024-12-16 10:55:15.005154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.005180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.592 [2024-12-16 10:55:15.005236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.005250] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.592 [2024-12-16 10:55:15.005308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.005322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.592 #12 NEW cov: 11761 ft: 14167 corp: 11/25b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 ChangeBit- 00:08:16.592 [2024-12-16 10:55:15.045125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.045151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.592 [2024-12-16 10:55:15.045207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.045221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.592 #13 NEW cov: 11761 ft: 14197 corp: 12/27b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 ShuffleBytes- 00:08:16.592 [2024-12-16 10:55:15.085078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.085103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.592 #14 NEW cov: 11761 ft: 14263 corp: 13/28b lim: 5 exec/s: 0 rss: 66Mb L: 1/3 MS: 1 EraseBytes- 00:08:16.592 [2024-12-16 10:55:15.125358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.125383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.592 [2024-12-16 10:55:15.125454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.125469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.592 #15 NEW cov: 11761 ft: 14282 corp: 14/30b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 ShuffleBytes- 00:08:16.592 [2024-12-16 10:55:15.165626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.165651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.592 [2024-12-16 10:55:15.165709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.165734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:16.592 [2024-12-16 10:55:15.165788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.165801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.592 #16 NEW cov: 11761 ft: 14330 corp: 15/33b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 ChangeByte- 00:08:16.592 [2024-12-16 10:55:15.205459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.592 [2024-12-16 10:55:15.205484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.851 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.851 #17 NEW cov: 11784 ft: 14367 corp: 16/34b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ShuffleBytes- 00:08:16.851 [2024-12-16 10:55:15.245714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.245740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.245796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.245809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.851 #18 NEW cov: 11784 ft: 14406 corp: 17/36b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 EraseBytes- 00:08:16.851 [2024-12-16 10:55:15.285860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.285886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.285947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.285960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.851 #19 NEW cov: 11784 ft: 14415 corp: 18/38b lim: 5 exec/s: 19 rss: 67Mb L: 2/3 MS: 1 ChangeBit- 00:08:16.851 [2024-12-16 10:55:15.326009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.326035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.326094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.326108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.851 #20 NEW cov: 11784 ft: 14426 corp: 19/40b lim: 5 exec/s: 20 rss: 67Mb L: 2/3 MS: 1 EraseBytes- 00:08:16.851 [2024-12-16 10:55:15.366169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.366195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.366252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.366266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.366321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.366335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.851 #21 NEW cov: 11784 ft: 14448 corp: 20/43b lim: 5 exec/s: 21 rss: 67Mb L: 3/3 MS: 1 CrossOver- 00:08:16.851 [2024-12-16 10:55:15.406209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.406234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.406289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.406304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.851 #22 NEW cov: 11784 ft: 14468 corp: 21/45b lim: 5 exec/s: 22 rss: 67Mb L: 2/3 MS: 1 CrossOver- 00:08:16.851 [2024-12-16 10:55:15.446548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.446572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.446651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.446665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.446721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.446738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.851 [2024-12-16 10:55:15.446792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 
cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.851 [2024-12-16 10:55:15.446805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.851 #23 NEW cov: 11784 ft: 14768 corp: 22/49b lim: 5 exec/s: 23 rss: 67Mb L: 4/4 MS: 1 CrossOver- 00:08:17.111 [2024-12-16 10:55:15.486437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.111 [2024-12-16 10:55:15.486462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.111 [2024-12-16 10:55:15.486534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.486548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.112 #24 NEW cov: 11784 ft: 14809 corp: 23/51b lim: 5 exec/s: 24 rss: 67Mb L: 2/4 MS: 1 ChangeBit- 00:08:17.112 [2024-12-16 10:55:15.526532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.526557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.526617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.526630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.112 #25 NEW cov: 11784 ft: 14822 corp: 24/53b lim: 5 exec/s: 25 rss: 67Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:17.112 [2024-12-16 10:55:15.566796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.566821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.566879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.566892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.566950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.566963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.112 #26 NEW cov: 11784 ft: 14829 corp: 25/56b lim: 5 exec/s: 26 rss: 67Mb L: 3/4 MS: 1 ChangeBit- 00:08:17.112 [2024-12-16 10:55:15.607087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 
10:55:15.607111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.607183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.607196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.607255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.607269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.607326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.607339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.112 #27 NEW cov: 11784 ft: 14882 corp: 26/60b lim: 5 exec/s: 27 rss: 67Mb L: 4/4 MS: 1 CMP- DE: "\377~"- 00:08:17.112 [2024-12-16 10:55:15.646956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.646981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.647037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.647051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.112 #28 NEW cov: 11784 ft: 14900 corp: 27/62b lim: 5 exec/s: 28 rss: 67Mb L: 2/4 MS: 1 ChangeByte- 00:08:17.112 [2024-12-16 10:55:15.686993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.687018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.687090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.687104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.112 #29 NEW cov: 11784 ft: 14913 corp: 28/64b lim: 5 exec/s: 29 rss: 67Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:17.112 [2024-12-16 10:55:15.727286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.727310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.727382] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.727396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.112 [2024-12-16 10:55:15.727451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.112 [2024-12-16 10:55:15.727464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.372 #30 NEW cov: 11784 ft: 14916 corp: 29/67b lim: 5 exec/s: 30 rss: 67Mb L: 3/4 MS: 1 PersAutoDict- DE: "\377~"- 00:08:17.372 [2024-12-16 10:55:15.767378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.767403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.372 [2024-12-16 10:55:15.767459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.767476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.372 [2024-12-16 10:55:15.767548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.767562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.372 #31 NEW cov: 11784 ft: 14937 corp: 30/70b lim: 5 exec/s: 31 rss: 67Mb L: 3/4 MS: 1 ChangeByte- 00:08:17.372 [2024-12-16 10:55:15.807336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.807361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.372 [2024-12-16 10:55:15.807417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.807430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.372 #32 NEW cov: 11784 ft: 14961 corp: 31/72b lim: 5 exec/s: 32 rss: 67Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:17.372 [2024-12-16 10:55:15.847287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.847311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.372 #33 NEW cov: 11784 ft: 15032 corp: 32/73b lim: 5 exec/s: 33 rss: 67Mb L: 1/4 MS: 1 EraseBytes- 00:08:17.372 [2024-12-16 10:55:15.887449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 
cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.887473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.372 #34 NEW cov: 11784 ft: 15040 corp: 33/74b lim: 5 exec/s: 34 rss: 67Mb L: 1/4 MS: 1 EraseBytes- 00:08:17.372 [2024-12-16 10:55:15.927721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.927746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.372 [2024-12-16 10:55:15.927803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.927817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.372 #35 NEW cov: 11784 ft: 15051 corp: 34/76b lim: 5 exec/s: 35 rss: 68Mb L: 2/4 MS: 1 EraseBytes- 00:08:17.372 [2024-12-16 10:55:15.967825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.967850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.372 [2024-12-16 10:55:15.967908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.372 [2024-12-16 10:55:15.967922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.372 #36 NEW cov: 11784 ft: 15056 corp: 35/78b lim: 5 exec/s: 36 rss: 68Mb L: 2/4 MS: 1 InsertByte- 00:08:17.632 [2024-12-16 10:55:16.007817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.007845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.632 #37 NEW cov: 11784 ft: 15102 corp: 36/79b lim: 5 exec/s: 37 rss: 68Mb L: 1/4 MS: 1 ChangeBit- 00:08:17.632 [2024-12-16 10:55:16.048035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.048059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.632 [2024-12-16 10:55:16.048116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.048130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.632 #38 NEW cov: 11784 ft: 15140 corp: 37/81b lim: 5 exec/s: 38 rss: 68Mb L: 2/4 MS: 1 CrossOver- 00:08:17.632 [2024-12-16 10:55:16.088188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.088212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.632 [2024-12-16 10:55:16.088284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.088298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.632 #39 NEW cov: 11784 ft: 15162 corp: 38/83b lim: 5 exec/s: 39 rss: 68Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:17.632 [2024-12-16 10:55:16.128317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.128341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.632 [2024-12-16 10:55:16.128398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.128412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.632 #40 NEW cov: 11784 ft: 15173 corp: 39/85b lim: 5 exec/s: 40 rss: 68Mb L: 2/4 MS: 1 InsertByte- 00:08:17.632 [2024-12-16 10:55:16.168267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.168291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.632 #41 NEW cov: 11784 ft: 15175 corp: 40/86b lim: 5 exec/s: 41 rss: 68Mb L: 1/4 MS: 1 ChangeBinInt- 00:08:17.632 [2024-12-16 10:55:16.208509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.208534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.632 [2024-12-16 10:55:16.208593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.208607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.632 #42 NEW cov: 11784 ft: 15280 corp: 41/88b lim: 5 exec/s: 42 rss: 68Mb L: 2/4 MS: 1 ChangeBit- 00:08:17.632 [2024-12-16 10:55:16.248952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.248979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.632 [2024-12-16 10:55:16.249035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:17.632 [2024-12-16 10:55:16.249049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.632 [2024-12-16 10:55:16.249105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.249118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.632 [2024-12-16 10:55:16.249173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.632 [2024-12-16 10:55:16.249186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.891 #43 NEW cov: 11784 ft: 15290 corp: 42/92b lim: 5 exec/s: 43 rss: 68Mb L: 4/4 MS: 1 CrossOver- 00:08:17.891 [2024-12-16 10:55:16.288587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.891 [2024-12-16 10:55:16.288616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.891 #44 NEW cov: 11784 ft: 15298 corp: 43/93b lim: 5 exec/s: 22 rss: 68Mb L: 1/4 MS: 1 EraseBytes- 00:08:17.891 #44 DONE cov: 11784 ft: 15298 corp: 43/93b lim: 5 exec/s: 22 rss: 68Mb 00:08:17.891 ###### Recommended dictionary. ###### 00:08:17.891 "\377~" # Uses: 1 00:08:17.891 ###### End of recommended dictionary. ###### 00:08:17.891 Done 44 runs in 2 second(s) 00:08:17.891 10:55:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:08:17.891 10:55:16 -- ../common.sh@72 -- # (( i++ )) 00:08:17.891 10:55:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.891 10:55:16 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:17.891 10:55:16 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:17.891 10:55:16 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.891 10:55:16 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.891 10:55:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:17.892 10:55:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:17.892 10:55:16 -- nvmf/run.sh@29 -- # printf %02d 9 00:08:17.892 10:55:16 -- nvmf/run.sh@29 -- # port=4409 00:08:17.892 10:55:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:17.892 10:55:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:17.892 10:55:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.892 10:55:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:08:17.892 [2024-12-16 10:55:16.465340] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 
23.11.0 initialization... 00:08:17.892 [2024-12-16 10:55:16.465430] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654946 ] 00:08:17.892 EAL: No free 2048 kB hugepages reported on node 1 00:08:18.151 [2024-12-16 10:55:16.649383] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.151 [2024-12-16 10:55:16.668630] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:18.151 [2024-12-16 10:55:16.668769] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.151 [2024-12-16 10:55:16.720265] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.151 [2024-12-16 10:55:16.736599] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:18.151 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.151 INFO: Seed: 2680129742 00:08:18.410 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:18.410 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:18.410 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:18.410 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.410 [2024-12-16 10:55:16.802517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.802553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.410 #2 INITED cov: 11557 ft: 11558 corp: 1/1b exec/s: 0 rss: 64Mb 00:08:18.410 [2024-12-16 10:55:16.842804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.842835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.410 [2024-12-16 10:55:16.842950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.842967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.410 #3 NEW cov: 11670 ft: 12898 corp: 2/3b lim: 5 exec/s: 0 rss: 65Mb L: 2/2 MS: 1 CopyPart- 00:08:18.410 [2024-12-16 10:55:16.892894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.892924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.410 [2024-12-16 10:55:16.893037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.893056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.410 #4 NEW cov: 11676 ft: 13076 corp: 
3/5b lim: 5 exec/s: 0 rss: 65Mb L: 2/2 MS: 1 ShuffleBytes- 00:08:18.410 [2024-12-16 10:55:16.933617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.933644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.410 [2024-12-16 10:55:16.933769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.933787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.410 [2024-12-16 10:55:16.933911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.933927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.410 [2024-12-16 10:55:16.934050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.410 [2024-12-16 10:55:16.934071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.410 #5 NEW cov: 11761 ft: 13575 corp: 4/9b lim: 5 exec/s: 0 rss: 65Mb L: 4/4 MS: 1 CopyPart- 00:08:18.410 [2024-12-16 10:55:16.983241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.411 [2024-12-16 10:55:16.983268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.411 [2024-12-16 10:55:16.983394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.411 [2024-12-16 10:55:16.983411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.411 #6 NEW cov: 11761 ft: 13708 corp: 5/11b lim: 5 exec/s: 0 rss: 65Mb L: 2/4 MS: 1 CrossOver- 00:08:18.411 [2024-12-16 10:55:17.023311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.411 [2024-12-16 10:55:17.023339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.411 [2024-12-16 10:55:17.023457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.411 [2024-12-16 10:55:17.023475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.670 #7 NEW cov: 11761 ft: 13787 corp: 6/13b lim: 5 exec/s: 0 rss: 65Mb L: 2/4 MS: 1 ChangeByte- 00:08:18.670 [2024-12-16 10:55:17.063331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.063358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.670 [2024-12-16 10:55:17.063475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.063493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.670 #8 NEW cov: 11761 ft: 13862 corp: 7/15b lim: 5 exec/s: 0 rss: 65Mb L: 2/4 MS: 1 CrossOver- 00:08:18.670 [2024-12-16 10:55:17.103485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.103514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.670 [2024-12-16 10:55:17.103633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.103662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.670 #9 NEW cov: 11761 ft: 13888 corp: 8/17b lim: 5 exec/s: 0 rss: 65Mb L: 2/4 MS: 1 ChangeBit- 00:08:18.670 [2024-12-16 10:55:17.144114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.144141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.670 [2024-12-16 10:55:17.144266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.144282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.670 [2024-12-16 10:55:17.144405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.144423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.670 [2024-12-16 10:55:17.144541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.144558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.670 #10 NEW cov: 11761 ft: 13918 corp: 9/21b lim: 5 exec/s: 0 rss: 65Mb L: 4/4 MS: 1 CopyPart- 00:08:18.670 [2024-12-16 10:55:17.183779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.183807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:18.670 [2024-12-16 10:55:17.183932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.183949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.670 #11 NEW cov: 11761 ft: 14039 corp: 10/23b lim: 5 exec/s: 0 rss: 65Mb L: 2/4 MS: 1 ChangeBit- 00:08:18.670 [2024-12-16 10:55:17.224404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.224430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.670 [2024-12-16 10:55:17.224549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.670 [2024-12-16 10:55:17.224566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.671 [2024-12-16 10:55:17.224678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.671 [2024-12-16 10:55:17.224694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.671 [2024-12-16 10:55:17.224805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.671 [2024-12-16 10:55:17.224822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.671 #12 NEW cov: 11761 ft: 14071 corp: 11/27b lim: 5 exec/s: 0 rss: 65Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:18.671 [2024-12-16 10:55:17.274055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.671 [2024-12-16 10:55:17.274080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.671 [2024-12-16 10:55:17.274209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.671 [2024-12-16 10:55:17.274226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.930 #13 NEW cov: 11761 ft: 14120 corp: 12/29b lim: 5 exec/s: 0 rss: 65Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:18.930 [2024-12-16 10:55:17.314183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.930 [2024-12-16 10:55:17.314214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.930 [2024-12-16 10:55:17.314329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:18.930 [2024-12-16 10:55:17.314348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.930 #14 NEW cov: 11761 ft: 14195 corp: 13/31b lim: 5 exec/s: 0 rss: 65Mb L: 2/4 MS: 1 CrossOver- 00:08:18.930 [2024-12-16 10:55:17.364819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.930 [2024-12-16 10:55:17.364847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.930 [2024-12-16 10:55:17.364965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.930 [2024-12-16 10:55:17.364982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.930 [2024-12-16 10:55:17.365106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.930 [2024-12-16 10:55:17.365122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.365241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.365258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.931 #15 NEW cov: 11761 ft: 14219 corp: 14/35b lim: 5 exec/s: 0 rss: 65Mb L: 4/4 MS: 1 CopyPart- 00:08:18.931 [2024-12-16 10:55:17.405269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.405297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.405415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.405433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.405549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.405566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.405693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.405709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.405831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:8 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.405847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.931 #16 NEW cov: 11761 ft: 14320 corp: 15/40b lim: 5 exec/s: 0 rss: 65Mb L: 5/5 MS: 1 InsertByte- 00:08:18.931 [2024-12-16 10:55:17.455142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.455172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.455287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.455304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.455423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.455441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.455566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.455582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.931 #17 NEW cov: 11761 ft: 14382 corp: 16/44b lim: 5 exec/s: 0 rss: 65Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:18.931 [2024-12-16 10:55:17.494912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.494939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.495057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.495072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.495181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.495199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.931 #18 NEW cov: 11761 ft: 14551 corp: 17/47b lim: 5 exec/s: 0 rss: 65Mb L: 3/5 MS: 1 CrossOver- 00:08:18.931 [2024-12-16 10:55:17.535133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.535159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.535282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.535299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.931 [2024-12-16 10:55:17.535417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.931 [2024-12-16 10:55:17.535435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.191 #19 NEW cov: 11761 ft: 14577 corp: 18/50b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 ChangeByte- 00:08:19.191 [2024-12-16 10:55:17.584763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.191 [2024-12-16 10:55:17.584790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.191 #20 NEW cov: 11761 ft: 14617 corp: 19/51b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 EraseBytes- 00:08:19.191 [2024-12-16 10:55:17.624898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.191 [2024-12-16 10:55:17.624927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.191 #21 NEW cov: 11761 ft: 14629 corp: 20/52b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 EraseBytes- 00:08:19.191 [2024-12-16 10:55:17.665265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.191 [2024-12-16 10:55:17.665292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.191 [2024-12-16 10:55:17.665411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.191 [2024-12-16 10:55:17.665428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.450 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:19.450 #22 NEW cov: 11784 ft: 14672 corp: 21/54b lim: 5 exec/s: 22 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:08:19.450 [2024-12-16 10:55:17.976155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.450 [2024-12-16 10:55:17.976191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.450 [2024-12-16 10:55:17.976316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.450 [2024-12-16 10:55:17.976331] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.450 #23 NEW cov: 11784 ft: 14743 corp: 22/56b lim: 5 exec/s: 23 rss: 67Mb L: 2/5 MS: 1 CrossOver- 00:08:19.450 [2024-12-16 10:55:18.026775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.450 [2024-12-16 10:55:18.026805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.450 [2024-12-16 10:55:18.026927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.450 [2024-12-16 10:55:18.026945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.450 [2024-12-16 10:55:18.027063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.450 [2024-12-16 10:55:18.027081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.450 [2024-12-16 10:55:18.027206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.450 [2024-12-16 10:55:18.027225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.450 #24 NEW cov: 11784 ft: 14756 corp: 23/60b lim: 5 exec/s: 24 rss: 67Mb L: 4/5 MS: 1 InsertByte- 00:08:19.450 [2024-12-16 10:55:18.066286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.450 [2024-12-16 10:55:18.066315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.450 [2024-12-16 10:55:18.066425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.450 [2024-12-16 10:55:18.066442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.710 #25 NEW cov: 11784 ft: 14784 corp: 24/62b lim: 5 exec/s: 25 rss: 67Mb L: 2/5 MS: 1 ChangeByte- 00:08:19.710 [2024-12-16 10:55:18.106483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.106511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.106633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.106653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.710 #26 NEW cov: 11784 ft: 14787 corp: 25/64b lim: 5 exec/s: 26 rss: 67Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:19.710 
[2024-12-16 10:55:18.146834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.146872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.146992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.147010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.147133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.147151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.710 #27 NEW cov: 11784 ft: 14797 corp: 26/67b lim: 5 exec/s: 27 rss: 67Mb L: 3/5 MS: 1 InsertByte- 00:08:19.710 [2024-12-16 10:55:18.196817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.196846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.196962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.196979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.710 #28 NEW cov: 11784 ft: 14814 corp: 27/69b lim: 5 exec/s: 28 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:08:19.710 [2024-12-16 10:55:18.246999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.247026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.247145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.247163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.710 #29 NEW cov: 11784 ft: 14826 corp: 28/71b lim: 5 exec/s: 29 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:08:19.710 [2024-12-16 10:55:18.297906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.297936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.298057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:19.710 [2024-12-16 10:55:18.298074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.298190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.298208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.298321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.298338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.710 [2024-12-16 10:55:18.298451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.710 [2024-12-16 10:55:18.298468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.710 #30 NEW cov: 11784 ft: 14862 corp: 29/76b lim: 5 exec/s: 30 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:08:19.970 [2024-12-16 10:55:18.337241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.337271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.337390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.337409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.970 #31 NEW cov: 11784 ft: 14876 corp: 30/78b lim: 5 exec/s: 31 rss: 68Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:19.970 [2024-12-16 10:55:18.377865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.377893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.378020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.378038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.378156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.378175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.970 #32 NEW cov: 11784 ft: 14963 corp: 31/81b lim: 5 exec/s: 32 rss: 68Mb L: 3/5 MS: 1 CrossOver- 00:08:19.970 [2024-12-16 10:55:18.417253] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.417281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.970 #33 NEW cov: 11784 ft: 15009 corp: 32/82b lim: 5 exec/s: 33 rss: 68Mb L: 1/5 MS: 1 EraseBytes- 00:08:19.970 [2024-12-16 10:55:18.457592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.457623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.457742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.457760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.970 #34 NEW cov: 11784 ft: 15021 corp: 33/84b lim: 5 exec/s: 34 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:08:19.970 [2024-12-16 10:55:18.497998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.498025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.498139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.498158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.498281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.498296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.970 #35 NEW cov: 11784 ft: 15028 corp: 34/87b lim: 5 exec/s: 35 rss: 68Mb L: 3/5 MS: 1 EraseBytes- 00:08:19.970 [2024-12-16 10:55:18.538173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.538200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.538315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.538331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.538451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 
10:55:18.538468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.970 #36 NEW cov: 11784 ft: 15032 corp: 35/90b lim: 5 exec/s: 36 rss: 68Mb L: 3/5 MS: 1 CopyPart- 00:08:19.970 [2024-12-16 10:55:18.578486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.578513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.578642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.578660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.578779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.578800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.970 [2024-12-16 10:55:18.578922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.970 [2024-12-16 10:55:18.578939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.230 #37 NEW cov: 11784 ft: 15033 corp: 36/94b lim: 5 exec/s: 37 rss: 68Mb L: 4/5 MS: 1 CrossOver- 00:08:20.230 [2024-12-16 10:55:18.618490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.618516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.230 [2024-12-16 10:55:18.618649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.618666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.230 [2024-12-16 10:55:18.618784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.618804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.230 [2024-12-16 10:55:18.618922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.618938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.230 #38 NEW cov: 11784 ft: 15046 corp: 37/98b lim: 5 exec/s: 38 rss: 68Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:20.230 [2024-12-16 10:55:18.668234] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.668262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.230 [2024-12-16 10:55:18.668396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.668412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.230 #39 NEW cov: 11784 ft: 15063 corp: 38/100b lim: 5 exec/s: 39 rss: 68Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:20.230 [2024-12-16 10:55:18.708117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.708144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.230 #40 NEW cov: 11784 ft: 15076 corp: 39/101b lim: 5 exec/s: 40 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:08:20.230 [2024-12-16 10:55:18.749250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.749276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.230 [2024-12-16 10:55:18.749407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.749427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.230 [2024-12-16 10:55:18.749542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.749564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.230 [2024-12-16 10:55:18.749688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.749706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.230 [2024-12-16 10:55:18.749830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.230 [2024-12-16 10:55:18.749848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.230 #41 NEW cov: 11784 ft: 15079 corp: 40/106b lim: 5 exec/s: 41 rss: 68Mb L: 5/5 MS: 1 InsertByte- 00:08:20.230 [2024-12-16 10:55:18.788824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.231 [2024-12-16 10:55:18.788851] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.231 [2024-12-16 10:55:18.788970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.231 [2024-12-16 10:55:18.788988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.231 [2024-12-16 10:55:18.789109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.231 [2024-12-16 10:55:18.789126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.231 #42 NEW cov: 11784 ft: 15096 corp: 41/109b lim: 5 exec/s: 21 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:08:20.231 #42 DONE cov: 11784 ft: 15096 corp: 41/109b lim: 5 exec/s: 21 rss: 68Mb 00:08:20.231 Done 42 runs in 2 second(s) 00:08:20.490 10:55:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:08:20.490 10:55:18 -- ../common.sh@72 -- # (( i++ )) 00:08:20.490 10:55:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.490 10:55:18 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:20.490 10:55:18 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:20.490 10:55:18 -- nvmf/run.sh@24 -- # local timen=1 00:08:20.490 10:55:18 -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.490 10:55:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:20.490 10:55:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:20.490 10:55:18 -- nvmf/run.sh@29 -- # printf %02d 10 00:08:20.490 10:55:18 -- nvmf/run.sh@29 -- # port=4410 00:08:20.490 10:55:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:20.490 10:55:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:20.490 10:55:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.490 10:55:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:08:20.490 [2024-12-16 10:55:18.966548] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:20.490 [2024-12-16 10:55:18.966618] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid655484 ] 00:08:20.490 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.750 [2024-12-16 10:55:19.142272] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.750 [2024-12-16 10:55:19.161492] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.750 [2024-12-16 10:55:19.161617] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.750 [2024-12-16 10:55:19.212963] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.750 [2024-12-16 10:55:19.229245] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:20.750 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.750 INFO: Seed: 878151741 00:08:20.750 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:20.750 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:20.750 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:20.750 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.750 #2 INITED exec/s: 0 rss: 59Mb 00:08:20.750 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:20.750 This may also happen if the target rejected all inputs we tried so far 00:08:20.750 [2024-12-16 10:55:19.274604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.750 [2024-12-16 10:55:19.274636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.750 [2024-12-16 10:55:19.274695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.750 [2024-12-16 10:55:19.274708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.750 [2024-12-16 10:55:19.274761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.750 [2024-12-16 10:55:19.274774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.009 NEW_FUNC[1/669]: 0x4650e8 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:21.009 NEW_FUNC[2/669]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.009 #6 NEW cov: 11579 ft: 11581 corp: 2/32b lim: 40 exec/s: 0 rss: 66Mb L: 31/31 MS: 4 ChangeBinInt-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:21.009 [2024-12-16 10:55:19.585449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.009 [2024-12-16 10:55:19.585480] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.010 [2024-12-16 10:55:19.585542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.010 [2024-12-16 10:55:19.585557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.010 [2024-12-16 10:55:19.585618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.010 [2024-12-16 10:55:19.585632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.010 NEW_FUNC[1/1]: 0x1c7b168 in thread_update_stats /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:920 00:08:21.010 #7 NEW cov: 11693 ft: 11994 corp: 3/63b lim: 40 exec/s: 0 rss: 66Mb L: 31/31 MS: 1 ShuffleBytes- 00:08:21.269 [2024-12-16 10:55:19.635416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.269 [2024-12-16 10:55:19.635444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.269 [2024-12-16 10:55:19.635505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.635520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.270 #13 NEW cov: 11699 ft: 12516 corp: 4/79b lim: 40 exec/s: 0 rss: 66Mb L: 16/31 MS: 1 CrossOver- 00:08:21.270 [2024-12-16 10:55:19.675587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.675616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.675692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545444 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.675706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.675763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.675778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.270 #14 NEW cov: 11784 ft: 12714 corp: 5/110b lim: 40 exec/s: 0 rss: 66Mb L: 31/31 MS: 1 ChangeBit- 00:08:21.270 [2024-12-16 10:55:19.715762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.715788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.715848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545444 cdw11:54de5454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.715862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.715918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.715931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.270 #15 NEW cov: 11784 ft: 12806 corp: 6/141b lim: 40 exec/s: 0 rss: 66Mb L: 31/31 MS: 1 ChangeByte- 00:08:21.270 [2024-12-16 10:55:19.756136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54efefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.756161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.756224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efef5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.756238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.756296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:44545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.756310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.756371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.756385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.756460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.756474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.270 #16 NEW cov: 11784 ft: 13484 corp: 7/181b lim: 40 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:21.270 [2024-12-16 10:55:19.795837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545441 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.795862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.795922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.795935] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.270 #17 NEW cov: 11784 ft: 13664 corp: 8/198b lim: 40 exec/s: 0 rss: 66Mb L: 17/40 MS: 1 InsertByte- 00:08:21.270 [2024-12-16 10:55:19.835986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545441 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.836012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.836070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.836084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.270 #18 NEW cov: 11784 ft: 13712 corp: 9/215b lim: 40 exec/s: 0 rss: 66Mb L: 17/40 MS: 1 ShuffleBytes- 00:08:21.270 [2024-12-16 10:55:19.876479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54efefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.876505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.876566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efef5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.876581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.876656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:03000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.876671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.876730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.876744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.270 [2024-12-16 10:55:19.876803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.270 [2024-12-16 10:55:19.876820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.530 #19 NEW cov: 11784 ft: 13742 corp: 10/255b lim: 40 exec/s: 0 rss: 66Mb L: 40/40 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:21.530 [2024-12-16 10:55:19.916460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.916486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:19.916543] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.916557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:19.916616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.916630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:19.916688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.916703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.530 #25 NEW cov: 11784 ft: 13822 corp: 11/293b lim: 40 exec/s: 0 rss: 66Mb L: 38/40 MS: 1 CrossOver- 00:08:21.530 [2024-12-16 10:55:19.956480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:5454d454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.956505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:19.956567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.956581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:19.956659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.956674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.530 #26 NEW cov: 11784 ft: 13861 corp: 12/324b lim: 40 exec/s: 0 rss: 67Mb L: 31/40 MS: 1 ChangeBit- 00:08:21.530 [2024-12-16 10:55:19.996589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.996620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:19.996679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.996693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:19.996752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:19.996765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:21.530 #27 NEW cov: 11784 ft: 13925 corp: 13/351b lim: 40 exec/s: 0 rss: 67Mb L: 27/40 MS: 1 EraseBytes- 00:08:21.530 [2024-12-16 10:55:20.036698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:20.036723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:20.036784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:545454ac cdw11:a4545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:20.036798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.530 [2024-12-16 10:55:20.036856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:20.036870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.530 #28 NEW cov: 11784 ft: 13972 corp: 14/382b lim: 40 exec/s: 0 rss: 67Mb L: 31/40 MS: 1 ChangeBinInt- 00:08:21.530 [2024-12-16 10:55:20.077080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.530 [2024-12-16 10:55:20.077105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.531 [2024-12-16 10:55:20.077168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.531 [2024-12-16 10:55:20.077182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.531 [2024-12-16 10:55:20.077242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.531 [2024-12-16 10:55:20.077256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.531 [2024-12-16 10:55:20.077315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.531 [2024-12-16 10:55:20.077329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.531 [2024-12-16 10:55:20.077390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.531 [2024-12-16 10:55:20.077404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.531 #29 NEW cov: 11784 ft: 14005 corp: 15/422b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CrossOver- 00:08:21.531 [2024-12-16 10:55:20.127234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:21.531 [2024-12-16 10:55:20.127259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.531 [2024-12-16 10:55:20.127336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.531 [2024-12-16 10:55:20.127351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.531 [2024-12-16 10:55:20.127408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.531 [2024-12-16 10:55:20.127422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.531 [2024-12-16 10:55:20.127461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:545454ac cdw11:ababa554 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.531 [2024-12-16 10:55:20.127481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.531 [2024-12-16 10:55:20.127539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.531 [2024-12-16 10:55:20.127553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.790 #30 NEW cov: 11784 ft: 14072 corp: 16/462b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:21.790 [2024-12-16 10:55:20.177013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545441 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.177039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.177099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5454540a cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.177113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.790 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.790 #31 NEW cov: 11807 ft: 14104 corp: 17/479b lim: 40 exec/s: 0 rss: 67Mb L: 17/40 MS: 1 CrossOver- 00:08:21.790 [2024-12-16 10:55:20.217247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.217271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.217333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54547154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.217346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 
10:55:20.217403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54415454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.217416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.790 #32 NEW cov: 11807 ft: 14120 corp: 18/506b lim: 40 exec/s: 0 rss: 67Mb L: 27/40 MS: 1 CrossOver- 00:08:21.790 [2024-12-16 10:55:20.257523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54303030 cdw11:30303054 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.257547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.257608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.257626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.257684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.257699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.257757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.257771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.790 #33 NEW cov: 11807 ft: 14188 corp: 19/543b lim: 40 exec/s: 33 rss: 67Mb L: 37/40 MS: 1 InsertRepeatedBytes- 00:08:21.790 [2024-12-16 10:55:20.297483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.297508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.297568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54547154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.297582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.297660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54413b54 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.297675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.790 #34 NEW cov: 11807 ft: 14222 corp: 20/570b lim: 40 exec/s: 34 rss: 67Mb L: 27/40 MS: 1 ChangeByte- 00:08:21.790 [2024-12-16 10:55:20.337389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545441 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 
[2024-12-16 10:55:20.337414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.790 #35 NEW cov: 11807 ft: 14609 corp: 21/581b lim: 40 exec/s: 35 rss: 67Mb L: 11/40 MS: 1 EraseBytes- 00:08:21.790 [2024-12-16 10:55:20.377985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:545454fc cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.378009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.378088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545444 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.378102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.790 [2024-12-16 10:55:20.378162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.790 [2024-12-16 10:55:20.378176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.790 #36 NEW cov: 11807 ft: 14635 corp: 22/612b lim: 40 exec/s: 36 rss: 67Mb L: 31/40 MS: 1 ChangeByte- 00:08:22.051 [2024-12-16 10:55:20.418101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.418127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.051 [2024-12-16 10:55:20.418187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:5454542c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.418201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.051 [2024-12-16 10:55:20.418261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.418275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.051 [2024-12-16 10:55:20.418333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.418350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.051 [2024-12-16 10:55:20.418407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.418421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.051 #37 NEW cov: 11807 ft: 14689 corp: 23/652b lim: 40 exec/s: 37 rss: 67Mb L: 40/40 MS: 1 ChangeByte- 00:08:22.051 [2024-12-16 10:55:20.458083] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:545454fc cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.458108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.051 [2024-12-16 10:55:20.458169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545444 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.458182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.051 [2024-12-16 10:55:20.458240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.458253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.051 [2024-12-16 10:55:20.458310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54549854 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.458324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.051 #38 NEW cov: 11807 ft: 14720 corp: 24/684b lim: 40 exec/s: 38 rss: 67Mb L: 32/40 MS: 1 InsertByte- 00:08:22.051 [2024-12-16 10:55:20.498359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54efefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.051 [2024-12-16 10:55:20.498384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.051 [2024-12-16 10:55:20.498446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efef5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.498461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.498520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:44545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.498535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.498594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.498607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.498670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:0a545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.498684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.052 #39 NEW cov: 11807 ft: 14730 corp: 25/724b lim: 40 exec/s: 39 rss: 67Mb L: 40/40 MS: 1 CrossOver- 
00:08:22.052 [2024-12-16 10:55:20.538173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.538197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.538257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.538271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.538328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.538342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.052 #40 NEW cov: 11807 ft: 14743 corp: 26/751b lim: 40 exec/s: 40 rss: 67Mb L: 27/40 MS: 1 CopyPart- 00:08:22.052 [2024-12-16 10:55:20.578439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.578464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.578523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.578537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.578596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.578613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.578688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.578702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.052 #41 NEW cov: 11807 ft: 14751 corp: 27/789b lim: 40 exec/s: 41 rss: 67Mb L: 38/40 MS: 1 ShuffleBytes- 00:08:22.052 [2024-12-16 10:55:20.618198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:0a545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.618223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 #42 NEW cov: 11807 ft: 14853 corp: 28/804b lim: 40 exec/s: 42 rss: 67Mb L: 15/40 MS: 1 CrossOver- 00:08:22.052 [2024-12-16 10:55:20.658835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:22.052 [2024-12-16 10:55:20.658861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.658920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.658934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.658990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.659004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.659080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:545454ac cdw11:ababa50a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.659094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.052 [2024-12-16 10:55:20.659155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.052 [2024-12-16 10:55:20.659169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.313 #43 NEW cov: 11807 ft: 14855 corp: 29/844b lim: 40 exec/s: 43 rss: 67Mb L: 40/40 MS: 1 CrossOver- 00:08:22.313 [2024-12-16 10:55:20.698700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:545454fc cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.698725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.698787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545444 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.698802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.698862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.698876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.313 #44 NEW cov: 11807 ft: 14869 corp: 30/873b lim: 40 exec/s: 44 rss: 68Mb L: 29/40 MS: 1 EraseBytes- 00:08:22.313 [2024-12-16 10:55:20.739060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545455 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.739085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.739146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 
cdw11:5454542c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.739160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.739219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.739232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.739291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.739304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.739363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.739377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.313 #45 NEW cov: 11807 ft: 14877 corp: 31/913b lim: 40 exec/s: 45 rss: 68Mb L: 40/40 MS: 1 ChangeBit- 00:08:22.313 [2024-12-16 10:55:20.779186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54efefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.779211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.779255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efef5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.779268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.779326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:44545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.779339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.779400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545444 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.779414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.779475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:0a545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.779488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.313 #46 NEW cov: 11807 ft: 14932 corp: 32/953b lim: 40 exec/s: 46 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:08:22.313 [2024-12-16 10:55:20.819003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:545454fc cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.819026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.819106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545444 cdw11:11545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.819120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.819181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.819194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.313 #47 NEW cov: 11807 ft: 15004 corp: 33/983b lim: 40 exec/s: 47 rss: 68Mb L: 30/40 MS: 1 InsertByte- 00:08:22.313 [2024-12-16 10:55:20.859388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.859413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.859472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:5454542c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.859487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.859545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.859558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.859618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.859631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.859712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.859726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.313 #48 NEW cov: 11807 ft: 15025 corp: 34/1023b lim: 40 exec/s: 48 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:08:22.313 [2024-12-16 10:55:20.899225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545654 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.899250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.899312] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545444 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.899327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.899385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.899399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.313 #49 NEW cov: 11807 ft: 15050 corp: 35/1054b lim: 40 exec/s: 49 rss: 68Mb L: 31/40 MS: 1 ChangeBit- 00:08:22.313 [2024-12-16 10:55:20.929355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.929380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.929439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.929452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.313 [2024-12-16 10:55:20.929513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:5454abab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.313 [2024-12-16 10:55:20.929528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.574 #50 NEW cov: 11807 ft: 15069 corp: 36/1085b lim: 40 exec/s: 50 rss: 68Mb L: 31/40 MS: 1 ChangeBinInt- 00:08:22.574 [2024-12-16 10:55:20.969712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:20.969737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:20.969799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:20.969813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:20.969873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:20.969887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:20.969947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:20.969970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:20.970026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:20.970040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.574 #51 NEW cov: 11807 ft: 15110 corp: 37/1125b lim: 40 exec/s: 51 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:08:22.574 [2024-12-16 10:55:21.009406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54b6abab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.009432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.009493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ab545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.009506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.574 #52 NEW cov: 11807 ft: 15117 corp: 38/1141b lim: 40 exec/s: 52 rss: 68Mb L: 16/40 MS: 1 ChangeBinInt- 00:08:22.574 [2024-12-16 10:55:21.049827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:5454d454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.049852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.049914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.049928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.049986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:545454cf cdw11:cfcfcf54 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.050000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.050060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.050074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.574 #53 NEW cov: 11807 ft: 15134 corp: 39/1176b lim: 40 exec/s: 53 rss: 68Mb L: 35/40 MS: 1 InsertRepeatedBytes- 00:08:22.574 [2024-12-16 10:55:21.089847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545654 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.089872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.089933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545444 cdw11:54545454 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.089948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.090010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.090024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.574 #54 NEW cov: 11807 ft: 15139 corp: 40/1204b lim: 40 exec/s: 54 rss: 68Mb L: 28/40 MS: 1 EraseBytes- 00:08:22.574 [2024-12-16 10:55:21.129983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.130012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.130073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54547154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.130087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.130145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54415454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.130159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.574 #55 NEW cov: 11807 ft: 15149 corp: 41/1231b lim: 40 exec/s: 55 rss: 68Mb L: 27/40 MS: 1 ShuffleBytes- 00:08:22.574 [2024-12-16 10:55:21.170282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54efefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.170306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.170367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efef5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.170381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.170457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:44545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.170472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.170534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.170548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.574 [2024-12-16 10:55:21.170613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE 
(82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:0a545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.574 [2024-12-16 10:55:21.170627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.574 #56 NEW cov: 11807 ft: 15163 corp: 42/1271b lim: 40 exec/s: 56 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:08:22.834 [2024-12-16 10:55:21.210254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:5454d454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.210279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.210339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.210353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.210413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:545454cf cdw11:cfcfcf54 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.210426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.210486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.210519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.834 #57 NEW cov: 11807 ft: 15172 corp: 43/1306b lim: 40 exec/s: 57 rss: 68Mb L: 35/40 MS: 1 ShuffleBytes- 00:08:22.834 [2024-12-16 10:55:21.250372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.250396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.250456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.250469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.250529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545d54 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.250542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.250600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.250619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.290649] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.290673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.290735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.290749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.290807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545d54 cdw11:54200054 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.290820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.290879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.290893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.834 [2024-12-16 10:55:21.290954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545471 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.834 [2024-12-16 10:55:21.290968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.834 #64 pulse cov: 11807 ft: 15174 corp: 43/1306b lim: 40 exec/s: 32 rss: 68Mb 00:08:22.834 #64 NEW cov: 11807 ft: 15174 corp: 44/1346b lim: 40 exec/s: 32 rss: 68Mb L: 40/40 MS: 2 ChangeBinInt-CMP- DE: " \000"- 00:08:22.834 #64 DONE cov: 11807 ft: 15174 corp: 44/1346b lim: 40 exec/s: 32 rss: 68Mb 00:08:22.834 ###### Recommended dictionary. ###### 00:08:22.834 "\003\000\000\000\000\000\000\000" # Uses: 0 00:08:22.834 " \000" # Uses: 0 00:08:22.834 ###### End of recommended dictionary. 
###### 00:08:22.834 Done 64 runs in 2 second(s) 00:08:22.834 10:55:21 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:08:22.834 10:55:21 -- ../common.sh@72 -- # (( i++ )) 00:08:22.834 10:55:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:22.834 10:55:21 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:22.834 10:55:21 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:22.834 10:55:21 -- nvmf/run.sh@24 -- # local timen=1 00:08:22.834 10:55:21 -- nvmf/run.sh@25 -- # local core=0x1 00:08:22.834 10:55:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:22.834 10:55:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:22.834 10:55:21 -- nvmf/run.sh@29 -- # printf %02d 11 00:08:22.834 10:55:21 -- nvmf/run.sh@29 -- # port=4411 00:08:22.834 10:55:21 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:22.834 10:55:21 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:22.834 10:55:21 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:22.834 10:55:21 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:08:23.093 [2024-12-16 10:55:21.462536] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:23.093 [2024-12-16 10:55:21.462602] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid655917 ] 00:08:23.093 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.093 [2024-12-16 10:55:21.645324] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.093 [2024-12-16 10:55:21.664678] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:23.093 [2024-12-16 10:55:21.664799] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.093 [2024-12-16 10:55:21.716063] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.352 [2024-12-16 10:55:21.732386] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:23.352 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.352 INFO: Seed: 3381155560 00:08:23.352 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:23.352 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:23.352 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:23.352 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.352 #2 INITED exec/s: 0 rss: 59Mb 00:08:23.352 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:23.352 This may also happen if the target rejected all inputs we tried so far 00:08:23.352 [2024-12-16 10:55:21.777729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a8d8d cdw11:8d8d8d8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.352 [2024-12-16 10:55:21.777758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.352 [2024-12-16 10:55:21.777817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.352 [2024-12-16 10:55:21.777830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.611 NEW_FUNC[1/671]: 0x466e58 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:23.611 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.611 #9 NEW cov: 11592 ft: 11593 corp: 2/18b lim: 40 exec/s: 0 rss: 66Mb L: 17/17 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:23.611 [2024-12-16 10:55:22.099294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.611 [2024-12-16 10:55:22.099350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.611 #10 NEW cov: 11705 ft: 13168 corp: 3/29b lim: 40 exec/s: 0 rss: 66Mb L: 11/17 MS: 1 InsertRepeatedBytes- 00:08:23.611 [2024-12-16 10:55:22.160261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.611 [2024-12-16 10:55:22.160291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.611 [2024-12-16 10:55:22.160431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.611 [2024-12-16 10:55:22.160451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.611 [2024-12-16 10:55:22.160591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.611 [2024-12-16 10:55:22.160613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.611 [2024-12-16 10:55:22.160750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff2f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.611 [2024-12-16 10:55:22.160768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.611 #15 NEW cov: 11711 ft: 13644 corp: 4/61b lim: 40 exec/s: 0 rss: 66Mb L: 32/32 MS: 5 ChangeByte-ChangeByte-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:23.611 [2024-12-16 10:55:22.209616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.611 [2024-12-16 10:55:22.209650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.870 #16 NEW cov: 11796 ft: 13942 corp: 5/73b lim: 40 exec/s: 0 rss: 66Mb L: 12/32 MS: 1 InsertByte- 00:08:23.870 [2024-12-16 10:55:22.270011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a8d8d cdw11:3a8d8d8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.870 [2024-12-16 10:55:22.270041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.870 [2024-12-16 10:55:22.270185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.870 [2024-12-16 10:55:22.270202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.870 #17 NEW cov: 11796 ft: 14022 corp: 6/91b lim: 40 exec/s: 0 rss: 66Mb L: 18/32 MS: 1 InsertByte- 00:08:23.870 [2024-12-16 10:55:22.319887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.870 [2024-12-16 10:55:22.319914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.870 #18 NEW cov: 11796 ft: 14062 corp: 7/103b lim: 40 exec/s: 0 rss: 66Mb L: 12/32 MS: 1 ChangeByte- 00:08:23.870 [2024-12-16 10:55:22.370323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.870 [2024-12-16 10:55:22.370350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.870 [2024-12-16 10:55:22.370499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3fffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.870 [2024-12-16 10:55:22.370519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.870 #19 NEW cov: 11796 ft: 14121 corp: 8/126b lim: 40 exec/s: 0 rss: 66Mb L: 23/32 MS: 1 CrossOver- 00:08:23.870 [2024-12-16 10:55:22.430192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.870 [2024-12-16 10:55:22.430220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.870 #20 NEW cov: 11796 ft: 14186 corp: 9/138b lim: 40 exec/s: 0 rss: 66Mb L: 12/32 MS: 1 ChangeByte- 00:08:23.870 [2024-12-16 10:55:22.480419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffd cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.870 [2024-12-16 10:55:22.480449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.129 #21 NEW cov: 11796 ft: 14205 corp: 10/150b lim: 40 exec/s: 0 rss: 66Mb L: 12/32 MS: 1 ChangeBinInt- 00:08:24.129 [2024-12-16 10:55:22.541476] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.129 [2024-12-16 10:55:22.541504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.129 [2024-12-16 10:55:22.541644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.129 [2024-12-16 10:55:22.541673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.129 [2024-12-16 10:55:22.541805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.129 [2024-12-16 10:55:22.541821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.129 [2024-12-16 10:55:22.541963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.129 [2024-12-16 10:55:22.541979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.129 #22 NEW cov: 11796 ft: 14230 corp: 11/188b lim: 40 exec/s: 0 rss: 66Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:24.129 [2024-12-16 10:55:22.600812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.129 [2024-12-16 10:55:22.600841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.129 #23 NEW cov: 11796 ft: 14289 corp: 12/199b lim: 40 exec/s: 0 rss: 66Mb L: 11/38 MS: 1 ChangeBinInt- 00:08:24.129 [2024-12-16 10:55:22.651174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:08000a0a cdw11:8d8d3a8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.129 [2024-12-16 10:55:22.651201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.129 [2024-12-16 10:55:22.651359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.129 [2024-12-16 10:55:22.651376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.129 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:24.129 #24 NEW cov: 11819 ft: 14308 corp: 13/219b lim: 40 exec/s: 0 rss: 67Mb L: 20/38 MS: 1 CMP- DE: "\010\000"- 00:08:24.129 [2024-12-16 10:55:22.711112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0800ff08 cdw11:00ffffc0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.129 [2024-12-16 10:55:22.711146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.129 #28 NEW cov: 11819 ft: 14326 corp: 14/228b lim: 40 exec/s: 0 rss: 67Mb L: 9/38 MS: 4 CrossOver-ChangeByte-PersAutoDict-PersAutoDict- 
DE: "\010\000"-"\010\000"- 00:08:24.388 [2024-12-16 10:55:22.761319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.761348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 #29 NEW cov: 11819 ft: 14414 corp: 15/239b lim: 40 exec/s: 29 rss: 67Mb L: 11/38 MS: 1 CrossOver- 00:08:24.388 [2024-12-16 10:55:22.812438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.812466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 [2024-12-16 10:55:22.812599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.812620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.388 [2024-12-16 10:55:22.812759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.812776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.388 [2024-12-16 10:55:22.812910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffffff8 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.812928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.388 #30 NEW cov: 11819 ft: 14426 corp: 16/277b lim: 40 exec/s: 30 rss: 67Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:24.388 [2024-12-16 10:55:22.871710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffd cdw11:aa000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.871739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 #31 NEW cov: 11819 ft: 14455 corp: 17/289b lim: 40 exec/s: 31 rss: 67Mb L: 12/38 MS: 1 CMP- DE: "\252\000\000\000"- 00:08:24.388 [2024-12-16 10:55:22.922189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a8d8d cdw11:8d8d3a8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.922220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 [2024-12-16 10:55:22.922361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.922380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.388 #32 NEW cov: 11819 ft: 14475 corp: 18/307b lim: 40 exec/s: 32 rss: 67Mb L: 18/38 MS: 1 ShuffleBytes- 00:08:24.388 [2024-12-16 10:55:22.972996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.973025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 [2024-12-16 10:55:22.973170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.973192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.388 [2024-12-16 10:55:22.973339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.388 [2024-12-16 10:55:22.973355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.388 [2024-12-16 10:55:22.973498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffffff8 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.389 [2024-12-16 10:55:22.973515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.389 #33 NEW cov: 11819 ft: 14500 corp: 19/341b lim: 40 exec/s: 33 rss: 67Mb L: 34/38 MS: 1 CrossOver- 00:08:24.648 [2024-12-16 10:55:23.032288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:3bffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.032317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.648 #34 NEW cov: 11819 ft: 14516 corp: 20/353b lim: 40 exec/s: 34 rss: 67Mb L: 12/38 MS: 1 ShuffleBytes- 00:08:24.648 [2024-12-16 10:55:23.082417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.082446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.648 #35 NEW cov: 11819 ft: 14593 corp: 21/364b lim: 40 exec/s: 35 rss: 67Mb L: 11/38 MS: 1 ShuffleBytes- 00:08:24.648 [2024-12-16 10:55:23.133509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.133537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.648 [2024-12-16 10:55:23.133683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.133702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.648 [2024-12-16 10:55:23.133845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.133864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.648 [2024-12-16 10:55:23.134010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffffff8 cdw11:8d8dffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.134028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.648 #36 NEW cov: 11819 ft: 14609 corp: 22/398b lim: 40 exec/s: 36 rss: 67Mb L: 34/38 MS: 1 CrossOver- 00:08:24.648 [2024-12-16 10:55:23.193463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.193490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.648 [2024-12-16 10:55:23.193650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f0800ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.193666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.648 [2024-12-16 10:55:23.193774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.193801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.648 #37 NEW cov: 11819 ft: 14808 corp: 23/423b lim: 40 exec/s: 37 rss: 67Mb L: 25/38 MS: 1 PersAutoDict- DE: "\010\000"- 00:08:24.648 [2024-12-16 10:55:23.253844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.253873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.648 [2024-12-16 10:55:23.254030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.254048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.648 [2024-12-16 10:55:23.254140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.254156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.648 [2024-12-16 10:55:23.254296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fffffff8 cdw11:8d8dff8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.648 [2024-12-16 10:55:23.254315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.908 #38 NEW cov: 11819 ft: 14815 corp: 24/457b lim: 40 exec/s: 38 rss: 67Mb L: 34/38 MS: 1 CrossOver- 00:08:24.908 [2024-12-16 10:55:23.314069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:feffffff SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:24.908 [2024-12-16 10:55:23.314097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.908 [2024-12-16 10:55:23.314242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.908 [2024-12-16 10:55:23.314259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.908 [2024-12-16 10:55:23.314358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.908 [2024-12-16 10:55:23.314375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.908 [2024-12-16 10:55:23.314517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.908 [2024-12-16 10:55:23.314535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.908 #39 NEW cov: 11819 ft: 14823 corp: 25/495b lim: 40 exec/s: 39 rss: 67Mb L: 38/38 MS: 1 ChangeBit- 00:08:24.908 [2024-12-16 10:55:23.363224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.908 [2024-12-16 10:55:23.363251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.908 #40 NEW cov: 11819 ft: 14859 corp: 26/507b lim: 40 exec/s: 40 rss: 67Mb L: 12/38 MS: 1 InsertByte- 00:08:24.908 [2024-12-16 10:55:23.424063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.908 [2024-12-16 10:55:23.424098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.908 [2024-12-16 10:55:23.424242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.908 [2024-12-16 10:55:23.424260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.908 [2024-12-16 10:55:23.424397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3bffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.908 [2024-12-16 10:55:23.424414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.908 #41 NEW cov: 11819 ft: 14867 corp: 27/538b lim: 40 exec/s: 41 rss: 67Mb L: 31/38 MS: 1 CrossOver- 00:08:24.908 [2024-12-16 10:55:23.473534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.908 [2024-12-16 10:55:23.473562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.908 #42 NEW cov: 11819 ft: 14888 corp: 28/550b lim: 40 exec/s: 42 
rss: 68Mb L: 12/38 MS: 1 ChangeBinInt- 00:08:25.167 [2024-12-16 10:55:23.533847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.533879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.168 #43 NEW cov: 11819 ft: 14904 corp: 29/562b lim: 40 exec/s: 43 rss: 68Mb L: 12/38 MS: 1 ChangeBinInt- 00:08:25.168 [2024-12-16 10:55:23.594344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:08000a0a cdw11:8d8d3a8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.594372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.168 [2024-12-16 10:55:23.594519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:8d8d7372 cdw11:72727272 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.594538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.168 #44 NEW cov: 11819 ft: 14911 corp: 30/582b lim: 40 exec/s: 44 rss: 68Mb L: 20/38 MS: 1 ChangeBinInt- 00:08:25.168 [2024-12-16 10:55:23.644780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff080808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.644808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.168 [2024-12-16 10:55:23.644947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:08ffffff cdw11:3fffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.644962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.168 [2024-12-16 10:55:23.645111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffff0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.645131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.168 #45 NEW cov: 11819 ft: 14926 corp: 31/609b lim: 40 exec/s: 45 rss: 68Mb L: 27/38 MS: 1 InsertRepeatedBytes- 00:08:25.168 [2024-12-16 10:55:23.694566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a8d8d cdw11:8d8d3a8d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.694592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.168 [2024-12-16 10:55:23.694734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8daa0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.694759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.168 #46 NEW cov: 11819 ft: 14971 corp: 32/631b lim: 40 exec/s: 46 rss: 68Mb L: 22/38 MS: 1 PersAutoDict- DE: "\252\000\000\000"- 00:08:25.168 [2024-12-16 10:55:23.744492] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffffd cdw11:ffff3bff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.168 [2024-12-16 10:55:23.744520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.168 #47 NEW cov: 11819 ft: 14978 corp: 33/643b lim: 40 exec/s: 47 rss: 68Mb L: 12/38 MS: 1 ShuffleBytes- 00:08:25.427 [2024-12-16 10:55:23.795293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff080808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.427 [2024-12-16 10:55:23.795323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.427 [2024-12-16 10:55:23.795469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:08ffffff cdw11:3fffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.427 [2024-12-16 10:55:23.795487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.427 [2024-12-16 10:55:23.795650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffff0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.427 [2024-12-16 10:55:23.795668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.427 #48 NEW cov: 11819 ft: 14993 corp: 34/670b lim: 40 exec/s: 24 rss: 68Mb L: 27/38 MS: 1 CopyPart- 00:08:25.427 #48 DONE cov: 11819 ft: 14993 corp: 34/670b lim: 40 exec/s: 24 rss: 68Mb 00:08:25.427 ###### Recommended dictionary. ###### 00:08:25.427 "\010\000" # Uses: 3 00:08:25.427 "\252\000\000\000" # Uses: 1 00:08:25.427 ###### End of recommended dictionary. 
###### 00:08:25.427 Done 48 runs in 2 second(s) 00:08:25.427 10:55:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:08:25.427 10:55:23 -- ../common.sh@72 -- # (( i++ )) 00:08:25.427 10:55:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.427 10:55:23 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:25.427 10:55:23 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:25.427 10:55:23 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.427 10:55:23 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.428 10:55:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:25.428 10:55:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:25.428 10:55:23 -- nvmf/run.sh@29 -- # printf %02d 12 00:08:25.428 10:55:23 -- nvmf/run.sh@29 -- # port=4412 00:08:25.428 10:55:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:25.428 10:55:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:25.428 10:55:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.428 10:55:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:08:25.428 [2024-12-16 10:55:23.971476] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:25.428 [2024-12-16 10:55:23.971541] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid656317 ] 00:08:25.428 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.687 [2024-12-16 10:55:24.145948] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.687 [2024-12-16 10:55:24.165065] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:25.687 [2024-12-16 10:55:24.165204] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.687 [2024-12-16 10:55:24.216596] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.687 [2024-12-16 10:55:24.232889] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:25.687 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.687 INFO: Seed: 1587192656 00:08:25.687 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:25.687 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:25.687 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:25.687 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.687 #2 INITED exec/s: 0 rss: 59Mb 00:08:25.687 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:25.687 This may also happen if the target rejected all inputs we tried so far 00:08:25.687 [2024-12-16 10:55:24.288696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.687 [2024-12-16 10:55:24.288725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.687 [2024-12-16 10:55:24.288786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.687 [2024-12-16 10:55:24.288800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.687 [2024-12-16 10:55:24.288846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.687 [2024-12-16 10:55:24.288860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.687 [2024-12-16 10:55:24.288915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.687 [2024-12-16 10:55:24.288928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.207 NEW_FUNC[1/671]: 0x468bc8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:26.207 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.207 #8 NEW cov: 11590 ft: 11591 corp: 2/34b lim: 40 exec/s: 0 rss: 66Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:26.207 [2024-12-16 10:55:24.609455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.609489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.609565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.609580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.609638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.609652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.609712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.609725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.207 #9 NEW cov: 11703 ft: 12094 
corp: 3/67b lim: 40 exec/s: 0 rss: 66Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:26.207 [2024-12-16 10:55:24.659475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.659502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.659561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.659575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.659635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.659649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.659705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.659718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.207 #10 NEW cov: 11709 ft: 12303 corp: 4/104b lim: 40 exec/s: 0 rss: 66Mb L: 37/37 MS: 1 CopyPart- 00:08:26.207 [2024-12-16 10:55:24.699626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.699654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.699733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.699747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.699806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.699819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.699876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.699889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.207 #11 NEW cov: 11794 ft: 12648 corp: 5/143b lim: 40 exec/s: 0 rss: 66Mb L: 39/39 MS: 1 CopyPart- 00:08:26.207 [2024-12-16 10:55:24.739717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.739742] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.739802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.207 [2024-12-16 10:55:24.739816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.207 [2024-12-16 10:55:24.739874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.208 [2024-12-16 10:55:24.739888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.208 [2024-12-16 10:55:24.739943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.208 [2024-12-16 10:55:24.739956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.208 #12 NEW cov: 11794 ft: 12730 corp: 6/177b lim: 40 exec/s: 0 rss: 66Mb L: 34/39 MS: 1 InsertByte- 00:08:26.208 [2024-12-16 10:55:24.779408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a3b0a95 cdw11:9595953b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.208 [2024-12-16 10:55:24.779433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.208 #17 NEW cov: 11794 ft: 13657 corp: 7/185b lim: 40 exec/s: 0 rss: 66Mb L: 8/39 MS: 5 InsertByte-ChangeByte-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:26.208 [2024-12-16 10:55:24.820002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.208 [2024-12-16 10:55:24.820026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.208 [2024-12-16 10:55:24.820103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.208 [2024-12-16 10:55:24.820116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.208 [2024-12-16 10:55:24.820173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.208 [2024-12-16 10:55:24.820186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.208 [2024-12-16 10:55:24.820244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.208 [2024-12-16 10:55:24.820257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.468 #18 NEW cov: 11794 ft: 13718 corp: 8/218b lim: 40 exec/s: 0 rss: 66Mb L: 33/39 MS: 1 ChangeBinInt- 00:08:26.468 [2024-12-16 10:55:24.860100] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.860125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.860183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.860197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.860270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.860284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.860340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.860353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.468 #19 NEW cov: 11794 ft: 13741 corp: 9/255b lim: 40 exec/s: 0 rss: 66Mb L: 37/39 MS: 1 CopyPart- 00:08:26.468 [2024-12-16 10:55:24.900215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff01 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.900239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.900299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.900313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.900349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.900363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.900420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.900433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.468 #20 NEW cov: 11794 ft: 13826 corp: 10/292b lim: 40 exec/s: 0 rss: 66Mb L: 37/39 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:26.468 [2024-12-16 10:55:24.940350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.940374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:26.468 [2024-12-16 10:55:24.940430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.940443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.940487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.940501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.940556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:25000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.940569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.468 #21 NEW cov: 11794 ft: 13882 corp: 11/329b lim: 40 exec/s: 0 rss: 66Mb L: 37/39 MS: 1 ChangeBinInt- 00:08:26.468 [2024-12-16 10:55:24.980498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.980522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.980580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.980594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.980640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffff0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.980658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:24.980713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0003ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:24.980726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.468 #22 NEW cov: 11794 ft: 13902 corp: 12/368b lim: 40 exec/s: 0 rss: 66Mb L: 39/39 MS: 1 CrossOver- 00:08:26.468 [2024-12-16 10:55:25.020591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:25.020618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:25.020690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff01ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:25.020704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:25.020758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:25.020770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:25.020825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:25.020838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.468 #23 NEW cov: 11794 ft: 13948 corp: 13/401b lim: 40 exec/s: 0 rss: 66Mb L: 33/39 MS: 1 ShuffleBytes- 00:08:26.468 [2024-12-16 10:55:25.060721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:25.060745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:25.060806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffff6ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:25.060820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:25.060877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:25.060890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.468 [2024-12-16 10:55:25.060944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.468 [2024-12-16 10:55:25.060956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.468 #24 NEW cov: 11794 ft: 13956 corp: 14/440b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBinInt- 00:08:26.727 [2024-12-16 10:55:25.100528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.727 [2024-12-16 10:55:25.100553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.727 [2024-12-16 10:55:25.100616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.727 [2024-12-16 10:55:25.100634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.727 #25 NEW cov: 11794 ft: 14190 corp: 15/459b lim: 40 exec/s: 0 rss: 67Mb L: 19/39 MS: 1 EraseBytes- 00:08:26.727 [2024-12-16 10:55:25.140980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:26.728 [2024-12-16 10:55:25.141004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.141081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.141096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.141153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffff0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.141167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.141226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0003ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.141239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.728 #26 NEW cov: 11794 ft: 14217 corp: 16/497b lim: 40 exec/s: 0 rss: 67Mb L: 38/39 MS: 1 CopyPart- 00:08:26.728 [2024-12-16 10:55:25.181104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.181128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.181186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:f6ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.181200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.181255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.181268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.181324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.181337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.728 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.728 #27 NEW cov: 11817 ft: 14256 corp: 17/536b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ShuffleBytes- 00:08:26.728 [2024-12-16 10:55:25.221217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.221241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.221316] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.221329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.221386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffff0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.221402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.221459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0003ffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.221472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.728 #28 NEW cov: 11817 ft: 14276 corp: 18/575b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBinInt- 00:08:26.728 [2024-12-16 10:55:25.261366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.261390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.261447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.261461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.261517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.261530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.261587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.261600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.728 #29 NEW cov: 11817 ft: 14326 corp: 19/612b lim: 40 exec/s: 29 rss: 67Mb L: 37/39 MS: 1 ChangeBit- 00:08:26.728 [2024-12-16 10:55:25.301482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff3bff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.301507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.301566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.301579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 
10:55:25.301636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.301649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.301707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.301719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.728 #30 NEW cov: 11817 ft: 14359 corp: 20/650b lim: 40 exec/s: 30 rss: 67Mb L: 38/39 MS: 1 InsertByte- 00:08:26.728 [2024-12-16 10:55:25.341215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.341239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.728 [2024-12-16 10:55:25.341298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.728 [2024-12-16 10:55:25.341315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.988 #31 NEW cov: 11817 ft: 14390 corp: 21/669b lim: 40 exec/s: 31 rss: 67Mb L: 19/39 MS: 1 CopyPart- 00:08:26.988 [2024-12-16 10:55:25.381747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0aefffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.381771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.381844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.381859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.381914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.381928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.381984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.381997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.988 #32 NEW cov: 11817 ft: 14397 corp: 22/706b lim: 40 exec/s: 32 rss: 67Mb L: 37/39 MS: 1 ChangeBit- 00:08:26.988 [2024-12-16 10:55:25.421784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0adaffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.421809] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.421868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.421881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.421952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.421967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.422024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.422037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.988 #33 NEW cov: 11817 ft: 14484 corp: 23/743b lim: 40 exec/s: 33 rss: 67Mb L: 37/39 MS: 1 ChangeByte- 00:08:26.988 [2024-12-16 10:55:25.461942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.461967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.462024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.462038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.462095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.462111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.462167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffdf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.462180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.988 #34 NEW cov: 11817 ft: 14499 corp: 24/782b lim: 40 exec/s: 34 rss: 67Mb L: 39/39 MS: 1 ChangeBit- 00:08:26.988 [2024-12-16 10:55:25.502062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.502086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.502161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:fffeffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 
10:55:25.502175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.502230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.502243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.502300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:25000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.502313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.988 #35 NEW cov: 11817 ft: 14549 corp: 25/819b lim: 40 exec/s: 35 rss: 67Mb L: 37/39 MS: 1 ChangeBit- 00:08:26.988 [2024-12-16 10:55:25.541862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a3b0a95 cdw11:95959501 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.541886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.541959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000003b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.541973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.988 #36 NEW cov: 11817 ft: 14565 corp: 26/835b lim: 40 exec/s: 36 rss: 67Mb L: 16/39 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:26.988 [2024-12-16 10:55:25.581994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a3b0a95 cdw11:95959501 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.582019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.988 [2024-12-16 10:55:25.582092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000303b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.988 [2024-12-16 10:55:25.582106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.988 #37 NEW cov: 11817 ft: 14602 corp: 27/851b lim: 40 exec/s: 37 rss: 68Mb L: 16/39 MS: 1 ChangeByte- 00:08:27.248 [2024-12-16 10:55:25.622431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.622459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.622534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:24000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.622548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.622606] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffff0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.622624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.622682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0003ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.622695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.248 #38 NEW cov: 11817 ft: 14614 corp: 28/890b lim: 40 exec/s: 38 rss: 68Mb L: 39/39 MS: 1 ChangeByte- 00:08:27.248 [2024-12-16 10:55:25.662177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.662202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.662259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff40ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.662271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.248 #39 NEW cov: 11817 ft: 14642 corp: 29/910b lim: 40 exec/s: 39 rss: 68Mb L: 20/39 MS: 1 InsertByte- 00:08:27.248 [2024-12-16 10:55:25.702628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.702652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.702727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0003ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.702741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.702800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.702813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.702870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0003ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.702883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.248 #40 NEW cov: 11817 ft: 14788 corp: 30/948b lim: 40 exec/s: 40 rss: 68Mb L: 38/39 MS: 1 CopyPart- 00:08:27.248 [2024-12-16 10:55:25.742788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.742813] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.742889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff01ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.742906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.742963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.742976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.743031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffff2d cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.743044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.248 #41 NEW cov: 11817 ft: 14806 corp: 31/982b lim: 40 exec/s: 41 rss: 68Mb L: 34/39 MS: 1 InsertByte- 00:08:27.248 [2024-12-16 10:55:25.782936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.782961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.783020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.783035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.783090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.783104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.248 [2024-12-16 10:55:25.783162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.783175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.248 #42 NEW cov: 11817 ft: 14824 corp: 32/1015b lim: 40 exec/s: 42 rss: 68Mb L: 33/39 MS: 1 ChangeByte- 00:08:27.248 [2024-12-16 10:55:25.822856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.248 [2024-12-16 10:55:25.822881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.249 [2024-12-16 10:55:25.822941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.249 [2024-12-16 10:55:25.822954] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.249 [2024-12-16 10:55:25.823011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff40ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.249 [2024-12-16 10:55:25.823025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.249 #43 NEW cov: 11817 ft: 15030 corp: 33/1043b lim: 40 exec/s: 43 rss: 68Mb L: 28/39 MS: 1 InsertRepeatedBytes- 00:08:27.249 [2024-12-16 10:55:25.863166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affff68 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.249 [2024-12-16 10:55:25.863191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.249 [2024-12-16 10:55:25.863248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.249 [2024-12-16 10:55:25.863265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.249 [2024-12-16 10:55:25.863324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.249 [2024-12-16 10:55:25.863338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.249 [2024-12-16 10:55:25.863395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.249 [2024-12-16 10:55:25.863408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.508 #49 NEW cov: 11817 ft: 15037 corp: 34/1080b lim: 40 exec/s: 49 rss: 68Mb L: 37/39 MS: 1 ChangeByte- 00:08:27.508 [2024-12-16 10:55:25.903401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:25.903426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:25.903502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff01ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:25.903516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:25.903573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:25.903587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:25.903646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:62626262 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 
10:55:25.903660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:25.903719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:626262ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:25.903732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:27.508 #50 NEW cov: 11817 ft: 15092 corp: 35/1120b lim: 40 exec/s: 50 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:27.508 [2024-12-16 10:55:25.943059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:25.943084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:25.943142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffcff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:25.943156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.508 #51 NEW cov: 11817 ft: 15112 corp: 36/1139b lim: 40 exec/s: 51 rss: 68Mb L: 19/40 MS: 1 ChangeBinInt- 00:08:27.508 [2024-12-16 10:55:25.982938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0e0e0e cdw11:0e0e0e0e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:25.982963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.508 #53 NEW cov: 11817 ft: 15136 corp: 37/1148b lim: 40 exec/s: 53 rss: 68Mb L: 9/40 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:27.508 [2024-12-16 10:55:26.023560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affff68 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:26.023585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:26.023647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:b1ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:26.023661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:26.023720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:26.023733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:26.023791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:03ffffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:26.023805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.508 #54 NEW cov: 11817 ft: 15168 corp: 38/1185b lim: 40 
exec/s: 54 rss: 68Mb L: 37/40 MS: 1 ChangeByte- 00:08:27.508 [2024-12-16 10:55:26.063683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff3bff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:26.063708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:26.063768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffdfffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:26.063781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.508 [2024-12-16 10:55:26.063838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.508 [2024-12-16 10:55:26.063852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.509 [2024-12-16 10:55:26.063909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.509 [2024-12-16 10:55:26.063922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.509 #55 NEW cov: 11817 ft: 15184 corp: 39/1223b lim: 40 exec/s: 55 rss: 68Mb L: 38/40 MS: 1 ChangeBit- 00:08:27.509 [2024-12-16 10:55:26.103852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.509 [2024-12-16 10:55:26.103877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.509 [2024-12-16 10:55:26.103953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.509 [2024-12-16 10:55:26.103967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.509 [2024-12-16 10:55:26.104034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.509 [2024-12-16 10:55:26.104047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.509 [2024-12-16 10:55:26.104104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.509 [2024-12-16 10:55:26.104119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.509 #56 NEW cov: 11817 ft: 15190 corp: 40/1256b lim: 40 exec/s: 56 rss: 68Mb L: 33/40 MS: 1 ShuffleBytes- 00:08:27.768 [2024-12-16 10:55:26.143732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ff000800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.143757] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-12-16 10:55:26.143817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.143830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.768 [2024-12-16 10:55:26.143888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff40ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.143901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.768 #57 NEW cov: 11817 ft: 15208 corp: 41/1284b lim: 40 exec/s: 57 rss: 68Mb L: 28/40 MS: 1 ChangeBinInt- 00:08:27.768 [2024-12-16 10:55:26.184066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.184091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-12-16 10:55:26.184148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.184161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.768 [2024-12-16 10:55:26.184218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffd9d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.184231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.768 [2024-12-16 10:55:26.184289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d9d9d9ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.184303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.768 #58 NEW cov: 11817 ft: 15219 corp: 42/1323b lim: 40 exec/s: 58 rss: 68Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:08:27.768 [2024-12-16 10:55:26.223812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ff0effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.223838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-12-16 10:55:26.223895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.223909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.768 #59 NEW cov: 11817 ft: 15261 corp: 43/1343b lim: 40 exec/s: 59 rss: 68Mb L: 20/40 MS: 1 InsertByte- 00:08:27.768 [2024-12-16 10:55:26.264262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 
cdw10:0a23ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.264287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-12-16 10:55:26.264349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.768 [2024-12-16 10:55:26.264362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.768 [2024-12-16 10:55:26.264421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.769 [2024-12-16 10:55:26.264450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.769 [2024-12-16 10:55:26.264508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.769 [2024-12-16 10:55:26.264522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.769 #60 NEW cov: 11817 ft: 15262 corp: 44/1377b lim: 40 exec/s: 30 rss: 68Mb L: 34/40 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:27.769 #60 DONE cov: 11817 ft: 15262 corp: 44/1377b lim: 40 exec/s: 30 rss: 68Mb 00:08:27.769 ###### Recommended dictionary. ###### 00:08:27.769 "\001\000\000\000\000\000\000\000" # Uses: 2 00:08:27.769 ###### End of recommended dictionary. ###### 00:08:27.769 Done 60 runs in 2 second(s) 00:08:28.028 10:55:26 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:28.028 10:55:26 -- ../common.sh@72 -- # (( i++ )) 00:08:28.028 10:55:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.028 10:55:26 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:28.028 10:55:26 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:28.028 10:55:26 -- nvmf/run.sh@24 -- # local timen=1 00:08:28.028 10:55:26 -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.028 10:55:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:28.028 10:55:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:28.028 10:55:26 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:28.028 10:55:26 -- nvmf/run.sh@29 -- # port=4413 00:08:28.028 10:55:26 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:28.028 10:55:26 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:28.028 10:55:26 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.028 10:55:26 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:28.028 [2024-12-16 10:55:26.436088] Starting SPDK 
v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:28.028 [2024-12-16 10:55:26.436153] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid656854 ] 00:08:28.028 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.028 [2024-12-16 10:55:26.610738] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.028 [2024-12-16 10:55:26.630093] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:28.029 [2024-12-16 10:55:26.630218] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.288 [2024-12-16 10:55:26.681761] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.288 [2024-12-16 10:55:26.698083] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:28.288 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.288 INFO: Seed: 4052189056 00:08:28.288 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:28.288 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:28.288 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:28.288 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.288 #2 INITED exec/s: 0 rss: 59Mb 00:08:28.288 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:28.288 This may also happen if the target rejected all inputs we tried so far 00:08:28.288 [2024-12-16 10:55:26.753687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.288 [2024-12-16 10:55:26.753715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.288 [2024-12-16 10:55:26.753789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.288 [2024-12-16 10:55:26.753804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.288 [2024-12-16 10:55:26.753861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.288 [2024-12-16 10:55:26.753875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.288 [2024-12-16 10:55:26.753929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.288 [2024-12-16 10:55:26.753942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.548 NEW_FUNC[1/670]: 0x46a798 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:28.548 NEW_FUNC[2/670]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 
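The two NEW_FUNC lines above are libFuzzer resolving the entry points of this run's harness, fuzz_admin_directive_receive_command and TestOneInput in llvm_nvme_fuzz.c. A minimal sketch of a harness of that shape follows; the struct and the submit stub are simplified stand-ins for illustration, not SPDK's actual code (the real TestOneInput drives the NVMe-oF TCP target set up above on 127.0.0.1:4413).

#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Simplified admin submission-queue entry carrying only the fields the
 * log prints (opc, cdw10, cdw11); the real harness uses SPDK's types. */
struct fuzz_cmd {
    uint8_t  opc;     /* 0x19 DIRECTIVE SEND, 0x1a DIRECTIVE RECEIVE, ... */
    uint32_t cdw10;
    uint32_t cdw11;
};

/* Stub: the real harness submits the command to the listening target and
 * polls for the completion that nvme_qpair.c then prints. */
static void submit_admin_cmd(const struct fuzz_cmd *cmd)
{
    (void)cmd;
}

/* libFuzzer calls this once per mutated input.
 * Build (illustrative): clang -g -fsanitize=fuzzer harness.c */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct fuzz_cmd cmd;

    if (size < sizeof(cmd)) {
        return 0;                      /* too short to form a command */
    }
    memcpy(&cmd, data, sizeof(cmd));   /* fuzz bytes become command fields */
    submit_admin_cmd(&cmd);
    return 0;                          /* 0 = input processed, keep going */
}

Every mutated input libFuzzer feeds to this entry point shows up in the stream below as a DIRECTIVE RECEIVE command plus its completion.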
00:08:28.548 #4 NEW cov: 11578 ft: 11573 corp: 2/40b lim: 40 exec/s: 0 rss: 66Mb L: 39/39 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:28.548 [2024-12-16 10:55:27.064036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.548 [2024-12-16 10:55:27.064069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.548 #5 NEW cov: 11691 ft: 12639 corp: 3/51b lim: 40 exec/s: 0 rss: 66Mb L: 11/39 MS: 1 CrossOver- 00:08:28.548 [2024-12-16 10:55:27.114144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffff7fff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.548 [2024-12-16 10:55:27.114170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.548 #6 NEW cov: 11697 ft: 12908 corp: 4/62b lim: 40 exec/s: 0 rss: 66Mb L: 11/39 MS: 1 ChangeBit- 00:08:28.548 [2024-12-16 10:55:27.154564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.548 [2024-12-16 10:55:27.154591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.548 [2024-12-16 10:55:27.154652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.548 [2024-12-16 10:55:27.154666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.548 [2024-12-16 10:55:27.154722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.548 [2024-12-16 10:55:27.154735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.548 [2024-12-16 10:55:27.154795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.548 [2024-12-16 10:55:27.154808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.818 #13 NEW cov: 11782 ft: 13245 corp: 5/101b lim: 40 exec/s: 0 rss: 66Mb L: 39/39 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:28.819 [2024-12-16 10:55:27.194431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.819 [2024-12-16 10:55:27.194455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.819 [2024-12-16 10:55:27.194526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.819 [2024-12-16 10:55:27.194539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.819 #14 NEW cov: 11782 ft: 13581 corp: 6/123b 
lim: 40 exec/s: 0 rss: 66Mb L: 22/39 MS: 1 InsertRepeatedBytes- 00:08:28.819 [2024-12-16 10:55:27.234550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.819 [2024-12-16 10:55:27.234576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.819 [2024-12-16 10:55:27.234636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.819 [2024-12-16 10:55:27.234650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.819 #15 NEW cov: 11782 ft: 13643 corp: 7/145b lim: 40 exec/s: 0 rss: 66Mb L: 22/39 MS: 1 ChangeBinInt- 00:08:28.819 [2024-12-16 10:55:27.274924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.819 [2024-12-16 10:55:27.274950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.819 [2024-12-16 10:55:27.275008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.819 [2024-12-16 10:55:27.275022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.819 [2024-12-16 10:55:27.275077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.819 [2024-12-16 10:55:27.275090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.819 [2024-12-16 10:55:27.275145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.819 [2024-12-16 10:55:27.275158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.819 #16 NEW cov: 11782 ft: 13693 corp: 8/184b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 CopyPart- 00:08:28.819 [2024-12-16 10:55:27.315037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b0b8 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.315063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.820 [2024-12-16 10:55:27.315122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.315139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.820 [2024-12-16 10:55:27.315207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.315222] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.820 [2024-12-16 10:55:27.315278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.315291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.820 #17 NEW cov: 11782 ft: 13729 corp: 9/223b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBit- 00:08:28.820 [2024-12-16 10:55:27.354913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffff8dff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.354938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.820 [2024-12-16 10:55:27.355011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.355025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.820 #18 NEW cov: 11782 ft: 13850 corp: 10/246b lim: 40 exec/s: 0 rss: 67Mb L: 23/39 MS: 1 InsertByte- 00:08:28.820 [2024-12-16 10:55:27.394900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000041 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.394924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.820 #19 NEW cov: 11782 ft: 13928 corp: 11/257b lim: 40 exec/s: 0 rss: 67Mb L: 11/39 MS: 1 ChangeByte- 00:08:28.820 [2024-12-16 10:55:27.435156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00010000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.435181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.820 [2024-12-16 10:55:27.435239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.820 [2024-12-16 10:55:27.435253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.081 #20 NEW cov: 11782 ft: 14015 corp: 12/279b lim: 40 exec/s: 0 rss: 67Mb L: 22/39 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:29.081 [2024-12-16 10:55:27.475115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:cdc08192 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.081 [2024-12-16 10:55:27.475140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.081 #21 NEW cov: 11782 ft: 14036 corp: 13/290b lim: 40 exec/s: 0 rss: 67Mb L: 11/39 MS: 1 CMP- DE: "\377\004S*\315\300\201\222"- 00:08:29.081 [2024-12-16 10:55:27.515622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b04d cdw11:4f4f4f4f SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:29.081 [2024-12-16 10:55:27.515647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.081 [2024-12-16 10:55:27.515720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:4f4f4fb0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.081 [2024-12-16 10:55:27.515738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.081 [2024-12-16 10:55:27.515793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.081 [2024-12-16 10:55:27.515806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.081 [2024-12-16 10:55:27.515863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.081 [2024-12-16 10:55:27.515876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.081 #22 NEW cov: 11782 ft: 14052 corp: 14/329b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBinInt- 00:08:29.082 [2024-12-16 10:55:27.555483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.555508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.555567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.555581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.082 #23 NEW cov: 11782 ft: 14085 corp: 15/351b lim: 40 exec/s: 0 rss: 67Mb L: 22/39 MS: 1 ShuffleBytes- 00:08:29.082 [2024-12-16 10:55:27.595710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00010000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.595735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.595791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.595805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.595878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ff7fff16 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.595892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.082 #24 NEW cov: 11782 ft: 14319 corp: 16/377b lim: 40 exec/s: 0 rss: 67Mb L: 26/39 MS: 1 InsertRepeatedBytes- 
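Each pair of NOTICE lines in this stream is one fuzzed admin command and its completion: nvme_admin_qpair_print_command prints what was submitted (opcode 0x1a DIRECTIVE RECEIVE here, with the mutated cdw10/cdw11 dwords), and spdk_nvme_print_completion prints the target's status (00/01), which is status code type 0h, generic command status, with status code 01h, Invalid Command Opcode. The stand-alone sketch below decodes those fields the same way, for illustration only; it is not SPDK's spdk_nvme_print_completion().

#include <stdint.h>
#include <stdio.h>

/* Decode the "(SCT/SC)" pair printed in the completions above. */
static const char *status_name(uint8_t sct, uint8_t sc)
{
    /* SCT 0h = generic command status; SC 01h = invalid command opcode. */
    return (sct == 0x0 && sc == 0x01) ? "INVALID OPCODE" : "OTHER";
}

int main(void)
{
    uint8_t  sct  = 0x0;     /* status code type */
    uint8_t  sc   = 0x01;    /* status code */
    uint16_t sqhd = 0x0012;  /* submission queue head pointer */

    /* p = phase tag, m = more, dnr = do-not-retry, as in the log lines. */
    printf("%s (%02x/%02x) sqhd:%04x p:0 m:0 dnr:0\n",
           status_name(sct, sc), sct, sc, sqhd);
    return 0;
}

So the target rejects every directive command up front rather than crashing; an input is still reported as NEW whenever it reaches coverage that no earlier input hit.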
00:08:29.082 [2024-12-16 10:55:27.635951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b04d cdw11:4f4f4f4f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.635977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.636035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:4f4fb0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.636048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.636104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b04fb0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.636117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.636177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.636190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.082 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.082 #25 NEW cov: 11805 ft: 14354 corp: 17/416b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ShuffleBytes- 00:08:29.082 [2024-12-16 10:55:27.676079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b0b8 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.676104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.676160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.676174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.676229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.676241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.082 [2024-12-16 10:55:27.676297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.082 [2024-12-16 10:55:27.676310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.082 #26 NEW cov: 11805 ft: 14391 corp: 18/455b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 CopyPart- 00:08:29.341 [2024-12-16 10:55:27.715812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffff7fff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:29.341 [2024-12-16 10:55:27.715836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.341 #27 NEW cov: 11805 ft: 14425 corp: 19/466b lim: 40 exec/s: 27 rss: 67Mb L: 11/39 MS: 1 EraseBytes- 00:08:29.341 [2024-12-16 10:55:27.756034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0affff7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.341 [2024-12-16 10:55:27.756059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.341 [2024-12-16 10:55:27.756115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff000aff cdw11:ff7fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.341 [2024-12-16 10:55:27.756128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.341 #28 NEW cov: 11805 ft: 14434 corp: 20/484b lim: 40 exec/s: 28 rss: 67Mb L: 18/39 MS: 1 CopyPart- 00:08:29.342 [2024-12-16 10:55:27.796032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.796057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.342 #29 NEW cov: 11805 ft: 14452 corp: 21/495b lim: 40 exec/s: 29 rss: 67Mb L: 11/39 MS: 1 CrossOver- 00:08:29.342 [2024-12-16 10:55:27.836502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b04d cdw11:4f4f4f4f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.836527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.342 [2024-12-16 10:55:27.836588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:4f4fb0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.836601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.342 [2024-12-16 10:55:27.836662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b04fb0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.836675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.342 [2024-12-16 10:55:27.836729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.836742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.342 #30 NEW cov: 11805 ft: 14479 corp: 22/534b lim: 40 exec/s: 30 rss: 67Mb L: 39/39 MS: 1 ChangeByte- 00:08:29.342 [2024-12-16 10:55:27.876292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:cdc08192 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.876318] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.342 #31 NEW cov: 11805 ft: 14506 corp: 23/545b lim: 40 exec/s: 31 rss: 67Mb L: 11/39 MS: 1 ChangeByte- 00:08:29.342 [2024-12-16 10:55:27.916516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.916540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.342 [2024-12-16 10:55:27.916597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:4a00ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.916618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.342 #35 NEW cov: 11805 ft: 14513 corp: 24/563b lim: 40 exec/s: 35 rss: 67Mb L: 18/39 MS: 4 EraseBytes-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:29.342 [2024-12-16 10:55:27.956626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:cdc08192 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.956666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.342 [2024-12-16 10:55:27.956726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff04 cdw11:532acdc0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.342 [2024-12-16 10:55:27.956740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.601 #36 NEW cov: 11805 ft: 14533 corp: 25/582b lim: 40 exec/s: 36 rss: 67Mb L: 19/39 MS: 1 PersAutoDict- DE: "\377\004S*\315\300\201\222"- 00:08:29.601 [2024-12-16 10:55:27.996739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0affff7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.601 [2024-12-16 10:55:27.996764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.601 [2024-12-16 10:55:27.996821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff000aff cdw11:ff48ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.601 [2024-12-16 10:55:27.996834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.601 #37 NEW cov: 11805 ft: 14547 corp: 26/600b lim: 40 exec/s: 37 rss: 68Mb L: 18/39 MS: 1 ChangeByte- 00:08:29.601 [2024-12-16 10:55:28.037111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b0b8 cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.037136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.037192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.037205] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.037262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.037275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.037329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.037342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.602 #38 NEW cov: 11805 ft: 14554 corp: 27/639b lim: 40 exec/s: 38 rss: 68Mb L: 39/39 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:29.602 [2024-12-16 10:55:28.077244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.077268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.077324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffbffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.077337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.077394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.077407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.077463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.077475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.602 #39 NEW cov: 11805 ft: 14560 corp: 28/678b lim: 40 exec/s: 39 rss: 68Mb L: 39/39 MS: 1 ChangeBit- 00:08:29.602 [2024-12-16 10:55:28.117357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00b0b0b0 cdw11:b8020000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.117381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.117437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.117451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.117506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:00000aff 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.117519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.117581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff7fffff cdw11:ffb0ffb0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.117594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.602 #40 NEW cov: 11805 ft: 14574 corp: 29/711b lim: 40 exec/s: 40 rss: 68Mb L: 33/39 MS: 1 CrossOver- 00:08:29.602 [2024-12-16 10:55:28.157088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:cdc08192 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.157112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.602 #41 NEW cov: 11805 ft: 14589 corp: 30/726b lim: 40 exec/s: 41 rss: 68Mb L: 15/39 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:29.602 [2024-12-16 10:55:28.197294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:cdc08192 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.197318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.602 [2024-12-16 10:55:28.197376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff53 cdw11:042acdc0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.602 [2024-12-16 10:55:28.197390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.602 #42 NEW cov: 11805 ft: 14597 corp: 31/745b lim: 40 exec/s: 42 rss: 68Mb L: 19/39 MS: 1 ShuffleBytes- 00:08:29.862 [2024-12-16 10:55:28.237323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.237348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.862 #43 NEW cov: 11805 ft: 14640 corp: 32/755b lim: 40 exec/s: 43 rss: 68Mb L: 10/39 MS: 1 EraseBytes- 00:08:29.862 [2024-12-16 10:55:28.277824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00b0b0b0 cdw11:b8020000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.277848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.277905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.277918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.277976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:00000aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 
10:55:28.277989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.278045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff7fffff cdw11:fbb0ffb0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.278058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.862 #44 NEW cov: 11805 ft: 14646 corp: 33/788b lim: 40 exec/s: 44 rss: 68Mb L: 33/39 MS: 1 ChangeBit- 00:08:29.862 [2024-12-16 10:55:28.317833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.317858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.317919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:5757cdc0 cdw11:8192ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.317933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.317990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ff53042a cdw11:cdc08192 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.318003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.862 #45 NEW cov: 11805 ft: 14667 corp: 34/813b lim: 40 exec/s: 45 rss: 68Mb L: 25/39 MS: 1 InsertRepeatedBytes- 00:08:29.862 [2024-12-16 10:55:28.357822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:ffff00ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.357846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.357905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.357918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.862 #46 NEW cov: 11805 ft: 14669 corp: 35/835b lim: 40 exec/s: 46 rss: 68Mb L: 22/39 MS: 1 ShuffleBytes- 00:08:29.862 [2024-12-16 10:55:28.388018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b0b0 cdw11:4fb0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.388042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.388116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.388129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.388187] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.388201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.862 #47 NEW cov: 11805 ft: 14702 corp: 36/861b lim: 40 exec/s: 47 rss: 68Mb L: 26/39 MS: 1 EraseBytes- 00:08:29.862 [2024-12-16 10:55:28.428312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000100ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.428336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.428395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.428408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.428467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.428481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.862 [2024-12-16 10:55:28.428535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffff7fff cdw11:16ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.428548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.862 #48 NEW cov: 11805 ft: 14710 corp: 37/896b lim: 40 exec/s: 48 rss: 68Mb L: 35/39 MS: 1 CopyPart- 00:08:29.862 [2024-12-16 10:55:28.468029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:cdc08192 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.862 [2024-12-16 10:55:28.468053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.122 #49 NEW cov: 11805 ft: 14727 corp: 38/907b lim: 40 exec/s: 49 rss: 68Mb L: 11/39 MS: 1 CrossOver- 00:08:30.122 [2024-12-16 10:55:28.508528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00010000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.508552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.508614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.508628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.508686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ff7fff16 cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 
10:55:28.508699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.508755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.508768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.122 #50 NEW cov: 11805 ft: 14728 corp: 39/943b lim: 40 exec/s: 50 rss: 68Mb L: 36/39 MS: 1 CopyPart- 00:08:30.122 [2024-12-16 10:55:28.548377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:cdc05b81 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.548402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.548459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:925b0100 cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.548473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.122 #51 NEW cov: 11805 ft: 14764 corp: 40/959b lim: 40 exec/s: 51 rss: 68Mb L: 16/39 MS: 1 InsertByte- 00:08:30.122 [2024-12-16 10:55:28.588871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0b0b0b0 cdw11:b02ab0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.588895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.588954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.588968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.589024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.589037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.589092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.589108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.589164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:b0b0b0b0 cdw11:b0b0510a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.589177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:30.122 #52 NEW cov: 11805 ft: 14828 corp: 41/999b lim: 40 exec/s: 52 rss: 68Mb L: 40/40 MS: 1 InsertByte- 00:08:30.122 [2024-12-16 10:55:28.628619] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff04532a cdw11:cdc05b81 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.628643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.122 [2024-12-16 10:55:28.628701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:925b0100 cdw11:000028ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.628714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.122 #53 NEW cov: 11805 ft: 14835 corp: 42/1016b lim: 40 exec/s: 53 rss: 68Mb L: 17/40 MS: 1 InsertByte- 00:08:30.122 [2024-12-16 10:55:28.668599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b0ff4b53 cdw11:2ab0b020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.668630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.122 #56 NEW cov: 11805 ft: 14889 corp: 43/1027b lim: 40 exec/s: 56 rss: 68Mb L: 11/40 MS: 3 CrossOver-ChangeByte-CMP- DE: " \000\000\000"- 00:08:30.122 [2024-12-16 10:55:28.708684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:48ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.122 [2024-12-16 10:55:28.708708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.122 #57 NEW cov: 11805 ft: 14899 corp: 44/1036b lim: 40 exec/s: 28 rss: 69Mb L: 9/40 MS: 1 EraseBytes- 00:08:30.122 #57 DONE cov: 11805 ft: 14899 corp: 44/1036b lim: 40 exec/s: 28 rss: 69Mb 00:08:30.122 ###### Recommended dictionary. ###### 00:08:30.122 "\001\000\000\000" # Uses: 1 00:08:30.122 "\377\004S*\315\300\201\222" # Uses: 1 00:08:30.122 "\002\000\000\000\000\000\000\000" # Uses: 0 00:08:30.122 " \000\000\000" # Uses: 0 00:08:30.123 ###### End of recommended dictionary. 
###### 00:08:30.123 Done 57 runs in 2 second(s) 00:08:30.382 10:55:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:30.382 10:55:28 -- ../common.sh@72 -- # (( i++ )) 00:08:30.382 10:55:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.382 10:55:28 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:30.383 10:55:28 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:30.383 10:55:28 -- nvmf/run.sh@24 -- # local timen=1 00:08:30.383 10:55:28 -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.383 10:55:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:30.383 10:55:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:30.383 10:55:28 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:30.383 10:55:28 -- nvmf/run.sh@29 -- # port=4414 00:08:30.383 10:55:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:30.383 10:55:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:30.383 10:55:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.383 10:55:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:30.383 [2024-12-16 10:55:28.883821] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:30.383 [2024-12-16 10:55:28.883888] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid657165 ] 00:08:30.383 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.642 [2024-12-16 10:55:29.059521] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.642 [2024-12-16 10:55:29.079068] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:30.642 [2024-12-16 10:55:29.079192] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.642 [2024-12-16 10:55:29.130574] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.642 [2024-12-16 10:55:29.146930] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:30.642 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.642 INFO: Seed: 2206207872 00:08:30.642 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:30.642 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:30.642 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:30.642 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.642 #2 INITED exec/s: 0 rss: 59Mb 00:08:30.642 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
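For readers following the run output: every fuzz case in this log is produced by libFuzzer invoking the harness entry point TestOneInput in llvm_nvme_fuzz.c (named in the NEW_FUNC lines just below). A minimal sketch of that shape, assuming the standard libFuzzer hook; struct fuzz_cmd and submit_admin_cmd() here are illustrative placeholders, not SPDK's actual code.

    /* Editor's sketch, not SPDK source. libFuzzer hands each generated input
     * to this hook; the harness reinterprets the bytes as an NVMe admin
     * command and submits it to the target configured above. */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    struct fuzz_cmd {            /* illustrative stand-in for the command dwords */
        uint8_t  opc;            /* admin opcode, e.g. SET FEATURES / GET FEATURES */
        uint32_t cdw10;          /* taken straight from the fuzz input; these are */
        uint32_t cdw11;          /* the cdw10/cdw11 values echoed in the log      */
    };

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        struct fuzz_cmd cmd = {0};

        if (size < sizeof(cmd)) {
            return 0;            /* too short to shape a command; skip this input */
        }
        memcpy(&cmd, data, sizeof(cmd));
        /* submit_admin_cmd(&cmd) -- hypothetical: sending the command and
         * printing its completion is what produces the paired
         * nvme_admin_qpair_print_command / spdk_nvme_print_completion
         * NOTICE records that make up the bulk of this log. */
        return 0;
    }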
00:08:30.642 This may also happen if the target rejected all inputs we tried so far 00:08:30.642 [2024-12-16 10:55:29.222999] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.642 [2024-12-16 10:55:29.223035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.901 NEW_FUNC[1/671]: 0x46c368 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:30.901 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:30.901 #4 NEW cov: 11572 ft: 11573 corp: 2/8b lim: 35 exec/s: 0 rss: 66Mb L: 7/7 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:31.161 [2024-12-16 10:55:29.544923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.161 [2024-12-16 10:55:29.544979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.161 [2024-12-16 10:55:29.545162] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.161 [2024-12-16 10:55:29.545189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.161 NEW_FUNC[1/2]: 0x48d798 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:31.161 NEW_FUNC[2/2]: 0x1151748 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1651 00:08:31.161 #5 NEW cov: 11718 ft: 12799 corp: 3/31b lim: 35 exec/s: 0 rss: 66Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:31.161 [2024-12-16 10:55:29.604291] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.161 [2024-12-16 10:55:29.604328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.161 [2024-12-16 10:55:29.604471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.161 [2024-12-16 10:55:29.604491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.161 #6 NEW cov: 11731 ft: 13161 corp: 4/50b lim: 35 exec/s: 0 rss: 66Mb L: 19/23 MS: 1 InsertRepeatedBytes- 00:08:31.161 [2024-12-16 10:55:29.654140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.161 [2024-12-16 10:55:29.654168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.161 #7 NEW cov: 11816 ft: 13514 corp: 5/57b lim: 35 exec/s: 0 rss: 66Mb L: 7/23 MS: 1 CrossOver- 00:08:31.161 [2024-12-16 10:55:29.714606] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.161 [2024-12-16 10:55:29.714643] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.161 [2024-12-16 10:55:29.714783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.161 [2024-12-16 10:55:29.714811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.161 #8 NEW cov: 11816 ft: 13575 corp: 6/71b lim: 35 exec/s: 0 rss: 66Mb L: 14/23 MS: 1 EraseBytes- 00:08:31.161 [2024-12-16 10:55:29.774530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.161 [2024-12-16 10:55:29.774557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.421 #9 NEW cov: 11816 ft: 13629 corp: 7/78b lim: 35 exec/s: 0 rss: 66Mb L: 7/23 MS: 1 ChangeBinInt- 00:08:31.421 [2024-12-16 10:55:29.824580] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.421 [2024-12-16 10:55:29.824607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.421 #10 NEW cov: 11816 ft: 13670 corp: 8/85b lim: 35 exec/s: 0 rss: 66Mb L: 7/23 MS: 1 ChangeByte- 00:08:31.421 [2024-12-16 10:55:29.874863] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.421 [2024-12-16 10:55:29.874893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.421 #11 NEW cov: 11816 ft: 13717 corp: 9/95b lim: 35 exec/s: 0 rss: 66Mb L: 10/23 MS: 1 CrossOver- 00:08:31.421 [2024-12-16 10:55:29.924987] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.421 [2024-12-16 10:55:29.925017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.421 #12 NEW cov: 11816 ft: 13751 corp: 10/106b lim: 35 exec/s: 0 rss: 66Mb L: 11/23 MS: 1 InsertRepeatedBytes- 00:08:31.421 [2024-12-16 10:55:29.975227] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.421 [2024-12-16 10:55:29.975256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.421 #13 NEW cov: 11816 ft: 13769 corp: 11/113b lim: 35 exec/s: 0 rss: 67Mb L: 7/23 MS: 1 CrossOver- 00:08:31.681 #17 NEW cov: 11816 ft: 13780 corp: 12/120b lim: 35 exec/s: 0 rss: 67Mb L: 7/23 MS: 4 CrossOver-ChangeByte-InsertByte-InsertByte- 00:08:31.681 [2024-12-16 10:55:30.086199] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.681 [2024-12-16 10:55:30.086237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.681 [2024-12-16 10:55:30.086392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.681 [2024-12-16 
10:55:30.086412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.681 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.681 #18 NEW cov: 11839 ft: 13905 corp: 13/143b lim: 35 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 ChangeBit- 00:08:31.681 [2024-12-16 10:55:30.145782] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.681 [2024-12-16 10:55:30.145817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.681 #19 NEW cov: 11839 ft: 13918 corp: 14/151b lim: 35 exec/s: 0 rss: 67Mb L: 8/23 MS: 1 InsertByte- 00:08:31.681 [2024-12-16 10:55:30.195894] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.681 [2024-12-16 10:55:30.195924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.681 #20 NEW cov: 11839 ft: 13922 corp: 15/158b lim: 35 exec/s: 20 rss: 67Mb L: 7/23 MS: 1 ShuffleBytes- 00:08:31.681 [2024-12-16 10:55:30.246389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.681 [2024-12-16 10:55:30.246419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.681 [2024-12-16 10:55:30.246572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.681 [2024-12-16 10:55:30.246590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.681 #21 NEW cov: 11839 ft: 14017 corp: 16/172b lim: 35 exec/s: 21 rss: 67Mb L: 14/23 MS: 1 CMP- DE: "\001\000\000\037"- 00:08:31.940 NEW_FUNC[1/2]: 0x486c28 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:31.940 NEW_FUNC[2/2]: 0x1149dc8 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1489 00:08:31.940 #22 NEW cov: 11896 ft: 14098 corp: 17/183b lim: 35 exec/s: 22 rss: 67Mb L: 11/23 MS: 1 PersAutoDict- DE: "\001\000\000\037"- 00:08:31.940 [2024-12-16 10:55:30.356421] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.940 [2024-12-16 10:55:30.356451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.940 #23 NEW cov: 11896 ft: 14115 corp: 18/190b lim: 35 exec/s: 23 rss: 67Mb L: 7/23 MS: 1 CrossOver- 00:08:31.940 #24 NEW cov: 11896 ft: 14146 corp: 19/201b lim: 35 exec/s: 24 rss: 67Mb L: 11/23 MS: 1 ChangeByte- 00:08:31.940 [2024-12-16 10:55:30.476930] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.940 [2024-12-16 10:55:30.476957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.940 #25 NEW cov: 11896 ft: 14178 corp: 20/213b lim: 35 exec/s: 25 rss: 67Mb 
L: 12/23 MS: 1 CopyPart- 00:08:31.940 [2024-12-16 10:55:30.527373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.940 [2024-12-16 10:55:30.527400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.940 [2024-12-16 10:55:30.527542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.940 [2024-12-16 10:55:30.527560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.940 #26 NEW cov: 11896 ft: 14212 corp: 21/231b lim: 35 exec/s: 26 rss: 67Mb L: 18/23 MS: 1 InsertRepeatedBytes- 00:08:32.200 [2024-12-16 10:55:30.577190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.200 [2024-12-16 10:55:30.577221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.200 #27 NEW cov: 11896 ft: 14243 corp: 22/242b lim: 35 exec/s: 27 rss: 67Mb L: 11/23 MS: 1 CrossOver- 00:08:32.200 [2024-12-16 10:55:30.627395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.200 [2024-12-16 10:55:30.627426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.200 #28 NEW cov: 11896 ft: 14253 corp: 23/249b lim: 35 exec/s: 28 rss: 67Mb L: 7/23 MS: 1 ShuffleBytes- 00:08:32.200 [2024-12-16 10:55:30.677512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT VECTOR CONFIGURATION cid:4 cdw10:00000009 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.200 [2024-12-16 10:55:30.677541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.200 NEW_FUNC[1/1]: 0x48cb18 in feat_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:332 00:08:32.200 #29 NEW cov: 11919 ft: 14292 corp: 24/256b lim: 35 exec/s: 29 rss: 67Mb L: 7/23 MS: 1 CMP- DE: "\011\000\000\000"- 00:08:32.200 [2024-12-16 10:55:30.727716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.200 [2024-12-16 10:55:30.727746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.200 #30 NEW cov: 11919 ft: 14303 corp: 25/268b lim: 35 exec/s: 30 rss: 67Mb L: 12/23 MS: 1 ShuffleBytes- 00:08:32.200 [2024-12-16 10:55:30.788474] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT VECTOR CONFIGURATION cid:4 cdw10:00000009 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.200 [2024-12-16 10:55:30.788502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.200 [2024-12-16 10:55:30.788647] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.200 [2024-12-16 10:55:30.788664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.200 [2024-12-16 10:55:30.788807] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.200 [2024-12-16 10:55:30.788824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.200 #31 NEW cov: 11919 ft: 14432 corp: 26/295b lim: 35 exec/s: 31 rss: 67Mb L: 27/27 MS: 1 PersAutoDict- DE: "\011\000\000\000"- 00:08:32.476 [2024-12-16 10:55:30.837996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.476 [2024-12-16 10:55:30.838024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.476 #32 NEW cov: 11919 ft: 14459 corp: 27/302b lim: 35 exec/s: 32 rss: 67Mb L: 7/27 MS: 1 ShuffleBytes- 00:08:32.476 [2024-12-16 10:55:30.888193] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.476 [2024-12-16 10:55:30.888220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.476 #33 NEW cov: 11919 ft: 14477 corp: 28/314b lim: 35 exec/s: 33 rss: 68Mb L: 12/27 MS: 1 EraseBytes- 00:08:32.477 [2024-12-16 10:55:30.938370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.477 [2024-12-16 10:55:30.938397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.477 #34 NEW cov: 11919 ft: 14500 corp: 29/326b lim: 35 exec/s: 34 rss: 68Mb L: 12/27 MS: 1 ChangeByte- 00:08:32.477 [2024-12-16 10:55:30.988464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.477 [2024-12-16 10:55:30.988494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.477 #35 NEW cov: 11919 ft: 14514 corp: 30/333b lim: 35 exec/s: 35 rss: 68Mb L: 7/27 MS: 1 CopyPart- 00:08:32.477 [2024-12-16 10:55:31.038643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.477 [2024-12-16 10:55:31.038672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.477 #36 NEW cov: 11919 ft: 14521 corp: 31/341b lim: 35 exec/s: 36 rss: 68Mb L: 8/27 MS: 1 InsertByte- 00:08:32.831 [2024-12-16 10:55:31.088961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.831 [2024-12-16 10:55:31.088991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.831 #37 NEW cov: 11919 ft: 14538 corp: 32/353b lim: 35 exec/s: 37 rss: 68Mb L: 12/27 MS: 1 ChangeBinInt- 00:08:32.831 [2024-12-16 10:55:31.150209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000af SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.831 [2024-12-16 10:55:31.150243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE 
ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.831 [2024-12-16 10:55:31.150340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000af SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.831 [2024-12-16 10:55:31.150360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.831 [2024-12-16 10:55:31.150507] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000af SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.831 [2024-12-16 10:55:31.150540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.831 [2024-12-16 10:55:31.150679] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000af SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.831 [2024-12-16 10:55:31.150703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.831 [2024-12-16 10:55:31.150842] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000af SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.831 [2024-12-16 10:55:31.150870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:32.831 #38 NEW cov: 11919 ft: 14911 corp: 33/388b lim: 35 exec/s: 38 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:32.831 [2024-12-16 10:55:31.199205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT VECTOR CONFIGURATION cid:4 cdw10:00000009 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.831 [2024-12-16 10:55:31.199234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.831 #39 NEW cov: 11919 ft: 14924 corp: 34/401b lim: 35 exec/s: 19 rss: 68Mb L: 13/35 MS: 1 CopyPart- 00:08:32.831 #39 DONE cov: 11919 ft: 14924 corp: 34/401b lim: 35 exec/s: 19 rss: 68Mb 00:08:32.831 ###### Recommended dictionary. ###### 00:08:32.831 "\001\000\000\037" # Uses: 1 00:08:32.831 "\011\000\000\000" # Uses: 1 00:08:32.831 ###### End of recommended dictionary. 
###### 00:08:32.831 Done 39 runs in 2 second(s) 00:08:32.831 10:55:31 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:32.831 10:55:31 -- ../common.sh@72 -- # (( i++ )) 00:08:32.831 10:55:31 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.831 10:55:31 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:32.831 10:55:31 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:32.831 10:55:31 -- nvmf/run.sh@24 -- # local timen=1 00:08:32.831 10:55:31 -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.831 10:55:31 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:32.831 10:55:31 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:32.831 10:55:31 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:32.831 10:55:31 -- nvmf/run.sh@29 -- # port=4415 00:08:32.831 10:55:31 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:32.831 10:55:31 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:32.831 10:55:31 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.831 10:55:31 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:32.831 [2024-12-16 10:55:31.372707] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:32.831 [2024-12-16 10:55:31.372773] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid657690 ] 00:08:32.831 EAL: No free 2048 kB hugepages reported on node 1 00:08:33.142 [2024-12-16 10:55:31.545827] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.142 [2024-12-16 10:55:31.565588] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:33.142 [2024-12-16 10:55:31.565734] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.142 [2024-12-16 10:55:31.617233] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.142 [2024-12-16 10:55:31.633556] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:33.142 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.142 INFO: Seed: 394302483 00:08:33.142 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:33.142 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:33.142 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:33.142 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.142 #2 INITED exec/s: 0 rss: 60Mb 00:08:33.142 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
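The "(00/02)" and "(01/0d)" pairs printed with every completion above are the NVMe Status Code Type and Status Code: INVALID FIELD is generic SCT 0x0 / SC 0x02, FEATURE ID NOT SAVEABLE is command-specific SCT 0x1 / SC 0x0d, and p/m/dnr are the phase, more, and do-not-retry bits. A small sketch of unpacking those fields from dword 3 of a completion queue entry, following the NVMe base specification layout; the function and variable names are illustrative, not SPDK's.

    /* Editor's sketch: decode the status fields that
     * spdk_nvme_print_completion reports throughout this log. */
    #include <stdint.h>
    #include <stdio.h>

    static void print_status(uint32_t cpl_dw3)
    {
        uint16_t sf  = (uint16_t)(cpl_dw3 >> 16); /* phase bit + status field */
        unsigned p   = sf & 0x1;                  /* phase tag       */
        unsigned sc  = (sf >> 1) & 0xff;          /* status code     */
        unsigned sct = (sf >> 9) & 0x7;           /* status code type */
        unsigned m   = (sf >> 14) & 0x1;          /* more            */
        unsigned dnr = (sf >> 15) & 0x1;          /* do not retry    */

        printf("(%02x/%02x) p:%u m:%u dnr:%u\n", sct, sc, p, m, dnr);
    }

    int main(void)
    {
        /* SCT 0x0 / SC 0x02 -> prints "(00/02) p:0 m:0 dnr:0",
         * the INVALID FIELD completions seen above */
        print_status(0x0004u << 16);
        return 0;
    }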
00:08:33.142 This may also happen if the target rejected all inputs we tried so far 00:08:33.142 [2024-12-16 10:55:31.681320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.142 [2024-12-16 10:55:31.681348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.142 [2024-12-16 10:55:31.681404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.142 [2024-12-16 10:55:31.681418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.142 [2024-12-16 10:55:31.681474] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.142 [2024-12-16 10:55:31.681487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.142 [2024-12-16 10:55:31.681541] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.142 [2024-12-16 10:55:31.681553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.424 NEW_FUNC[1/669]: 0x46d8a8 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:33.424 NEW_FUNC[2/669]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.424 #18 NEW cov: 11541 ft: 11556 corp: 2/35b lim: 35 exec/s: 0 rss: 66Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:33.424 [2024-12-16 10:55:31.982290] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:31.982322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.424 [2024-12-16 10:55:31.982385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:31.982400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.424 [2024-12-16 10:55:31.982459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:31.982473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.424 [2024-12-16 10:55:31.982528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:31.982541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.424 [2024-12-16 10:55:31.982602] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:31.982620] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:33.424 NEW_FUNC[1/1]: 0x1cc7848 in thread_execute_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:934 00:08:33.424 #19 NEW cov: 11673 ft: 12004 corp: 3/70b lim: 35 exec/s: 0 rss: 66Mb L: 35/35 MS: 1 InsertByte- 00:08:33.424 [2024-12-16 10:55:32.032255] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:32.032282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.424 [2024-12-16 10:55:32.032344] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:32.032358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.424 [2024-12-16 10:55:32.032419] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:32.032431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.424 [2024-12-16 10:55:32.032489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.424 [2024-12-16 10:55:32.032502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.684 #20 NEW cov: 11679 ft: 12345 corp: 4/104b lim: 35 exec/s: 0 rss: 66Mb L: 34/35 MS: 1 CrossOver- 00:08:33.684 [2024-12-16 10:55:32.072472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.072499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.072561] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.072578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.072640] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.072653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.072732] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.072745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.072805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.072819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:33.684 #26 NEW cov: 11764 ft: 12546 corp: 5/139b lim: 35 exec/s: 0 rss: 66Mb L: 35/35 MS: 1 CopyPart- 00:08:33.684 [2024-12-16 10:55:32.112613] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.112638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.112701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.112715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.112793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.112806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.112867] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.112881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.112941] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.112954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:33.684 #27 NEW cov: 11764 ft: 12686 corp: 6/174b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 CopyPart- 00:08:33.684 [2024-12-16 10:55:32.152691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.152717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.152795] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.152810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.152871] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.152885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.684 NEW_FUNC[1/1]: 0x48d798 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:33.684 #31 NEW cov: 11778 ft: 12922 corp: 7/202b lim: 35 exec/s: 0 rss: 67Mb L: 28/35 MS: 4 ShuffleBytes-ShuffleBytes-ShuffleBytes-CrossOver- 00:08:33.684 [2024-12-16 10:55:32.192866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 
10:55:32.192892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.192966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.192981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.193042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.193055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.193117] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.193130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.193191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.193205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:33.684 #32 NEW cov: 11778 ft: 12977 corp: 8/237b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ChangeByte- 00:08:33.684 [2024-12-16 10:55:32.232990] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.233015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.233091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.233105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.233167] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.233180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.233241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.233255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.684 [2024-12-16 10:55:32.233315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.684 [2024-12-16 10:55:32.233328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:33.684 #33 NEW cov: 11778 ft: 13013 corp: 9/272b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ChangeBit- 00:08:33.685 [2024-12-16 10:55:32.272958] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.685 [2024-12-16 10:55:32.272983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.685 [2024-12-16 10:55:32.273059] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.685 [2024-12-16 10:55:32.273073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.685 [2024-12-16 10:55:32.273138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.685 [2024-12-16 10:55:32.273151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.685 [2024-12-16 10:55:32.273209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.685 [2024-12-16 10:55:32.273223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.685 #34 NEW cov: 11778 ft: 13123 corp: 10/300b lim: 35 exec/s: 0 rss: 67Mb L: 28/35 MS: 1 CrossOver- 00:08:33.945 [2024-12-16 10:55:32.313244] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.313270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.313334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.313348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.313408] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.313422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.313481] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.313494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.313555] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.313569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:33.945 #35 NEW cov: 11778 ft: 13153 corp: 11/335b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ChangeByte- 00:08:33.945 [2024-12-16 10:55:32.353262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.353288] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.353347] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.353361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.353422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.353435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.353497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.353511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.945 #36 NEW cov: 11778 ft: 13165 corp: 12/367b lim: 35 exec/s: 0 rss: 67Mb L: 32/35 MS: 1 EraseBytes- 00:08:33.945 [2024-12-16 10:55:32.393311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.393336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.393400] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.393414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.393475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.393488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.393545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.393558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.945 #37 NEW cov: 11778 ft: 13175 corp: 13/395b lim: 35 exec/s: 0 rss: 67Mb L: 28/35 MS: 1 ChangeBinInt- 00:08:33.945 [2024-12-16 10:55:32.433442] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.433468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.433546] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.433560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.433623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.433638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.433681] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.433694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.945 #38 NEW cov: 11778 ft: 13200 corp: 14/429b lim: 35 exec/s: 0 rss: 67Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:33.945 [2024-12-16 10:55:32.473642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.473666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.473742] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.473757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.473827] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.473840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.473900] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.473913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.473973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.473987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:33.945 #39 NEW cov: 11778 ft: 13299 corp: 15/464b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 CopyPart- 00:08:33.945 [2024-12-16 10:55:32.513538] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.513563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.513643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.513658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.513718] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.513732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.945 #40 NEW cov: 11778 ft: 13665 corp: 16/490b lim: 35 exec/s: 0 rss: 67Mb L: 26/35 MS: 1 EraseBytes- 00:08:33.945 [2024-12-16 10:55:32.553705] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.553730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.945 [2024-12-16 10:55:32.553793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.945 [2024-12-16 10:55:32.553807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.205 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:34.205 #41 NEW cov: 11801 ft: 13804 corp: 17/517b lim: 35 exec/s: 0 rss: 67Mb L: 27/35 MS: 1 InsertRepeatedBytes- 00:08:34.205 [2024-12-16 10:55:32.593870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.593894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.205 [2024-12-16 10:55:32.593956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.593969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.205 [2024-12-16 10:55:32.594029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.594042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.205 [2024-12-16 10:55:32.594102] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.594115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.205 #47 NEW cov: 11801 ft: 13817 corp: 18/551b lim: 35 exec/s: 0 rss: 67Mb L: 34/35 MS: 1 ChangeBit- 00:08:34.205 [2024-12-16 10:55:32.634069] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.634094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.205 [2024-12-16 10:55:32.634154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.634168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.205 [2024-12-16 10:55:32.634231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.634245] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.205 #48 NEW cov: 11801 ft: 13838 corp: 19/579b lim: 35 exec/s: 0 rss: 67Mb L: 28/35 MS: 1 InsertByte- 00:08:34.205 [2024-12-16 10:55:32.673890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.673915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.205 [2024-12-16 10:55:32.673975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.673989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.205 #49 NEW cov: 11801 ft: 14033 corp: 20/597b lim: 35 exec/s: 49 rss: 67Mb L: 18/35 MS: 1 CrossOver- 00:08:34.205 [2024-12-16 10:55:32.714386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.205 [2024-12-16 10:55:32.714411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.714471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.714485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.714544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.714558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.714623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.714637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.714697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.714710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:34.206 #50 NEW cov: 11801 ft: 14044 corp: 21/632b lim: 35 exec/s: 50 rss: 68Mb L: 35/35 MS: 1 ChangeBit- 00:08:34.206 [2024-12-16 10:55:32.754497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.754522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.754601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.754619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.754682] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.754695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.754756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.754769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.754833] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.754848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:34.206 #51 NEW cov: 11801 ft: 14115 corp: 22/667b lim: 35 exec/s: 51 rss: 68Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:34.206 [2024-12-16 10:55:32.794648] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.794673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.794751] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.794765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.794822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.794836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.794897] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.794910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.206 [2024-12-16 10:55:32.794969] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.206 [2024-12-16 10:55:32.794983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:34.206 #52 NEW cov: 11801 ft: 14175 corp: 23/702b lim: 35 exec/s: 52 rss: 68Mb L: 35/35 MS: 1 CopyPart- 00:08:34.466 [2024-12-16 10:55:32.834578] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.834604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.834670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000039 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 
[2024-12-16 10:55:32.834684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.466 #54 NEW cov: 11801 ft: 14248 corp: 24/726b lim: 35 exec/s: 54 rss: 68Mb L: 24/35 MS: 2 InsertByte-CrossOver- 00:08:34.466 [2024-12-16 10:55:32.874745] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.874769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.874832] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.874846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.874906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.874919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.874981] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.874998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.466 #55 NEW cov: 11801 ft: 14261 corp: 25/756b lim: 35 exec/s: 55 rss: 68Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:08:34.466 [2024-12-16 10:55:32.914897] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.914922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.914985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.914998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.915060] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.915073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.915134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.915148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.466 #56 NEW cov: 11801 ft: 14289 corp: 26/784b lim: 35 exec/s: 56 rss: 68Mb L: 28/35 MS: 1 ChangeBinInt- 00:08:34.466 [2024-12-16 10:55:32.954879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.954904] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.954964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.954978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.955039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.955052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.466 #57 NEW cov: 11801 ft: 14414 corp: 27/806b lim: 35 exec/s: 57 rss: 68Mb L: 22/35 MS: 1 EraseBytes- 00:08:34.466 [2024-12-16 10:55:32.995175] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.995200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.995260] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.995274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.995334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.995347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.466 [2024-12-16 10:55:32.995407] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:32.995420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.466 #58 NEW cov: 11801 ft: 14424 corp: 28/838b lim: 35 exec/s: 58 rss: 68Mb L: 32/35 MS: 1 ChangeByte- 00:08:34.466 #59 NEW cov: 11801 ft: 14638 corp: 29/851b lim: 35 exec/s: 59 rss: 68Mb L: 13/35 MS: 1 EraseBytes- 00:08:34.466 [2024-12-16 10:55:33.075525] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.466 [2024-12-16 10:55:33.075549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.467 [2024-12-16 10:55:33.075630] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.467 [2024-12-16 10:55:33.075644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.467 [2024-12-16 10:55:33.075703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.467 [2024-12-16 10:55:33.075717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.467 
[2024-12-16 10:55:33.075780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.467 [2024-12-16 10:55:33.075793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.467 [2024-12-16 10:55:33.075851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.467 [2024-12-16 10:55:33.075863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:34.726 #60 NEW cov: 11801 ft: 14650 corp: 30/886b lim: 35 exec/s: 60 rss: 68Mb L: 35/35 MS: 1 CopyPart- 00:08:34.726 [2024-12-16 10:55:33.115350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.726 [2024-12-16 10:55:33.115374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.726 [2024-12-16 10:55:33.115435] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.115448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.115510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.115523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.727 #61 NEW cov: 11801 ft: 14654 corp: 31/909b lim: 35 exec/s: 61 rss: 68Mb L: 23/35 MS: 1 InsertByte- 00:08:34.727 [2024-12-16 10:55:33.155472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.155496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.155557] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.155571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.155633] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.155663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.727 #62 NEW cov: 11801 ft: 14658 corp: 32/935b lim: 35 exec/s: 62 rss: 68Mb L: 26/35 MS: 1 CMP- DE: "\015\000\000\000"- 00:08:34.727 [2024-12-16 10:55:33.195726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.195753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.195831] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.195845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.195908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.195921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.195981] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.195995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.727 #63 NEW cov: 11801 ft: 14685 corp: 33/964b lim: 35 exec/s: 63 rss: 68Mb L: 29/35 MS: 1 EraseBytes- 00:08:34.727 [2024-12-16 10:55:33.235955] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.235979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.236043] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.236056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.236119] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.236132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.236194] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.236207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.236267] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.236280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:34.727 #64 NEW cov: 11801 ft: 14691 corp: 34/999b lim: 35 exec/s: 64 rss: 68Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:34.727 #65 NEW cov: 11801 ft: 14710 corp: 35/1011b lim: 35 exec/s: 65 rss: 68Mb L: 12/35 MS: 1 CrossOver- 00:08:34.727 [2024-12-16 10:55:33.316192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.316217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.316280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:34.727 [2024-12-16 10:55:33.316294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.316353] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.316367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.316427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.316443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.727 [2024-12-16 10:55:33.316505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.727 [2024-12-16 10:55:33.316518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:34.727 #66 NEW cov: 11801 ft: 14726 corp: 36/1046b lim: 35 exec/s: 66 rss: 68Mb L: 35/35 MS: 1 ChangeByte- 00:08:34.988 [2024-12-16 10:55:33.356314] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.356339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.356400] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.356414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.356490] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.356504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.356567] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.356580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.356643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.356657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:34.988 #67 NEW cov: 11801 ft: 14765 corp: 37/1081b lim: 35 exec/s: 67 rss: 68Mb L: 35/35 MS: 1 ChangeBit- 00:08:34.988 [2024-12-16 10:55:33.396168] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.396192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.396255] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.396269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.396327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.396341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.988 #68 NEW cov: 11801 ft: 14810 corp: 38/1106b lim: 35 exec/s: 68 rss: 68Mb L: 25/35 MS: 1 EraseBytes- 00:08:34.988 [2024-12-16 10:55:33.436496] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.436521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.436586] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.436601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.436671] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.436687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.988 #69 NEW cov: 11801 ft: 14817 corp: 39/1134b lim: 35 exec/s: 69 rss: 68Mb L: 28/35 MS: 1 ShuffleBytes- 00:08:34.988 [2024-12-16 10:55:33.476554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.476581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.476659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.476674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.476735] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.476748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.476809] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.476822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.506634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.506658] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.506724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.506738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.506811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.506826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.506888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.506903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.988 #71 NEW cov: 11801 ft: 14909 corp: 40/1168b lim: 35 exec/s: 71 rss: 68Mb L: 34/35 MS: 2 ShuffleBytes-ChangeBinInt- 00:08:34.988 [2024-12-16 10:55:33.546868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.546894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.546974] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.546988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.547048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.547062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.547121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.547138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.988 [2024-12-16 10:55:33.547200] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.547214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:34.988 #72 NEW cov: 11801 ft: 14917 corp: 41/1203b lim: 35 exec/s: 72 rss: 68Mb L: 35/35 MS: 1 CopyPart- 00:08:34.988 [2024-12-16 10:55:33.586616] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.988 [2024-12-16 10:55:33.586642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.988 #73 NEW cov: 11801 ft: 15051 corp: 42/1223b lim: 35 exec/s: 73 rss: 68Mb 
L: 20/35 MS: 1 EraseBytes- 00:08:35.248 [2024-12-16 10:55:33.626973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.248 [2024-12-16 10:55:33.626999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.248 [2024-12-16 10:55:33.627077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.248 [2024-12-16 10:55:33.627092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.248 [2024-12-16 10:55:33.627156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.248 [2024-12-16 10:55:33.627170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.248 [2024-12-16 10:55:33.627232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.248 [2024-12-16 10:55:33.627245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.248 #74 NEW cov: 11801 ft: 15079 corp: 43/1252b lim: 35 exec/s: 74 rss: 69Mb L: 29/35 MS: 1 ChangeBinInt- 00:08:35.248 [2024-12-16 10:55:33.667116] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.248 [2024-12-16 10:55:33.667141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.249 [2024-12-16 10:55:33.667203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.249 [2024-12-16 10:55:33.667216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.249 [2024-12-16 10:55:33.667279] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.249 [2024-12-16 10:55:33.667293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.249 [2024-12-16 10:55:33.667353] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.249 [2024-12-16 10:55:33.667367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.249 #75 NEW cov: 11801 ft: 15083 corp: 44/1284b lim: 35 exec/s: 37 rss: 69Mb L: 32/35 MS: 1 CopyPart- 00:08:35.249 #75 DONE cov: 11801 ft: 15083 corp: 44/1284b lim: 35 exec/s: 37 rss: 69Mb 00:08:35.249 ###### Recommended dictionary. ###### 00:08:35.249 "\015\000\000\000" # Uses: 0 00:08:35.249 ###### End of recommended dictionary. 
###### 00:08:35.249 Done 75 runs in 2 second(s) 00:08:35.249 10:55:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:08:35.249 10:55:33 -- ../common.sh@72 -- # (( i++ )) 00:08:35.249 10:55:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.249 10:55:33 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:35.249 10:55:33 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:35.249 10:55:33 -- nvmf/run.sh@24 -- # local timen=1 00:08:35.249 10:55:33 -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.249 10:55:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:35.249 10:55:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:35.249 10:55:33 -- nvmf/run.sh@29 -- # printf %02d 16 00:08:35.249 10:55:33 -- nvmf/run.sh@29 -- # port=4416 00:08:35.249 10:55:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:35.249 10:55:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:35.249 10:55:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.249 10:55:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:08:35.249 [2024-12-16 10:55:33.847648] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:35.249 [2024-12-16 10:55:33.847717] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid658184 ] 00:08:35.508 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.508 [2024-12-16 10:55:34.035554] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.508 [2024-12-16 10:55:34.054917] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:35.508 [2024-12-16 10:55:34.055058] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.508 [2024-12-16 10:55:34.106350] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:35.508 [2024-12-16 10:55:34.122682] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:35.768 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.768 INFO: Seed: 2886256256 00:08:35.768 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:35.768 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:35.768 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:35.768 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.768 #2 INITED exec/s: 0 rss: 59Mb 00:08:35.768 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:35.768 This may also happen if the target rejected all inputs we tried so far 00:08:35.768 [2024-12-16 10:55:34.178085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.768 [2024-12-16 10:55:34.178115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.768 [2024-12-16 10:55:34.178154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.768 [2024-12-16 10:55:34.178171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.768 [2024-12-16 10:55:34.178226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.768 [2024-12-16 10:55:34.178241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.768 [2024-12-16 10:55:34.178295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.768 [2024-12-16 10:55:34.178313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.028 NEW_FUNC[1/671]: 0x46ed68 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:36.028 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:36.028 #19 NEW cov: 11663 ft: 11662 corp: 2/86b lim: 105 exec/s: 0 rss: 66Mb L: 85/85 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:36.028 [2024-12-16 10:55:34.478772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.478804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.478847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.478862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.478914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.478928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.478981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.478996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.028 #21 NEW cov: 11776 ft: 12230 corp: 3/179b lim: 105 exec/s: 0 rss: 66Mb L: 93/93 MS: 2 
CopyPart-InsertRepeatedBytes- 00:08:36.028 [2024-12-16 10:55:34.518785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.518813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.518851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.518867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.518920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.518934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.518986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476429600881083267 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.519001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.028 #22 NEW cov: 11782 ft: 12470 corp: 4/273b lim: 105 exec/s: 0 rss: 66Mb L: 94/94 MS: 1 CrossOver- 00:08:36.028 [2024-12-16 10:55:34.558948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.558974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.559032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8755986701408764803 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.559051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.559103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.559117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.559167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.559181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.028 #23 NEW cov: 11867 ft: 12683 corp: 5/367b lim: 105 exec/s: 0 rss: 66Mb L: 94/94 MS: 1 InsertByte- 00:08:36.028 [2024-12-16 10:55:34.598922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.598950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.598988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.599003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.599057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.599072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.028 #24 NEW cov: 11867 ft: 13267 corp: 6/436b lim: 105 exec/s: 0 rss: 66Mb L: 69/94 MS: 1 InsertRepeatedBytes- 00:08:36.028 [2024-12-16 10:55:34.639107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.639134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.639174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.639190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.639241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.639256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.028 [2024-12-16 10:55:34.639306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.028 [2024-12-16 10:55:34.639321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.288 #25 NEW cov: 11867 ft: 13301 corp: 7/529b lim: 105 exec/s: 0 rss: 66Mb L: 93/94 MS: 1 ChangeBit- 00:08:36.288 [2024-12-16 10:55:34.679091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.288 [2024-12-16 10:55:34.679119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.288 [2024-12-16 10:55:34.679163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.288 [2024-12-16 10:55:34.679179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.288 [2024-12-16 10:55:34.679232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.288 [2024-12-16 10:55:34.679247] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.288 #26 NEW cov: 11867 ft: 13440 corp: 8/598b lim: 105 exec/s: 0 rss: 66Mb L: 69/94 MS: 1 ShuffleBytes- 00:08:36.288 [2024-12-16 10:55:34.719201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.288 [2024-12-16 10:55:34.719228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.288 [2024-12-16 10:55:34.719264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.288 [2024-12-16 10:55:34.719279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.288 [2024-12-16 10:55:34.719332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.288 [2024-12-16 10:55:34.719347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.289 #27 NEW cov: 11867 ft: 13465 corp: 9/668b lim: 105 exec/s: 0 rss: 66Mb L: 70/94 MS: 1 InsertByte- 00:08:36.289 [2024-12-16 10:55:34.759436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.759463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.759519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.759535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.759585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.759599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.759673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.759689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.289 #28 NEW cov: 11867 ft: 13495 corp: 10/761b lim: 105 exec/s: 0 rss: 66Mb L: 93/94 MS: 1 ChangeBinInt- 00:08:36.289 [2024-12-16 10:55:34.789555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.789582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.789640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
READ sqid:1 cid:1 nsid:0 lba:9476562639581610115 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.789662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.789723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.789738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.789790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.789805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.289 #29 NEW cov: 11867 ft: 13517 corp: 11/858b lim: 105 exec/s: 0 rss: 66Mb L: 97/97 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:36.289 [2024-12-16 10:55:34.829700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.829727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.829786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.829802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.829854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.829868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.829922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476429600881083267 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.829937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.289 #30 NEW cov: 11867 ft: 13602 corp: 12/952b lim: 105 exec/s: 0 rss: 66Mb L: 94/97 MS: 1 ShuffleBytes- 00:08:36.289 [2024-12-16 10:55:34.869848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.869874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.869935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.869951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.870002] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.870017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.870070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.870084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.289 #31 NEW cov: 11867 ft: 13638 corp: 13/1045b lim: 105 exec/s: 0 rss: 66Mb L: 93/97 MS: 1 ShuffleBytes- 00:08:36.289 [2024-12-16 10:55:34.909956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.909982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.910038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562639581610115 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.910054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.910107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.910122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.289 [2024-12-16 10:55:34.910173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.289 [2024-12-16 10:55:34.910187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.549 #32 NEW cov: 11867 ft: 13666 corp: 14/1142b lim: 105 exec/s: 0 rss: 66Mb L: 97/97 MS: 1 ChangeByte- 00:08:36.549 [2024-12-16 10:55:34.950057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:34.950084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.549 [2024-12-16 10:55:34.950139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:34.950155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.549 [2024-12-16 10:55:34.950208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:34.950224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:36.549 [2024-12-16 10:55:34.950276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744070421217279 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:34.950290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.549 #33 NEW cov: 11867 ft: 13762 corp: 15/1235b lim: 105 exec/s: 0 rss: 66Mb L: 93/97 MS: 1 CopyPart- 00:08:36.549 [2024-12-16 10:55:34.990158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:34.990185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.549 [2024-12-16 10:55:34.990227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8755986701408764803 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:34.990242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.549 [2024-12-16 10:55:34.990294] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:34.990308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.549 [2024-12-16 10:55:34.990362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:34.990377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.549 #34 NEW cov: 11867 ft: 13780 corp: 16/1329b lim: 105 exec/s: 0 rss: 66Mb L: 94/97 MS: 1 ChangeBit- 00:08:36.549 [2024-12-16 10:55:35.030234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:35.030261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.549 [2024-12-16 10:55:35.030321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641620272003 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:35.030337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.549 [2024-12-16 10:55:35.030389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.549 [2024-12-16 10:55:35.030404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.030455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.030470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.550 #35 NEW cov: 11867 ft: 13810 corp: 17/1419b lim: 105 exec/s: 0 rss: 66Mb L: 90/97 MS: 1 EraseBytes- 00:08:36.550 [2024-12-16 10:55:35.070360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.070386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.070448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.070464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.070516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1912602624 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.070531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.070582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.070597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.550 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:36.550 #36 NEW cov: 11890 ft: 13909 corp: 18/1505b lim: 105 exec/s: 0 rss: 67Mb L: 86/97 MS: 1 InsertByte- 00:08:36.550 [2024-12-16 10:55:35.110517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.110543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.110589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.110604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.110663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.110677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.110731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.110746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.550 #37 NEW cov: 11890 ft: 13962 corp: 19/1598b lim: 105 exec/s: 0 rss: 67Mb L: 93/97 MS: 1 CopyPart- 00:08:36.550 [2024-12-16 10:55:35.150623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 
lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.150649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.150710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562639581610115 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.150726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.150778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.150793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.550 [2024-12-16 10:55:35.150845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.550 [2024-12-16 10:55:35.150860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.550 #38 NEW cov: 11890 ft: 13979 corp: 20/1695b lim: 105 exec/s: 38 rss: 67Mb L: 97/97 MS: 1 ChangeBinInt- 00:08:36.808 [2024-12-16 10:55:35.190528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.808 [2024-12-16 10:55:35.190557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.808 [2024-12-16 10:55:35.190616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.808 [2024-12-16 10:55:35.190636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.808 #39 NEW cov: 11890 ft: 14318 corp: 21/1750b lim: 105 exec/s: 39 rss: 67Mb L: 55/97 MS: 1 EraseBytes- 00:08:36.808 [2024-12-16 10:55:35.230767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65531 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.808 [2024-12-16 10:55:35.230795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.808 [2024-12-16 10:55:35.230830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.808 [2024-12-16 10:55:35.230846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.808 [2024-12-16 10:55:35.230897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.808 [2024-12-16 10:55:35.230912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.808 #40 NEW cov: 11890 ft: 14322 corp: 22/1819b lim: 105 exec/s: 40 
rss: 67Mb L: 69/97 MS: 1 ChangeBinInt- 00:08:36.808 [2024-12-16 10:55:35.270993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.808 [2024-12-16 10:55:35.271026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.808 [2024-12-16 10:55:35.271063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.271079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.271129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.271144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.271197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.271212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.809 #41 NEW cov: 11890 ft: 14331 corp: 23/1910b lim: 105 exec/s: 41 rss: 67Mb L: 91/97 MS: 1 InsertRepeatedBytes- 00:08:36.809 [2024-12-16 10:55:35.311138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.311164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.311218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.311231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.311285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1912602624 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.311300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.311353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.311368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.809 #42 NEW cov: 11890 ft: 14340 corp: 24/2000b lim: 105 exec/s: 42 rss: 67Mb L: 90/97 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:36.809 [2024-12-16 10:55:35.351279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65531 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.351306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.351343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.351356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.351405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.351420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.351474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.351491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.809 #43 NEW cov: 11890 ft: 14357 corp: 25/2087b lim: 105 exec/s: 43 rss: 67Mb L: 87/97 MS: 1 CopyPart- 00:08:36.809 [2024-12-16 10:55:35.391380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:176390912 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.391407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.391445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.391459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.391502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.391516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.391567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.391582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.809 #49 NEW cov: 11890 ft: 14372 corp: 26/2189b lim: 105 exec/s: 49 rss: 67Mb L: 102/102 MS: 1 CrossOver- 00:08:36.809 [2024-12-16 10:55:35.431502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:176390912 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.431528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.431584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.431599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.431658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.431674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.809 [2024-12-16 10:55:35.431726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.809 [2024-12-16 10:55:35.431742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.068 #50 NEW cov: 11890 ft: 14390 corp: 27/2291b lim: 105 exec/s: 50 rss: 67Mb L: 102/102 MS: 1 ChangeBinInt- 00:08:37.068 [2024-12-16 10:55:35.471597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.471647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.471711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18014398509481983999 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.471733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.471796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.471820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.471872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.471892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.068 #51 NEW cov: 11890 ft: 14402 corp: 28/2382b lim: 105 exec/s: 51 rss: 67Mb L: 91/102 MS: 1 ChangeBinInt- 00:08:37.068 [2024-12-16 10:55:35.511762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.511789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.511836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.511852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.511902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1912602624 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.511918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.511968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.511983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.068 #52 NEW cov: 11890 ft: 14408 corp: 29/2479b lim: 105 exec/s: 52 rss: 67Mb L: 97/102 MS: 1 InsertRepeatedBytes- 00:08:37.068 [2024-12-16 10:55:35.551758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.551784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.551844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.551860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.551914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.551929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.068 #53 NEW cov: 11890 ft: 14470 corp: 30/2552b lim: 105 exec/s: 53 rss: 67Mb L: 73/102 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:37.068 [2024-12-16 10:55:35.591979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.592005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.592054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.592070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.592121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.592154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.592206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.592221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.068 #54 NEW cov: 11890 ft: 14522 corp: 31/2645b lim: 105 exec/s: 54 rss: 67Mb L: 93/102 MS: 1 ChangeBinInt- 00:08:37.068 [2024-12-16 10:55:35.632108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 
len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.632134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.632179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.632194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.632247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.632262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.068 [2024-12-16 10:55:35.632313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476429600881115011 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.068 [2024-12-16 10:55:35.632328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.068 #55 NEW cov: 11890 ft: 14546 corp: 32/2739b lim: 105 exec/s: 55 rss: 67Mb L: 94/102 MS: 1 ChangeByte- 00:08:37.069 [2024-12-16 10:55:35.671987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6004234345728136019 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.069 [2024-12-16 10:55:35.672013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.069 [2024-12-16 10:55:35.672049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6004234345560363859 len:21332 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.069 [2024-12-16 10:55:35.672065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.328 #58 NEW cov: 11890 ft: 14572 corp: 33/2789b lim: 105 exec/s: 58 rss: 67Mb L: 50/102 MS: 3 ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:37.328 [2024-12-16 10:55:35.712198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65531 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.328 [2024-12-16 10:55:35.712224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.328 [2024-12-16 10:55:35.712263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4294967295 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.328 [2024-12-16 10:55:35.712278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.328 [2024-12-16 10:55:35.712331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.328 [2024-12-16 10:55:35.712347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.328 #59 NEW cov: 11890 ft: 14589 corp: 34/2862b lim: 105 exec/s: 59 
rss: 67Mb L: 73/102 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:37.328 [2024-12-16 10:55:35.752399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.328 [2024-12-16 10:55:35.752426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.328 [2024-12-16 10:55:35.752471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.328 [2024-12-16 10:55:35.752485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.328 [2024-12-16 10:55:35.752537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.328 [2024-12-16 10:55:35.752551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.328 [2024-12-16 10:55:35.752604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.328 [2024-12-16 10:55:35.752622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.328 #60 NEW cov: 11890 ft: 14614 corp: 35/2953b lim: 105 exec/s: 60 rss: 67Mb L: 91/102 MS: 1 ChangeByte- 00:08:37.329 [2024-12-16 10:55:35.792507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.792533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.792582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.792596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.792666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.792682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.792734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.792749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.329 #61 NEW cov: 11890 ft: 14631 corp: 36/3046b lim: 105 exec/s: 61 rss: 67Mb L: 93/102 MS: 1 CopyPart- 00:08:37.329 [2024-12-16 10:55:35.832547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.832574] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.832622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044179 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.832638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.832690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.832706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.329 #62 NEW cov: 11890 ft: 14644 corp: 37/3126b lim: 105 exec/s: 62 rss: 67Mb L: 80/102 MS: 1 EraseBytes- 00:08:37.329 [2024-12-16 10:55:35.872826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.872853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.872893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.872908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.872959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3166485504 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.872974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.873026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.873041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.329 #63 NEW cov: 11890 ft: 14645 corp: 38/3223b lim: 105 exec/s: 63 rss: 67Mb L: 97/102 MS: 1 InsertRepeatedBytes- 00:08:37.329 [2024-12-16 10:55:35.912688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.912714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.329 [2024-12-16 10:55:35.912764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.329 [2024-12-16 10:55:35.912780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.329 #64 NEW cov: 11890 ft: 14665 corp: 39/3278b lim: 105 exec/s: 64 rss: 67Mb L: 55/102 MS: 1 ChangeByte- 00:08:37.589 [2024-12-16 10:55:35.953040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:35.953068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:35.953121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8755986701408764803 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:35.953137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:35.953189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:35.953205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:35.953255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:35.953271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.589 #65 NEW cov: 11890 ft: 14694 corp: 40/3372b lim: 105 exec/s: 65 rss: 67Mb L: 94/102 MS: 1 ChangeBit- 00:08:37.589 [2024-12-16 10:55:35.993171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:35.993198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:35.993239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:35.993255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:35.993306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:35.993321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:35.993374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476429600881115011 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:35.993388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.589 #66 NEW cov: 11890 ft: 14699 corp: 41/3466b lim: 105 exec/s: 66 rss: 68Mb L: 94/102 MS: 1 ChangeByte- 00:08:37.589 [2024-12-16 10:55:36.033236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:176390912 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.033262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:36.033315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 
lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.033330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:36.033379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.033395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:36.033446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.033461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.589 #67 NEW cov: 11890 ft: 14706 corp: 42/3552b lim: 105 exec/s: 67 rss: 68Mb L: 86/102 MS: 1 EraseBytes- 00:08:37.589 [2024-12-16 10:55:36.073125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.073151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:36.073188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.073203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.589 #68 NEW cov: 11890 ft: 14708 corp: 43/3607b lim: 105 exec/s: 68 rss: 68Mb L: 55/102 MS: 1 ChangeByte- 00:08:37.589 [2024-12-16 10:55:36.113499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562639758001027 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.113525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:36.113573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562639581610115 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.113589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:36.113664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.113680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.589 [2024-12-16 10:55:36.113733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:9476562641620272003 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.589 [2024-12-16 10:55:36.113747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.589 #69 NEW cov: 11890 ft: 14713 corp: 44/3704b lim: 105 exec/s: 69 rss: 
68Mb L: 97/102 MS: 1 CrossOver- 00:08:37.590 [2024-12-16 10:55:36.153367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.590 [2024-12-16 10:55:36.153393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.590 [2024-12-16 10:55:36.153430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.590 [2024-12-16 10:55:36.153445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.590 #70 NEW cov: 11890 ft: 14723 corp: 45/3753b lim: 105 exec/s: 35 rss: 68Mb L: 49/102 MS: 1 EraseBytes- 00:08:37.590 #70 DONE cov: 11890 ft: 14723 corp: 45/3753b lim: 105 exec/s: 35 rss: 68Mb 00:08:37.590 ###### Recommended dictionary. ###### 00:08:37.590 "\000\000\000\000" # Uses: 4 00:08:37.590 ###### End of recommended dictionary. ###### 00:08:37.590 Done 70 runs in 2 second(s) 00:08:37.849 10:55:36 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:37.849 10:55:36 -- ../common.sh@72 -- # (( i++ )) 00:08:37.849 10:55:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:37.849 10:55:36 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:37.849 10:55:36 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:37.849 10:55:36 -- nvmf/run.sh@24 -- # local timen=1 00:08:37.849 10:55:36 -- nvmf/run.sh@25 -- # local core=0x1 00:08:37.849 10:55:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:37.849 10:55:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:37.849 10:55:36 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:37.849 10:55:36 -- nvmf/run.sh@29 -- # port=4417 00:08:37.849 10:55:36 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:37.849 10:55:36 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:37.849 10:55:36 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:37.849 10:55:36 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:37.849 [2024-12-16 10:55:36.332485] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:37.849 [2024-12-16 10:55:36.332575] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid658525 ] 00:08:37.849 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.109 [2024-12-16 10:55:36.508712] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.109 [2024-12-16 10:55:36.528447] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:38.109 [2024-12-16 10:55:36.528584] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.109 [2024-12-16 10:55:36.580084] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.109 [2024-12-16 10:55:36.596414] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:38.109 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.109 INFO: Seed: 1064276532 00:08:38.109 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:38.109 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:38.109 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:38.109 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.109 #2 INITED exec/s: 0 rss: 59Mb 00:08:38.109 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:38.109 This may also happen if the target rejected all inputs we tried so far 00:08:38.109 [2024-12-16 10:55:36.662206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.109 [2024-12-16 10:55:36.662249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.368 NEW_FUNC[1/672]: 0x472058 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:38.368 NEW_FUNC[2/672]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:38.368 #6 NEW cov: 11684 ft: 11685 corp: 2/36b lim: 120 exec/s: 0 rss: 66Mb L: 35/35 MS: 4 ShuffleBytes-CrossOver-InsertByte-InsertRepeatedBytes- 00:08:38.627 [2024-12-16 10:55:36.993224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.627 [2024-12-16 10:55:36.993282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.627 #13 NEW cov: 11797 ft: 12167 corp: 3/82b lim: 120 exec/s: 0 rss: 66Mb L: 46/46 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:38.627 [2024-12-16 10:55:37.043201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641232745010 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.627 [2024-12-16 10:55:37.043229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.627 #15 NEW cov: 11803 ft: 12432 corp: 4/114b lim: 120 exec/s: 0 rss: 66Mb L: 32/46 MS: 2 ShuffleBytes-CrossOver- 00:08:38.627 [2024-12-16 10:55:37.093379] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.627 [2024-12-16 10:55:37.093407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.627 #16 NEW cov: 11888 ft: 12726 corp: 5/160b lim: 120 exec/s: 0 rss: 66Mb L: 46/46 MS: 1 CMP- DE: "\377\005"- 00:08:38.628 [2024-12-16 10:55:37.143630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.628 [2024-12-16 10:55:37.143660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.628 #17 NEW cov: 11888 ft: 12831 corp: 6/196b lim: 120 exec/s: 0 rss: 66Mb L: 36/46 MS: 1 InsertByte- 00:08:38.628 [2024-12-16 10:55:37.193763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.628 [2024-12-16 10:55:37.193792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.628 #18 NEW cov: 11888 ft: 12897 corp: 7/242b lim: 120 exec/s: 0 rss: 66Mb L: 46/46 MS: 1 CopyPart- 00:08:38.628 [2024-12-16 10:55:37.243980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.628 [2024-12-16 10:55:37.244017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.887 #19 NEW cov: 11888 ft: 12957 corp: 8/278b lim: 120 exec/s: 0 rss: 66Mb L: 36/46 MS: 1 CopyPart- 00:08:38.887 [2024-12-16 10:55:37.294071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.887 [2024-12-16 10:55:37.294097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.887 #20 NEW cov: 11888 ft: 13004 corp: 9/324b lim: 120 exec/s: 0 rss: 66Mb L: 46/46 MS: 1 ShuffleBytes- 00:08:38.887 [2024-12-16 10:55:37.344173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.887 [2024-12-16 10:55:37.344204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.887 #21 NEW cov: 11888 ft: 13040 corp: 10/360b lim: 120 exec/s: 0 rss: 67Mb L: 36/46 MS: 1 ShuffleBytes- 00:08:38.887 [2024-12-16 10:55:37.394413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.887 [2024-12-16 10:55:37.394451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.887 #22 NEW cov: 11888 ft: 13114 corp: 11/395b lim: 120 exec/s: 0 rss: 67Mb L: 35/46 MS: 1 ChangeBinInt- 00:08:38.887 [2024-12-16 10:55:37.444573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.887 
[2024-12-16 10:55:37.444599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.887 #28 NEW cov: 11888 ft: 13140 corp: 12/431b lim: 120 exec/s: 0 rss: 67Mb L: 36/46 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\003"- 00:08:38.887 [2024-12-16 10:55:37.494738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.887 [2024-12-16 10:55:37.494764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.146 #29 NEW cov: 11888 ft: 13162 corp: 13/466b lim: 120 exec/s: 0 rss: 67Mb L: 35/46 MS: 1 ChangeBinInt- 00:08:39.146 [2024-12-16 10:55:37.544892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.146 [2024-12-16 10:55:37.544918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.146 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:39.146 #30 NEW cov: 11911 ft: 13225 corp: 14/512b lim: 120 exec/s: 0 rss: 67Mb L: 46/46 MS: 1 CrossOver- 00:08:39.146 [2024-12-16 10:55:37.595089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.146 [2024-12-16 10:55:37.595118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.146 #31 NEW cov: 11911 ft: 13286 corp: 15/548b lim: 120 exec/s: 0 rss: 67Mb L: 36/46 MS: 1 PersAutoDict- DE: "\377\005"- 00:08:39.146 [2024-12-16 10:55:37.645204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641232745010 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.146 [2024-12-16 10:55:37.645233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.146 #32 NEW cov: 11911 ft: 13293 corp: 16/580b lim: 120 exec/s: 32 rss: 67Mb L: 32/46 MS: 1 ChangeASCIIInt- 00:08:39.146 [2024-12-16 10:55:37.695354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.146 [2024-12-16 10:55:37.695387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.146 #33 NEW cov: 11911 ft: 13300 corp: 17/617b lim: 120 exec/s: 33 rss: 67Mb L: 37/46 MS: 1 InsertByte- 00:08:39.146 [2024-12-16 10:55:37.745626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.146 [2024-12-16 10:55:37.745665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.405 #34 NEW cov: 11911 ft: 13319 corp: 18/664b lim: 120 exec/s: 34 rss: 67Mb L: 47/47 MS: 1 InsertByte- 00:08:39.406 [2024-12-16 10:55:37.795721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:39.406 [2024-12-16 10:55:37.795748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.406 #35 NEW cov: 11911 ft: 13325 corp: 19/700b lim: 120 exec/s: 35 rss: 67Mb L: 36/47 MS: 1 ChangeByte- 00:08:39.406 [2024-12-16 10:55:37.845830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.406 [2024-12-16 10:55:37.845857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.406 #36 NEW cov: 11911 ft: 13329 corp: 20/737b lim: 120 exec/s: 36 rss: 67Mb L: 37/47 MS: 1 ChangeBit- 00:08:39.406 [2024-12-16 10:55:37.896941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.406 [2024-12-16 10:55:37.896973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.406 [2024-12-16 10:55:37.897069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.406 [2024-12-16 10:55:37.897095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.406 [2024-12-16 10:55:37.897220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14612714913291487946 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.406 [2024-12-16 10:55:37.897245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.406 [2024-12-16 10:55:37.897377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14612714913291487946 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.406 [2024-12-16 10:55:37.897399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.406 #37 NEW cov: 11911 ft: 14236 corp: 21/856b lim: 120 exec/s: 37 rss: 67Mb L: 119/119 MS: 1 InsertRepeatedBytes- 00:08:39.406 [2024-12-16 10:55:37.956263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.406 [2024-12-16 10:55:37.956290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.406 #38 NEW cov: 11911 ft: 14344 corp: 22/896b lim: 120 exec/s: 38 rss: 67Mb L: 40/119 MS: 1 CMP- DE: "\365\377\377\377"- 00:08:39.406 [2024-12-16 10:55:38.016504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.406 [2024-12-16 10:55:38.016533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.665 #39 NEW cov: 11911 ft: 14405 corp: 23/932b lim: 120 exec/s: 39 rss: 67Mb L: 36/119 MS: 1 ChangeByte- 00:08:39.665 [2024-12-16 10:55:38.066702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 
lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.665 [2024-12-16 10:55:38.066729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.665 #40 NEW cov: 11911 ft: 14426 corp: 24/968b lim: 120 exec/s: 40 rss: 67Mb L: 36/119 MS: 1 ShuffleBytes- 00:08:39.665 [2024-12-16 10:55:38.116875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.665 [2024-12-16 10:55:38.116903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.665 #41 NEW cov: 11911 ft: 14455 corp: 25/1005b lim: 120 exec/s: 41 rss: 67Mb L: 37/119 MS: 1 PersAutoDict- DE: "\377\005"- 00:08:39.665 [2024-12-16 10:55:38.176996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.665 [2024-12-16 10:55:38.177024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.665 #47 NEW cov: 11911 ft: 14459 corp: 26/1050b lim: 120 exec/s: 47 rss: 67Mb L: 45/119 MS: 1 CMP- DE: "\025\000\000\000\000\000\000\000"- 00:08:39.665 [2024-12-16 10:55:38.227175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.665 [2024-12-16 10:55:38.227203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.665 #48 NEW cov: 11911 ft: 14477 corp: 27/1094b lim: 120 exec/s: 48 rss: 68Mb L: 44/119 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\003"- 00:08:39.665 [2024-12-16 10:55:38.287471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641232745010 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.665 [2024-12-16 10:55:38.287500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.924 #49 NEW cov: 11911 ft: 14517 corp: 28/1126b lim: 120 exec/s: 49 rss: 68Mb L: 32/119 MS: 1 ShuffleBytes- 00:08:39.924 [2024-12-16 10:55:38.347566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.924 [2024-12-16 10:55:38.347594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.924 #50 NEW cov: 11911 ft: 14528 corp: 29/1171b lim: 120 exec/s: 50 rss: 68Mb L: 45/119 MS: 1 ShuffleBytes- 00:08:39.924 [2024-12-16 10:55:38.407642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.924 [2024-12-16 10:55:38.407691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.924 #51 NEW cov: 11911 ft: 14538 corp: 30/1209b lim: 120 exec/s: 51 rss: 68Mb L: 38/119 MS: 1 PersAutoDict- DE: "\377\005"- 00:08:39.924 [2024-12-16 10:55:38.457916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 
lba:18446744070438652671 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.925 [2024-12-16 10:55:38.457949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.925 #52 NEW cov: 11911 ft: 14552 corp: 31/1245b lim: 120 exec/s: 52 rss: 68Mb L: 36/119 MS: 1 ChangeBinInt- 00:08:39.925 [2024-12-16 10:55:38.508047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.925 [2024-12-16 10:55:38.508077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.925 #53 NEW cov: 11911 ft: 14556 corp: 32/1292b lim: 120 exec/s: 53 rss: 68Mb L: 47/119 MS: 1 ChangeBit- 00:08:40.183 [2024-12-16 10:55:38.568226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.183 [2024-12-16 10:55:38.568259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.183 #54 NEW cov: 11911 ft: 14571 corp: 33/1339b lim: 120 exec/s: 54 rss: 68Mb L: 47/119 MS: 1 ChangeByte- 00:08:40.183 [2024-12-16 10:55:38.618454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.183 [2024-12-16 10:55:38.618483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.183 [2024-12-16 10:55:38.668606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3617008641668952626 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.183 [2024-12-16 10:55:38.668642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.183 #56 NEW cov: 11911 ft: 14608 corp: 34/1385b lim: 120 exec/s: 28 rss: 68Mb L: 46/119 MS: 2 CMP-CopyPart- DE: "\001\005S1*+\021\220"- 00:08:40.183 #56 DONE cov: 11911 ft: 14608 corp: 34/1385b lim: 120 exec/s: 28 rss: 68Mb 00:08:40.183 ###### Recommended dictionary. ###### 00:08:40.183 "\377\005" # Uses: 4 00:08:40.183 "\001\000\000\000\000\000\000\003" # Uses: 1 00:08:40.183 "\365\377\377\377" # Uses: 0 00:08:40.183 "\025\000\000\000\000\000\000\000" # Uses: 0 00:08:40.183 "\001\005S1*+\021\220" # Uses: 0 00:08:40.183 ###### End of recommended dictionary. 
###### 00:08:40.183 Done 56 runs in 2 second(s) 00:08:40.183 10:55:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:40.183 10:55:38 -- ../common.sh@72 -- # (( i++ )) 00:08:40.183 10:55:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.183 10:55:38 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:40.183 10:55:38 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:40.183 10:55:38 -- nvmf/run.sh@24 -- # local timen=1 00:08:40.183 10:55:38 -- nvmf/run.sh@25 -- # local core=0x1 00:08:40.183 10:55:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:40.183 10:55:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:40.183 10:55:38 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:40.183 10:55:38 -- nvmf/run.sh@29 -- # port=4418 00:08:40.183 10:55:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:40.442 10:55:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:40.442 10:55:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:40.442 10:55:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:40.442 [2024-12-16 10:55:38.842012] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:40.442 [2024-12-16 10:55:38.842078] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid659062 ] 00:08:40.442 EAL: No free 2048 kB hugepages reported on node 1 00:08:40.442 [2024-12-16 10:55:39.016239] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.442 [2024-12-16 10:55:39.035401] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:40.442 [2024-12-16 10:55:39.035527] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.702 [2024-12-16 10:55:39.086757] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:40.702 [2024-12-16 10:55:39.103047] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:40.702 INFO: Running with entropic power schedule (0xFF, 100). 00:08:40.702 INFO: Seed: 3572278378 00:08:40.702 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:40.702 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:40.702 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:40.702 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.702 #2 INITED exec/s: 0 rss: 59Mb 00:08:40.702 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:40.702 This may also happen if the target rejected all inputs we tried so far 00:08:40.702 [2024-12-16 10:55:39.148116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.702 [2024-12-16 10:55:39.148146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.962 NEW_FUNC[1/670]: 0x4758b8 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:40.962 NEW_FUNC[2/670]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.962 #6 NEW cov: 11628 ft: 11627 corp: 2/35b lim: 100 exec/s: 0 rss: 66Mb L: 34/34 MS: 4 InsertByte-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:40.962 [2024-12-16 10:55:39.480015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.962 [2024-12-16 10:55:39.480056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.480180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:40.962 [2024-12-16 10:55:39.480206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.480322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:40.962 [2024-12-16 10:55:39.480348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.480466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:40.962 [2024-12-16 10:55:39.480488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.962 #7 NEW cov: 11741 ft: 12851 corp: 3/118b lim: 100 exec/s: 0 rss: 66Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:40.962 [2024-12-16 10:55:39.530111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.962 [2024-12-16 10:55:39.530143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.530264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:40.962 [2024-12-16 10:55:39.530285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.530391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:40.962 [2024-12-16 10:55:39.530412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.530529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:40.962 [2024-12-16 10:55:39.530553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.962 #8 NEW cov: 11747 ft: 
13105 corp: 4/201b lim: 100 exec/s: 0 rss: 66Mb L: 83/83 MS: 1 ChangeByte- 00:08:40.962 [2024-12-16 10:55:39.580278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.962 [2024-12-16 10:55:39.580311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.580402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:40.962 [2024-12-16 10:55:39.580426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.580535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:40.962 [2024-12-16 10:55:39.580555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.962 [2024-12-16 10:55:39.580681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:40.962 [2024-12-16 10:55:39.580702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.222 #9 NEW cov: 11832 ft: 13339 corp: 5/285b lim: 100 exec/s: 0 rss: 66Mb L: 84/84 MS: 1 InsertByte- 00:08:41.222 [2024-12-16 10:55:39.630459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.222 [2024-12-16 10:55:39.630490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.630596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.222 [2024-12-16 10:55:39.630618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.630735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.222 [2024-12-16 10:55:39.630759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.630865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.222 [2024-12-16 10:55:39.630885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.222 #10 NEW cov: 11832 ft: 13398 corp: 6/368b lim: 100 exec/s: 0 rss: 66Mb L: 83/84 MS: 1 CopyPart- 00:08:41.222 [2024-12-16 10:55:39.670524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.222 [2024-12-16 10:55:39.670554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.670635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.222 [2024-12-16 10:55:39.670658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.670780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:2 nsid:0 00:08:41.222 [2024-12-16 10:55:39.670801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.670919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.222 [2024-12-16 10:55:39.670938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.222 #11 NEW cov: 11832 ft: 13472 corp: 7/452b lim: 100 exec/s: 0 rss: 66Mb L: 84/84 MS: 1 ShuffleBytes- 00:08:41.222 [2024-12-16 10:55:39.720666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.222 [2024-12-16 10:55:39.720696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.720792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.222 [2024-12-16 10:55:39.720814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.720936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.222 [2024-12-16 10:55:39.720960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.721076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.222 [2024-12-16 10:55:39.721093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.222 #12 NEW cov: 11832 ft: 13507 corp: 8/537b lim: 100 exec/s: 0 rss: 66Mb L: 85/85 MS: 1 CrossOver- 00:08:41.222 [2024-12-16 10:55:39.760796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.222 [2024-12-16 10:55:39.760830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.760928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.222 [2024-12-16 10:55:39.760949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.761064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.222 [2024-12-16 10:55:39.761085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.761200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.222 [2024-12-16 10:55:39.761221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.222 #13 NEW cov: 11832 ft: 13540 corp: 9/620b lim: 100 exec/s: 0 rss: 66Mb L: 83/85 MS: 1 ShuffleBytes- 00:08:41.222 [2024-12-16 10:55:39.800495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.222 [2024-12-16 10:55:39.800529] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.800624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.222 [2024-12-16 10:55:39.800644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.800775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.222 [2024-12-16 10:55:39.800796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.222 [2024-12-16 10:55:39.800913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.222 [2024-12-16 10:55:39.800932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.222 #14 NEW cov: 11832 ft: 13652 corp: 10/713b lim: 100 exec/s: 0 rss: 66Mb L: 93/93 MS: 1 CMP- DE: "\336\247\012\002\000\000\000\000"- 00:08:41.222 [2024-12-16 10:55:39.840460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.222 [2024-12-16 10:55:39.840491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.483 #15 NEW cov: 11832 ft: 13702 corp: 11/747b lim: 100 exec/s: 0 rss: 66Mb L: 34/93 MS: 1 ChangeBinInt- 00:08:41.483 [2024-12-16 10:55:39.881150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.483 [2024-12-16 10:55:39.881180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.881296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.483 [2024-12-16 10:55:39.881316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.881424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.483 [2024-12-16 10:55:39.881445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.881561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.483 [2024-12-16 10:55:39.881580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.483 #16 NEW cov: 11832 ft: 13746 corp: 12/830b lim: 100 exec/s: 0 rss: 66Mb L: 83/93 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:41.483 [2024-12-16 10:55:39.931308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.483 [2024-12-16 10:55:39.931337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.931448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.483 [2024-12-16 10:55:39.931469] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.931580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.483 [2024-12-16 10:55:39.931602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.931725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.483 [2024-12-16 10:55:39.931745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.483 #17 NEW cov: 11832 ft: 13825 corp: 13/915b lim: 100 exec/s: 0 rss: 66Mb L: 85/93 MS: 1 InsertByte- 00:08:41.483 [2024-12-16 10:55:39.971410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.483 [2024-12-16 10:55:39.971443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.971529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.483 [2024-12-16 10:55:39.971554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.971676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.483 [2024-12-16 10:55:39.971699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:39.971819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.483 [2024-12-16 10:55:39.971841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.483 #23 NEW cov: 11832 ft: 13846 corp: 14/999b lim: 100 exec/s: 0 rss: 66Mb L: 84/93 MS: 1 InsertByte- 00:08:41.483 [2024-12-16 10:55:40.010699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.483 [2024-12-16 10:55:40.010729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:40.010836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.483 [2024-12-16 10:55:40.010860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.483 #24 NEW cov: 11832 ft: 14136 corp: 15/1043b lim: 100 exec/s: 0 rss: 66Mb L: 44/93 MS: 1 EraseBytes- 00:08:41.483 [2024-12-16 10:55:40.051542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.483 [2024-12-16 10:55:40.051574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:40.051655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.483 [2024-12-16 10:55:40.051676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:40.051789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.483 [2024-12-16 10:55:40.051811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:40.051922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.483 [2024-12-16 10:55:40.051942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.483 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:41.483 #25 NEW cov: 11855 ft: 14185 corp: 16/1128b lim: 100 exec/s: 0 rss: 67Mb L: 85/93 MS: 1 ChangeByte- 00:08:41.483 [2024-12-16 10:55:40.101764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.483 [2024-12-16 10:55:40.101800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:40.101881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.483 [2024-12-16 10:55:40.101901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:40.102016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.483 [2024-12-16 10:55:40.102039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.483 [2024-12-16 10:55:40.102159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.483 [2024-12-16 10:55:40.102179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.742 #26 NEW cov: 11855 ft: 14219 corp: 17/1211b lim: 100 exec/s: 0 rss: 67Mb L: 83/93 MS: 1 ChangeByte- 00:08:41.742 [2024-12-16 10:55:40.141711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.742 [2024-12-16 10:55:40.141742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.742 [2024-12-16 10:55:40.141807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.742 [2024-12-16 10:55:40.141827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.742 [2024-12-16 10:55:40.141936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.742 [2024-12-16 10:55:40.141956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.742 #27 NEW cov: 11855 ft: 14491 corp: 18/1276b lim: 100 exec/s: 27 rss: 67Mb L: 65/93 MS: 1 EraseBytes- 00:08:41.742 [2024-12-16 10:55:40.182026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.742 [2024-12-16 10:55:40.182054] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.742 [2024-12-16 10:55:40.182126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.742 [2024-12-16 10:55:40.182148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.742 [2024-12-16 10:55:40.182262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.742 [2024-12-16 10:55:40.182280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.742 [2024-12-16 10:55:40.182396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.742 [2024-12-16 10:55:40.182412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.742 #28 NEW cov: 11855 ft: 14509 corp: 19/1359b lim: 100 exec/s: 28 rss: 67Mb L: 83/93 MS: 1 ChangeASCIIInt- 00:08:41.742 [2024-12-16 10:55:40.222169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.742 [2024-12-16 10:55:40.222200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.742 [2024-12-16 10:55:40.222301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.742 [2024-12-16 10:55:40.222325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.742 [2024-12-16 10:55:40.222443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.742 [2024-12-16 10:55:40.222459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.742 [2024-12-16 10:55:40.222573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.742 [2024-12-16 10:55:40.222593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.742 #29 NEW cov: 11855 ft: 14533 corp: 20/1452b lim: 100 exec/s: 29 rss: 67Mb L: 93/93 MS: 1 CMP- DE: "*u\033\0332S\005\000"- 00:08:41.742 [2024-12-16 10:55:40.262281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.743 [2024-12-16 10:55:40.262308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.743 [2024-12-16 10:55:40.262388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.743 [2024-12-16 10:55:40.262409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.743 [2024-12-16 10:55:40.262523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.743 [2024-12-16 10:55:40.262543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.743 [2024-12-16 
10:55:40.262668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.743 [2024-12-16 10:55:40.262685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.743 #30 NEW cov: 11855 ft: 14552 corp: 21/1540b lim: 100 exec/s: 30 rss: 67Mb L: 88/93 MS: 1 InsertRepeatedBytes- 00:08:41.743 [2024-12-16 10:55:40.302034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.743 [2024-12-16 10:55:40.302069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.743 [2024-12-16 10:55:40.302152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.743 [2024-12-16 10:55:40.302172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.743 [2024-12-16 10:55:40.302289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.743 [2024-12-16 10:55:40.302311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.743 [2024-12-16 10:55:40.302430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.743 [2024-12-16 10:55:40.302453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.743 #31 NEW cov: 11855 ft: 14567 corp: 22/1623b lim: 100 exec/s: 31 rss: 67Mb L: 83/93 MS: 1 ChangeBit- 00:08:41.743 [2024-12-16 10:55:40.352225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.743 [2024-12-16 10:55:40.352256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.743 [2024-12-16 10:55:40.352375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.743 [2024-12-16 10:55:40.352396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 #32 NEW cov: 11855 ft: 14590 corp: 23/1675b lim: 100 exec/s: 32 rss: 67Mb L: 52/93 MS: 1 EraseBytes- 00:08:42.002 [2024-12-16 10:55:40.402875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.002 [2024-12-16 10:55:40.402906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.002 [2024-12-16 10:55:40.403019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.002 [2024-12-16 10:55:40.403038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 [2024-12-16 10:55:40.403157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.002 [2024-12-16 10:55:40.403181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.002 [2024-12-16 10:55:40.403293] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.002 [2024-12-16 10:55:40.403310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.002 #33 NEW cov: 11855 ft: 14621 corp: 24/1769b lim: 100 exec/s: 33 rss: 67Mb L: 94/94 MS: 1 InsertByte- 00:08:42.002 [2024-12-16 10:55:40.442501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.002 [2024-12-16 10:55:40.442532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.002 [2024-12-16 10:55:40.442629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.002 [2024-12-16 10:55:40.442666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 #34 NEW cov: 11855 ft: 14648 corp: 25/1821b lim: 100 exec/s: 34 rss: 67Mb L: 52/94 MS: 1 ChangeBinInt- 00:08:42.002 [2024-12-16 10:55:40.483034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.002 [2024-12-16 10:55:40.483067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.002 [2024-12-16 10:55:40.483144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.002 [2024-12-16 10:55:40.483167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.002 [2024-12-16 10:55:40.483288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.002 [2024-12-16 10:55:40.483311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.483427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.003 [2024-12-16 10:55:40.483450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.003 #35 NEW cov: 11855 ft: 14704 corp: 26/1910b lim: 100 exec/s: 35 rss: 67Mb L: 89/94 MS: 1 CopyPart- 00:08:42.003 [2024-12-16 10:55:40.522981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.003 [2024-12-16 10:55:40.523010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.523107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.003 [2024-12-16 10:55:40.523128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.523245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.003 [2024-12-16 10:55:40.523266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.523382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.003 [2024-12-16 
10:55:40.523404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.003 #36 NEW cov: 11855 ft: 14717 corp: 27/1993b lim: 100 exec/s: 36 rss: 67Mb L: 83/94 MS: 1 CopyPart- 00:08:42.003 [2024-12-16 10:55:40.563225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.003 [2024-12-16 10:55:40.563256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.563343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.003 [2024-12-16 10:55:40.563361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.563479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.003 [2024-12-16 10:55:40.563499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.563614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.003 [2024-12-16 10:55:40.563638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.003 #37 NEW cov: 11855 ft: 14759 corp: 28/2076b lim: 100 exec/s: 37 rss: 67Mb L: 83/94 MS: 1 CopyPart- 00:08:42.003 [2024-12-16 10:55:40.613144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.003 [2024-12-16 10:55:40.613175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.613297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.003 [2024-12-16 10:55:40.613326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.003 [2024-12-16 10:55:40.613447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.003 [2024-12-16 10:55:40.613467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.262 #38 NEW cov: 11855 ft: 14777 corp: 29/2139b lim: 100 exec/s: 38 rss: 67Mb L: 63/94 MS: 1 EraseBytes- 00:08:42.262 [2024-12-16 10:55:40.662881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.262 [2024-12-16 10:55:40.662911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.262 #39 NEW cov: 11855 ft: 14781 corp: 30/2168b lim: 100 exec/s: 39 rss: 67Mb L: 29/94 MS: 1 EraseBytes- 00:08:42.262 [2024-12-16 10:55:40.703064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.262 [2024-12-16 10:55:40.703102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.262 #40 NEW cov: 11855 ft: 14795 corp: 31/2205b lim: 100 exec/s: 40 rss: 67Mb L: 37/94 MS: 1 CrossOver- 
00:08:42.262 [2024-12-16 10:55:40.743421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.262 [2024-12-16 10:55:40.743452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.262 [2024-12-16 10:55:40.743562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.262 [2024-12-16 10:55:40.743583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.262 [2024-12-16 10:55:40.743708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.262 [2024-12-16 10:55:40.743728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.262 #41 NEW cov: 11855 ft: 14814 corp: 32/2271b lim: 100 exec/s: 41 rss: 67Mb L: 66/94 MS: 1 EraseBytes- 00:08:42.262 [2024-12-16 10:55:40.793665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.262 [2024-12-16 10:55:40.793702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.263 [2024-12-16 10:55:40.793770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.263 [2024-12-16 10:55:40.793793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.263 [2024-12-16 10:55:40.793911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.263 [2024-12-16 10:55:40.793933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.263 #42 NEW cov: 11855 ft: 14817 corp: 33/2335b lim: 100 exec/s: 42 rss: 67Mb L: 64/94 MS: 1 InsertByte- 00:08:42.263 [2024-12-16 10:55:40.833959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.263 [2024-12-16 10:55:40.833987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.263 [2024-12-16 10:55:40.834074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.263 [2024-12-16 10:55:40.834095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.263 [2024-12-16 10:55:40.834205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.263 [2024-12-16 10:55:40.834226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.263 [2024-12-16 10:55:40.834342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.263 [2024-12-16 10:55:40.834363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.263 #43 NEW cov: 11855 ft: 14821 corp: 34/2420b lim: 100 exec/s: 43 rss: 67Mb L: 85/94 MS: 1 ChangeBit- 00:08:42.263 [2024-12-16 10:55:40.874048] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.263 [2024-12-16 10:55:40.874078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.263 [2024-12-16 10:55:40.874156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.263 [2024-12-16 10:55:40.874177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.263 [2024-12-16 10:55:40.874287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.263 [2024-12-16 10:55:40.874304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.263 [2024-12-16 10:55:40.874423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.263 [2024-12-16 10:55:40.874443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.522 #44 NEW cov: 11855 ft: 14825 corp: 35/2504b lim: 100 exec/s: 44 rss: 68Mb L: 84/94 MS: 1 ChangeBinInt- 00:08:42.522 [2024-12-16 10:55:40.914174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.522 [2024-12-16 10:55:40.914202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.522 [2024-12-16 10:55:40.914278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.522 [2024-12-16 10:55:40.914297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.522 [2024-12-16 10:55:40.914416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.523 [2024-12-16 10:55:40.914433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:40.914556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.523 [2024-12-16 10:55:40.914578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.523 #45 NEW cov: 11855 ft: 14827 corp: 36/2595b lim: 100 exec/s: 45 rss: 68Mb L: 91/94 MS: 1 PersAutoDict- DE: "*u\033\0332S\005\000"- 00:08:42.523 [2024-12-16 10:55:40.954343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.523 [2024-12-16 10:55:40.954374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:40.954465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.523 [2024-12-16 10:55:40.954488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:40.954603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.523 [2024-12-16 10:55:40.954625] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:40.954745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.523 [2024-12-16 10:55:40.954769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.523 #46 NEW cov: 11855 ft: 14844 corp: 37/2679b lim: 100 exec/s: 46 rss: 68Mb L: 84/94 MS: 1 CopyPart- 00:08:42.523 [2024-12-16 10:55:40.994431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.523 [2024-12-16 10:55:40.994459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:40.994559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.523 [2024-12-16 10:55:40.994580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:40.994691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.523 [2024-12-16 10:55:40.994712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:40.994829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.523 [2024-12-16 10:55:40.994850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.523 #47 NEW cov: 11855 ft: 14878 corp: 38/2763b lim: 100 exec/s: 47 rss: 68Mb L: 84/94 MS: 1 CrossOver- 00:08:42.523 [2024-12-16 10:55:41.034716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.523 [2024-12-16 10:55:41.034749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.034847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.523 [2024-12-16 10:55:41.034868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.034984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.523 [2024-12-16 10:55:41.035006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.035124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.523 [2024-12-16 10:55:41.035142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.523 #48 NEW cov: 11855 ft: 14885 corp: 39/2847b lim: 100 exec/s: 48 rss: 68Mb L: 84/94 MS: 1 ChangeBinInt- 00:08:42.523 [2024-12-16 10:55:41.074772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.523 [2024-12-16 10:55:41.074802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.074877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.523 [2024-12-16 10:55:41.074899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.075010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.523 [2024-12-16 10:55:41.075033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.075148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.523 [2024-12-16 10:55:41.075173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.523 #49 NEW cov: 11855 ft: 14890 corp: 40/2931b lim: 100 exec/s: 49 rss: 68Mb L: 84/94 MS: 1 ChangeBinInt- 00:08:42.523 [2024-12-16 10:55:41.114944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.523 [2024-12-16 10:55:41.114977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.115057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.523 [2024-12-16 10:55:41.115082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.115190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.523 [2024-12-16 10:55:41.115212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.523 [2024-12-16 10:55:41.115331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.523 [2024-12-16 10:55:41.115353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.523 #50 NEW cov: 11855 ft: 14896 corp: 41/3019b lim: 100 exec/s: 25 rss: 68Mb L: 88/94 MS: 1 ChangeBinInt- 00:08:42.523 #50 DONE cov: 11855 ft: 14896 corp: 41/3019b lim: 100 exec/s: 25 rss: 68Mb 00:08:42.523 ###### Recommended dictionary. ###### 00:08:42.523 "\336\247\012\002\000\000\000\000" # Uses: 0 00:08:42.523 "\002\000\000\000\000\000\000\000" # Uses: 0 00:08:42.523 "*u\033\0332S\005\000" # Uses: 1 00:08:42.523 ###### End of recommended dictionary. 
###### 00:08:42.523 Done 50 runs in 2 second(s) 00:08:42.783 10:55:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:42.783 10:55:41 -- ../common.sh@72 -- # (( i++ )) 00:08:42.783 10:55:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.783 10:55:41 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:42.783 10:55:41 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:42.783 10:55:41 -- nvmf/run.sh@24 -- # local timen=1 00:08:42.783 10:55:41 -- nvmf/run.sh@25 -- # local core=0x1 00:08:42.783 10:55:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:42.783 10:55:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:42.783 10:55:41 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:42.783 10:55:41 -- nvmf/run.sh@29 -- # port=4419 00:08:42.783 10:55:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:42.783 10:55:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:42.783 10:55:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:42.783 10:55:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:42.783 [2024-12-16 10:55:41.298581] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:42.783 [2024-12-16 10:55:41.298679] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid659442 ] 00:08:42.783 EAL: No free 2048 kB hugepages reported on node 1 00:08:43.042 [2024-12-16 10:55:41.480000] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.042 [2024-12-16 10:55:41.499836] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:43.042 [2024-12-16 10:55:41.499973] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.042 [2024-12-16 10:55:41.551591] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:43.042 [2024-12-16 10:55:41.567909] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:43.042 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.042 INFO: Seed: 1740301656 00:08:43.042 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:43.042 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:43.042 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:43.042 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.042 #2 INITED exec/s: 0 rss: 59Mb 00:08:43.042 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
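The run.sh trace above shows how each fuzzer instance gets a private TCP listener: the instance index is zero-padded (printf %02d 19), the result becomes part of the trsvcid, and sed rewrites the JSON config template before llvm_nvme_fuzz is launched. A minimal bash sketch of that derivation follows; the "44" prefix is inferred from the printf/port pair in the trace, and the one-line template is a stand-in for the real test/fuzz/llvm/nvmf/fuzz_json.conf, not the actual SPDK file:

#!/usr/bin/env bash
# Sketch only: reproduces the port/config derivation seen in the trace above.
# Assumption: the port is "44" + the zero-padded fuzzer index (19 -> 4419).
set -euo pipefail
fuzzer_type=19
port="44$(printf %02d "$fuzzer_type")"
echo "port=$port"                                    # port=4419
template=/tmp/fuzz_json_template.conf                # stand-in, not the SPDK file
printf '{ "trsvcid": "4420" }\n' > "$template"
# Same substitution run.sh applies before launching llvm_nvme_fuzz:
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$template" > "/tmp/fuzz_json_${fuzzer_type}.conf"
cat "/tmp/fuzz_json_${fuzzer_type}.conf"             # { "trsvcid": "4419" }

For instance 20 the pattern and replacement coincide (4420 -> 4420), which is why the sed command in the next run's trace below looks like a no-op.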
00:08:43.042 This may also happen if the target rejected all inputs we tried so far 00:08:43.042 [2024-12-16 10:55:41.637629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:43.042 [2024-12-16 10:55:41.637667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.042 [2024-12-16 10:55:41.637786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:43.042 [2024-12-16 10:55:41.637808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.302 NEW_FUNC[1/670]: 0x478878 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:43.302 NEW_FUNC[2/670]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.302 #31 NEW cov: 11606 ft: 11605 corp: 2/26b lim: 50 exec/s: 0 rss: 66Mb L: 25/25 MS: 4 CopyPart-CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:43.561 [2024-12-16 10:55:41.937811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144620962624171501 len:60910 00:08:43.561 [2024-12-16 10:55:41.937854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.561 #37 NEW cov: 11719 ft: 12500 corp: 3/37b lim: 50 exec/s: 0 rss: 66Mb L: 11/25 MS: 1 InsertRepeatedBytes- 00:08:43.561 [2024-12-16 10:55:41.978216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:43.561 [2024-12-16 10:55:41.978243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.561 #38 NEW cov: 11725 ft: 12683 corp: 4/55b lim: 50 exec/s: 0 rss: 66Mb L: 18/25 MS: 1 EraseBytes- 00:08:43.561 [2024-12-16 10:55:42.018326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:43.561 [2024-12-16 10:55:42.018351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.561 #39 NEW cov: 11810 ft: 13043 corp: 5/73b lim: 50 exec/s: 0 rss: 66Mb L: 18/25 MS: 1 CopyPart- 00:08:43.561 [2024-12-16 10:55:42.068672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1563361280 len:1 00:08:43.561 [2024-12-16 10:55:42.068703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.561 [2024-12-16 10:55:42.068794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:43.562 [2024-12-16 10:55:42.068815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.562 #48 NEW cov: 11810 ft: 13156 corp: 6/100b lim: 50 exec/s: 0 rss: 66Mb L: 27/27 MS: 4 InsertByte-ShuffleBytes-ChangeByte-CrossOver- 00:08:43.562 [2024-12-16 10:55:42.108692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144620962624171501 len:13038 
00:08:43.562 [2024-12-16 10:55:42.108728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.562 #49 NEW cov: 11810 ft: 13261 corp: 7/112b lim: 50 exec/s: 0 rss: 66Mb L: 12/27 MS: 1 InsertByte- 00:08:43.562 [2024-12-16 10:55:42.148746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144620962624171501 len:13294 00:08:43.562 [2024-12-16 10:55:42.148777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.562 #50 NEW cov: 11810 ft: 13319 corp: 8/124b lim: 50 exec/s: 0 rss: 66Mb L: 12/27 MS: 1 ChangeASCIIInt- 00:08:43.821 [2024-12-16 10:55:42.188924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047972020224 len:1 00:08:43.821 [2024-12-16 10:55:42.188950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 #51 NEW cov: 11810 ft: 13430 corp: 9/142b lim: 50 exec/s: 0 rss: 66Mb L: 18/27 MS: 1 ChangeByte- 00:08:43.821 [2024-12-16 10:55:42.229436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1563361280 len:1 00:08:43.821 [2024-12-16 10:55:42.229465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 [2024-12-16 10:55:42.229570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:43.821 [2024-12-16 10:55:42.229591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.821 [2024-12-16 10:55:42.229705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2825744883384320 len:1 00:08:43.821 [2024-12-16 10:55:42.229728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.821 [2024-12-16 10:55:42.229841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:43.821 [2024-12-16 10:55:42.229861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.821 #52 NEW cov: 11810 ft: 13788 corp: 10/187b lim: 50 exec/s: 0 rss: 67Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:43.821 [2024-12-16 10:55:42.269153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:43.821 [2024-12-16 10:55:42.269178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 #53 NEW cov: 11810 ft: 13813 corp: 11/204b lim: 50 exec/s: 0 rss: 67Mb L: 17/45 MS: 1 EraseBytes- 00:08:43.821 [2024-12-16 10:55:42.309547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:43.821 [2024-12-16 10:55:42.309578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 [2024-12-16 10:55:42.309698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:1 nsid:0 lba:72057589742960640 len:65536 00:08:43.821 [2024-12-16 10:55:42.309722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.821 [2024-12-16 10:55:42.309836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:43.821 [2024-12-16 10:55:42.309856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.821 #54 NEW cov: 11810 ft: 14061 corp: 12/243b lim: 50 exec/s: 0 rss: 67Mb L: 39/45 MS: 1 InsertRepeatedBytes- 00:08:43.821 [2024-12-16 10:55:42.349422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:43.821 [2024-12-16 10:55:42.349447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 #55 NEW cov: 11810 ft: 14081 corp: 13/261b lim: 50 exec/s: 0 rss: 67Mb L: 18/45 MS: 1 ChangeByte- 00:08:43.821 [2024-12-16 10:55:42.389517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1241513984 len:1 00:08:43.821 [2024-12-16 10:55:42.389546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.821 #57 NEW cov: 11810 ft: 14123 corp: 14/275b lim: 50 exec/s: 0 rss: 67Mb L: 14/45 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:43.821 [2024-12-16 10:55:42.429710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047972020224 len:1 00:08:43.821 [2024-12-16 10:55:42.429735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 #58 NEW cov: 11810 ft: 14151 corp: 15/292b lim: 50 exec/s: 0 rss: 67Mb L: 17/45 MS: 1 EraseBytes- 00:08:44.081 [2024-12-16 10:55:42.469882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5280832619941850605 len:18762 00:08:44.081 [2024-12-16 10:55:42.469911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 [2024-12-16 10:55:42.470035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5280832617179597129 len:18762 00:08:44.081 [2024-12-16 10:55:42.470058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.081 #63 NEW cov: 11810 ft: 14229 corp: 16/316b lim: 50 exec/s: 0 rss: 67Mb L: 24/45 MS: 5 EraseBytes-ChangeBinInt-CopyPart-EraseBytes-InsertRepeatedBytes- 00:08:44.081 [2024-12-16 10:55:42.510005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5280832619941588461 len:18762 00:08:44.081 [2024-12-16 10:55:42.510038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 [2024-12-16 10:55:42.510151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5280832617179597129 len:18762 00:08:44.081 [2024-12-16 10:55:42.510172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.081 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:44.081 #64 NEW cov: 11833 ft: 14264 corp: 17/340b lim: 50 exec/s: 0 rss: 67Mb L: 24/45 MS: 1 ChangeBit- 00:08:44.081 [2024-12-16 10:55:42.550230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1563361280 len:1 00:08:44.081 [2024-12-16 10:55:42.550260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 [2024-12-16 10:55:42.550384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:11 00:08:44.081 [2024-12-16 10:55:42.550406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.081 #65 NEW cov: 11833 ft: 14305 corp: 18/361b lim: 50 exec/s: 0 rss: 67Mb L: 21/45 MS: 1 EraseBytes- 00:08:44.081 [2024-12-16 10:55:42.589990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144620962624171501 len:13294 00:08:44.081 [2024-12-16 10:55:42.590029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 #66 NEW cov: 11833 ft: 14317 corp: 19/374b lim: 50 exec/s: 66 rss: 67Mb L: 13/45 MS: 1 InsertByte- 00:08:44.081 [2024-12-16 10:55:42.630458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1560281088 len:1 00:08:44.081 [2024-12-16 10:55:42.630489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 [2024-12-16 10:55:42.630602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:44.081 [2024-12-16 10:55:42.630633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.081 #67 NEW cov: 11833 ft: 14377 corp: 20/396b lim: 50 exec/s: 67 rss: 67Mb L: 22/45 MS: 1 EraseBytes- 00:08:44.081 [2024-12-16 10:55:42.670491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:103 00:08:44.081 [2024-12-16 10:55:42.670518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.081 #73 NEW cov: 11833 ft: 14380 corp: 21/414b lim: 50 exec/s: 73 rss: 67Mb L: 18/45 MS: 1 CMP- DE: "f\376V\2063S\005\000"- 00:08:44.341 [2024-12-16 10:55:42.710596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:44.341 [2024-12-16 10:55:42.710627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 #74 NEW cov: 11833 ft: 14394 corp: 22/429b lim: 50 exec/s: 74 rss: 67Mb L: 15/45 MS: 1 EraseBytes- 00:08:44.341 [2024-12-16 10:55:42.750758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5280832619941850605 len:18762 00:08:44.341 [2024-12-16 10:55:42.750788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:44.341 [2024-12-16 10:55:42.750912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18326983288461805926 len:21254 00:08:44.341 [2024-12-16 10:55:42.750934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.341 #75 NEW cov: 11833 ft: 14405 corp: 23/453b lim: 50 exec/s: 75 rss: 67Mb L: 24/45 MS: 1 PersAutoDict- DE: "f\376V\2063S\005\000"- 00:08:44.341 [2024-12-16 10:55:42.791273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:44.341 [2024-12-16 10:55:42.791303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 [2024-12-16 10:55:42.791427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:11 00:08:44.341 [2024-12-16 10:55:42.791453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.341 [2024-12-16 10:55:42.791574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:167772160 len:1 00:08:44.341 [2024-12-16 10:55:42.791592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.341 [2024-12-16 10:55:42.791719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:44.341 [2024-12-16 10:55:42.791742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.341 #76 NEW cov: 11833 ft: 14415 corp: 24/499b lim: 50 exec/s: 76 rss: 67Mb L: 46/46 MS: 1 CrossOver- 00:08:44.341 [2024-12-16 10:55:42.830888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:44.341 [2024-12-16 10:55:42.830916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 #77 NEW cov: 11833 ft: 14419 corp: 25/516b lim: 50 exec/s: 77 rss: 67Mb L: 17/46 MS: 1 ShuffleBytes- 00:08:44.341 [2024-12-16 10:55:42.881025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144620962624171501 len:13096 00:08:44.341 [2024-12-16 10:55:42.881051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 #78 NEW cov: 11833 ft: 14435 corp: 26/528b lim: 50 exec/s: 78 rss: 67Mb L: 12/46 MS: 1 ChangeByte- 00:08:44.341 [2024-12-16 10:55:42.921217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:103 00:08:44.341 [2024-12-16 10:55:42.921251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.341 #79 NEW cov: 11833 ft: 14449 corp: 27/546b lim: 50 exec/s: 79 rss: 67Mb L: 18/46 MS: 1 ShuffleBytes- 00:08:44.601 [2024-12-16 10:55:42.971376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:44.601 [2024-12-16 10:55:42.971403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 #80 NEW cov: 11833 ft: 14459 corp: 28/564b lim: 50 exec/s: 80 rss: 67Mb L: 18/46 MS: 1 ChangeBinInt- 00:08:44.601 [2024-12-16 10:55:43.011668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17077649790980713965 len:1 00:08:44.601 [2024-12-16 10:55:43.011703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 [2024-12-16 10:55:43.011818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17077649790980713778 len:1 00:08:44.601 [2024-12-16 10:55:43.011842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.601 #81 NEW cov: 11833 ft: 14473 corp: 29/590b lim: 50 exec/s: 81 rss: 67Mb L: 26/46 MS: 1 CrossOver- 00:08:44.601 [2024-12-16 10:55:43.051572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144620962087300589 len:13038 00:08:44.601 [2024-12-16 10:55:43.051601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 #82 NEW cov: 11833 ft: 14491 corp: 30/602b lim: 50 exec/s: 82 rss: 67Mb L: 12/46 MS: 1 ChangeBit- 00:08:44.601 [2024-12-16 10:55:43.091785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5280832619941850605 len:18762 00:08:44.601 [2024-12-16 10:55:43.091815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 [2024-12-16 10:55:43.091930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:389278064364308819 len:52 00:08:44.601 [2024-12-16 10:55:43.091952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.601 #83 NEW cov: 11833 ft: 14498 corp: 31/626b lim: 50 exec/s: 83 rss: 67Mb L: 24/46 MS: 1 ShuffleBytes- 00:08:44.601 [2024-12-16 10:55:43.132334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:44.601 [2024-12-16 10:55:43.132364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 [2024-12-16 10:55:43.132486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3698305218287392390 len:1 00:08:44.601 [2024-12-16 10:55:43.132508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.601 [2024-12-16 10:55:43.132625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744069414584575 len:65536 00:08:44.601 [2024-12-16 10:55:43.132648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.601 [2024-12-16 10:55:43.132757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:44.601 [2024-12-16 10:55:43.132781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.601 #84 NEW cov: 11833 ft: 14515 corp: 32/673b lim: 50 exec/s: 84 rss: 68Mb L: 47/47 MS: 1 PersAutoDict- DE: "f\376V\2063S\005\000"- 00:08:44.601 [2024-12-16 10:55:43.181929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1047972020224 len:1 00:08:44.601 [2024-12-16 10:55:43.181959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.601 #85 NEW cov: 11833 ft: 14539 corp: 33/691b lim: 50 exec/s: 85 rss: 68Mb L: 18/47 MS: 1 CrossOver- 00:08:44.601 [2024-12-16 10:55:43.222093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144480225135816173 len:13096 00:08:44.601 [2024-12-16 10:55:43.222124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 #86 NEW cov: 11833 ft: 14551 corp: 34/703b lim: 50 exec/s: 86 rss: 68Mb L: 12/47 MS: 1 ChangeBit- 00:08:44.861 [2024-12-16 10:55:43.262296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5280832619941850605 len:18842 00:08:44.861 [2024-12-16 10:55:43.262326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-12-16 10:55:43.262435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5280832618527037849 len:21254 00:08:44.861 [2024-12-16 10:55:43.262462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 #87 NEW cov: 11833 ft: 14561 corp: 35/732b lim: 50 exec/s: 87 rss: 68Mb L: 29/47 MS: 1 InsertRepeatedBytes- 00:08:44.861 [2024-12-16 10:55:43.302837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1562836992 len:1 00:08:44.861 [2024-12-16 10:55:43.302869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-12-16 10:55:43.302977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:44.861 [2024-12-16 10:55:43.302995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 [2024-12-16 10:55:43.303110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2825744883384320 len:1 00:08:44.861 [2024-12-16 10:55:43.303135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.861 [2024-12-16 10:55:43.303268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:44.861 [2024-12-16 10:55:43.303292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.861 #88 NEW cov: 11833 ft: 14609 corp: 36/777b lim: 50 exec/s: 88 rss: 68Mb L: 45/47 MS: 1 ChangeBinInt- 00:08:44.861 [2024-12-16 10:55:43.352722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:1560281088 len:1 00:08:44.861 [2024-12-16 10:55:43.352751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-12-16 10:55:43.352866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:44.861 [2024-12-16 10:55:43.352891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 [2024-12-16 10:55:43.353010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:44.861 [2024-12-16 10:55:43.353035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.861 #89 NEW cov: 11833 ft: 14631 corp: 37/814b lim: 50 exec/s: 89 rss: 68Mb L: 37/47 MS: 1 CopyPart- 00:08:44.861 [2024-12-16 10:55:43.392762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5280832619941850605 len:18762 00:08:44.861 [2024-12-16 10:55:43.392796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 [2024-12-16 10:55:43.392919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:389278785918814547 len:22017 00:08:44.861 [2024-12-16 10:55:43.392943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.861 #90 NEW cov: 11833 ft: 14636 corp: 38/839b lim: 50 exec/s: 90 rss: 68Mb L: 25/47 MS: 1 CopyPart- 00:08:44.861 [2024-12-16 10:55:43.432659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144620962624171501 len:13294 00:08:44.861 [2024-12-16 10:55:43.432684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.861 #91 NEW cov: 11833 ft: 14643 corp: 39/852b lim: 50 exec/s: 91 rss: 68Mb L: 13/47 MS: 1 CMP- DE: "\016\000"- 00:08:44.862 [2024-12-16 10:55:43.472843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:44.862 [2024-12-16 10:55:43.472871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 #92 NEW cov: 11833 ft: 14665 corp: 40/870b lim: 50 exec/s: 92 rss: 68Mb L: 18/47 MS: 1 CopyPart- 00:08:45.121 [2024-12-16 10:55:43.512997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17144425249554427373 len:13294 00:08:45.121 [2024-12-16 10:55:43.513024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 #93 NEW cov: 11833 ft: 14682 corp: 41/883b lim: 50 exec/s: 93 rss: 68Mb L: 13/47 MS: 1 ChangeByte- 00:08:45.121 [2024-12-16 10:55:43.553520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:45.121 [2024-12-16 10:55:43.553553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.121 [2024-12-16 10:55:43.553660] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:11 00:08:45.121 [2024-12-16 10:55:43.553679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.121 [2024-12-16 10:55:43.553783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:167772160 len:1 00:08:45.121 [2024-12-16 10:55:43.553805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.121 [2024-12-16 10:55:43.553912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:45.121 [2024-12-16 10:55:43.553934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.121 #94 NEW cov: 11833 ft: 14688 corp: 42/929b lim: 50 exec/s: 94 rss: 68Mb L: 46/47 MS: 1 ChangeBit- 00:08:45.121 [2024-12-16 10:55:43.593218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:103 00:08:45.122 [2024-12-16 10:55:43.593244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.122 #95 NEW cov: 11833 ft: 14692 corp: 43/947b lim: 50 exec/s: 47 rss: 69Mb L: 18/47 MS: 1 PersAutoDict- DE: "f\376V\2063S\005\000"- 00:08:45.122 #95 DONE cov: 11833 ft: 14692 corp: 43/947b lim: 50 exec/s: 47 rss: 69Mb 00:08:45.122 ###### Recommended dictionary. ###### 00:08:45.122 "f\376V\2063S\005\000" # Uses: 3 00:08:45.122 "\016\000" # Uses: 0 00:08:45.122 ###### End of recommended dictionary. 
###### 00:08:45.122 Done 95 runs in 2 second(s) 00:08:45.122 10:55:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:45.122 10:55:43 -- ../common.sh@72 -- # (( i++ )) 00:08:45.122 10:55:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.122 10:55:43 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:45.122 10:55:43 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:45.122 10:55:43 -- nvmf/run.sh@24 -- # local timen=1 00:08:45.122 10:55:43 -- nvmf/run.sh@25 -- # local core=0x1 00:08:45.122 10:55:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:45.122 10:55:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:45.122 10:55:43 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:45.122 10:55:43 -- nvmf/run.sh@29 -- # port=4420 00:08:45.122 10:55:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:45.122 10:55:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:45.122 10:55:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:45.381 10:55:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:45.381 [2024-12-16 10:55:43.773477] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:45.381 [2024-12-16 10:55:43.773552] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid659899 ] 00:08:45.381 EAL: No free 2048 kB hugepages reported on node 1 00:08:45.381 [2024-12-16 10:55:43.951184] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.381 [2024-12-16 10:55:43.970094] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:45.381 [2024-12-16 10:55:43.970232] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.640 [2024-12-16 10:55:44.021472] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.640 [2024-12-16 10:55:44.037771] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:45.640 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.640 INFO: Seed: 4212303142 00:08:45.640 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:45.640 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:45.640 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:45.640 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.640 #2 INITED exec/s: 0 rss: 59Mb 00:08:45.640 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
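Each run closes with a "#NN DONE cov: ..." summary whose fields sit at fixed positions, e.g. "#95 DONE cov: 11833 ft: 14692 corp: 43/947b lim: 50 exec/s: 47 rss: 69Mb" above. A small bash sketch, assuming that field layout holds for every status line, pulls the counters apart for comparing runs:

# Sketch: split a libFuzzer status line into its counters (layout assumed
# from the '#95 DONE ...' line earlier in this log).
line='#95 DONE cov: 11833 ft: 14692 corp: 43/947b lim: 50 exec/s: 47 rss: 69Mb'
read -r id _ _ cov _ ft _ corp _ _ _ _ _ rss <<<"$line"
echo "run ${id#\#}: cov=$cov features=$ft corpus=$corp rss=$rss"
# -> run 95: cov=11833 features=14692 corpus=43/947b rss=69Mb

Here cov and ft are libFuzzer's covered-edge and feature counters, and corp reads as corpus entries/total bytes, which is how the counts grow across the NEW lines above.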
00:08:45.640 This may also happen if the target rejected all inputs we tried so far 00:08:45.640 [2024-12-16 10:55:44.092978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.640 [2024-12-16 10:55:44.093008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.640 [2024-12-16 10:55:44.093061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.640 [2024-12-16 10:55:44.093077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.900 NEW_FUNC[1/672]: 0x47a438 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:45.900 NEW_FUNC[2/672]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:45.900 #6 NEW cov: 11664 ft: 11659 corp: 2/49b lim: 90 exec/s: 0 rss: 66Mb L: 48/48 MS: 4 ChangeByte-InsertRepeatedBytes-EraseBytes-InsertRepeatedBytes- 00:08:45.900 [2024-12-16 10:55:44.394017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.900 [2024-12-16 10:55:44.394051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.900 [2024-12-16 10:55:44.394109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.900 [2024-12-16 10:55:44.394123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.900 [2024-12-16 10:55:44.394180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:45.900 [2024-12-16 10:55:44.394195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.900 #12 NEW cov: 11777 ft: 12502 corp: 3/119b lim: 90 exec/s: 0 rss: 66Mb L: 70/70 MS: 1 CopyPart- 00:08:45.900 [2024-12-16 10:55:44.443877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.900 [2024-12-16 10:55:44.443905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.900 [2024-12-16 10:55:44.443967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.900 [2024-12-16 10:55:44.443986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.900 #13 NEW cov: 11783 ft: 12888 corp: 4/160b lim: 90 exec/s: 0 rss: 66Mb L: 41/70 MS: 1 InsertRepeatedBytes- 00:08:45.900 [2024-12-16 10:55:44.483948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.900 [2024-12-16 10:55:44.483979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.900 [2024-12-16 10:55:44.484032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.900 [2024-12-16 
10:55:44.484048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.900 #14 NEW cov: 11868 ft: 13151 corp: 5/209b lim: 90 exec/s: 0 rss: 66Mb L: 49/70 MS: 1 CopyPart- 00:08:46.159 [2024-12-16 10:55:44.523946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.159 [2024-12-16 10:55:44.523975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.159 #15 NEW cov: 11868 ft: 14020 corp: 6/236b lim: 90 exec/s: 0 rss: 66Mb L: 27/70 MS: 1 EraseBytes- 00:08:46.159 [2024-12-16 10:55:44.564501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.159 [2024-12-16 10:55:44.564529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.159 [2024-12-16 10:55:44.564570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.159 [2024-12-16 10:55:44.564584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.159 [2024-12-16 10:55:44.564645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.159 [2024-12-16 10:55:44.564660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.159 [2024-12-16 10:55:44.564717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:46.159 [2024-12-16 10:55:44.564733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.160 #16 NEW cov: 11868 ft: 14453 corp: 7/316b lim: 90 exec/s: 0 rss: 66Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:08:46.160 [2024-12-16 10:55:44.604173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.160 [2024-12-16 10:55:44.604200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.160 #17 NEW cov: 11868 ft: 14566 corp: 8/344b lim: 90 exec/s: 0 rss: 66Mb L: 28/80 MS: 1 EraseBytes- 00:08:46.160 [2024-12-16 10:55:44.644261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.160 [2024-12-16 10:55:44.644289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.160 #18 NEW cov: 11868 ft: 14624 corp: 9/372b lim: 90 exec/s: 0 rss: 66Mb L: 28/80 MS: 1 ChangeBit- 00:08:46.160 [2024-12-16 10:55:44.684726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.160 [2024-12-16 10:55:44.684753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.160 [2024-12-16 10:55:44.684795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.160 [2024-12-16 10:55:44.684811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.160 [2024-12-16 10:55:44.684870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.160 [2024-12-16 10:55:44.684885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.160 #21 NEW cov: 11868 ft: 14646 corp: 10/433b lim: 90 exec/s: 0 rss: 66Mb L: 61/80 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:08:46.160 [2024-12-16 10:55:44.724506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.160 [2024-12-16 10:55:44.724533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.160 #22 NEW cov: 11868 ft: 14677 corp: 11/461b lim: 90 exec/s: 0 rss: 66Mb L: 28/80 MS: 1 ShuffleBytes- 00:08:46.160 [2024-12-16 10:55:44.764635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.160 [2024-12-16 10:55:44.764663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.419 #27 NEW cov: 11868 ft: 14728 corp: 12/484b lim: 90 exec/s: 0 rss: 66Mb L: 23/80 MS: 5 ChangeBinInt-InsertRepeatedBytes-ChangeBinInt-CMP-CMP- DE: "\274re\2754S\005\000"-"\000\000\000\000\000\000\000\004"- 00:08:46.419 [2024-12-16 10:55:44.804746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.419 [2024-12-16 10:55:44.804773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.419 #28 NEW cov: 11868 ft: 14815 corp: 13/512b lim: 90 exec/s: 0 rss: 66Mb L: 28/80 MS: 1 CopyPart- 00:08:46.420 [2024-12-16 10:55:44.845196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.420 [2024-12-16 10:55:44.845223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.420 [2024-12-16 10:55:44.845277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.420 [2024-12-16 10:55:44.845293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.420 [2024-12-16 10:55:44.845353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.420 [2024-12-16 10:55:44.845369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.420 #29 NEW cov: 11868 ft: 14858 corp: 14/582b lim: 90 exec/s: 0 rss: 67Mb L: 70/80 MS: 1 CopyPart- 00:08:46.420 [2024-12-16 10:55:44.894984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.420 [2024-12-16 10:55:44.895011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.420 #35 NEW cov: 11868 ft: 14925 corp: 15/610b lim: 90 exec/s: 0 rss: 67Mb L: 28/80 MS: 1 CopyPart- 00:08:46.420 [2024-12-16 10:55:44.935146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 
00:08:46.420 [2024-12-16 10:55:44.935174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.420 #36 NEW cov: 11868 ft: 15000 corp: 16/638b lim: 90 exec/s: 0 rss: 67Mb L: 28/80 MS: 1 ShuffleBytes- 00:08:46.420 [2024-12-16 10:55:44.975711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.420 [2024-12-16 10:55:44.975738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.420 [2024-12-16 10:55:44.975789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.420 [2024-12-16 10:55:44.975805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.420 [2024-12-16 10:55:44.975862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.420 [2024-12-16 10:55:44.975877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.420 [2024-12-16 10:55:44.975934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:46.420 [2024-12-16 10:55:44.975949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.420 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:46.420 #37 NEW cov: 11891 ft: 15031 corp: 17/710b lim: 90 exec/s: 0 rss: 67Mb L: 72/80 MS: 1 CrossOver- 00:08:46.420 [2024-12-16 10:55:45.025896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.420 [2024-12-16 10:55:45.025923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.420 [2024-12-16 10:55:45.025972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.420 [2024-12-16 10:55:45.025989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.420 [2024-12-16 10:55:45.026044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.420 [2024-12-16 10:55:45.026060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.420 [2024-12-16 10:55:45.026116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:46.420 [2024-12-16 10:55:45.026132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.689 #38 NEW cov: 11891 ft: 15045 corp: 18/791b lim: 90 exec/s: 0 rss: 67Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:08:46.689 [2024-12-16 10:55:45.065997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.689 [2024-12-16 10:55:45.066024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.689 
[2024-12-16 10:55:45.066088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.689 [2024-12-16 10:55:45.066105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.689 [2024-12-16 10:55:45.066162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.689 [2024-12-16 10:55:45.066178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.689 [2024-12-16 10:55:45.066233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:46.689 [2024-12-16 10:55:45.066249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.689 #39 NEW cov: 11891 ft: 15060 corp: 19/872b lim: 90 exec/s: 39 rss: 67Mb L: 81/81 MS: 1 ChangeBit- 00:08:46.689 [2024-12-16 10:55:45.105702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.689 [2024-12-16 10:55:45.105729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.689 #40 NEW cov: 11891 ft: 15071 corp: 20/895b lim: 90 exec/s: 40 rss: 67Mb L: 23/81 MS: 1 ChangeASCIIInt- 00:08:46.689 [2024-12-16 10:55:45.146266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.689 [2024-12-16 10:55:45.146294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.689 [2024-12-16 10:55:45.146331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.689 [2024-12-16 10:55:45.146349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.689 [2024-12-16 10:55:45.146406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.689 [2024-12-16 10:55:45.146438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.689 [2024-12-16 10:55:45.146497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:46.689 [2024-12-16 10:55:45.146513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.689 #41 NEW cov: 11891 ft: 15151 corp: 21/967b lim: 90 exec/s: 41 rss: 67Mb L: 72/81 MS: 1 ChangeBinInt- 00:08:46.689 [2024-12-16 10:55:45.185927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.689 [2024-12-16 10:55:45.185953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.689 #44 NEW cov: 11891 ft: 15170 corp: 22/987b lim: 90 exec/s: 44 rss: 67Mb L: 20/81 MS: 3 InsertByte-CrossOver-CMP- DE: "\003i7\3754S\005\000"- 00:08:46.689 [2024-12-16 10:55:45.226001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.689 [2024-12-16 
10:55:45.226028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.689 #45 NEW cov: 11891 ft: 15206 corp: 23/1015b lim: 90 exec/s: 45 rss: 67Mb L: 28/81 MS: 1 ChangeBinInt- 00:08:46.689 [2024-12-16 10:55:45.266454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.689 [2024-12-16 10:55:45.266481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.689 [2024-12-16 10:55:45.266522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.689 [2024-12-16 10:55:45.266538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.689 [2024-12-16 10:55:45.266594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.689 [2024-12-16 10:55:45.266613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.689 #46 NEW cov: 11891 ft: 15229 corp: 24/1070b lim: 90 exec/s: 46 rss: 67Mb L: 55/81 MS: 1 InsertRepeatedBytes- 00:08:46.689 [2024-12-16 10:55:45.306429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.689 [2024-12-16 10:55:45.306455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.689 [2024-12-16 10:55:45.306524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.689 [2024-12-16 10:55:45.306540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.949 #47 NEW cov: 11891 ft: 15251 corp: 25/1111b lim: 90 exec/s: 47 rss: 68Mb L: 41/81 MS: 1 EraseBytes- 00:08:46.949 [2024-12-16 10:55:45.346536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.949 [2024-12-16 10:55:45.346563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.346619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.949 [2024-12-16 10:55:45.346651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.949 #48 NEW cov: 11891 ft: 15274 corp: 26/1160b lim: 90 exec/s: 48 rss: 68Mb L: 49/81 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\004"- 00:08:46.949 [2024-12-16 10:55:45.386526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.949 [2024-12-16 10:55:45.386553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.949 #49 NEW cov: 11891 ft: 15289 corp: 27/1183b lim: 90 exec/s: 49 rss: 68Mb L: 23/81 MS: 1 ShuffleBytes- 00:08:46.949 [2024-12-16 10:55:45.426941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.949 [2024-12-16 10:55:45.426968] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.427020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.949 [2024-12-16 10:55:45.427036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.427091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.949 [2024-12-16 10:55:45.427106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.949 #50 NEW cov: 11891 ft: 15324 corp: 28/1245b lim: 90 exec/s: 50 rss: 68Mb L: 62/81 MS: 1 CrossOver- 00:08:46.949 [2024-12-16 10:55:45.467219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.949 [2024-12-16 10:55:45.467247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.467290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.949 [2024-12-16 10:55:45.467306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.467362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.949 [2024-12-16 10:55:45.467377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.467434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:46.949 [2024-12-16 10:55:45.467449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.949 #51 NEW cov: 11891 ft: 15336 corp: 29/1333b lim: 90 exec/s: 51 rss: 68Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:08:46.949 [2024-12-16 10:55:45.507352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.949 [2024-12-16 10:55:45.507379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.507419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.949 [2024-12-16 10:55:45.507435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.507493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.949 [2024-12-16 10:55:45.507509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.507567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:46.949 [2024-12-16 10:55:45.507583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 
m:0 dnr:1 00:08:46.949 #52 NEW cov: 11891 ft: 15342 corp: 30/1421b lim: 90 exec/s: 52 rss: 68Mb L: 88/88 MS: 1 ShuffleBytes- 00:08:46.949 [2024-12-16 10:55:45.547484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.949 [2024-12-16 10:55:45.547512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.547560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.949 [2024-12-16 10:55:45.547576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.547635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:46.949 [2024-12-16 10:55:45.547665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.949 [2024-12-16 10:55:45.547720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:46.950 [2024-12-16 10:55:45.547735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.950 #53 NEW cov: 11891 ft: 15357 corp: 31/1503b lim: 90 exec/s: 53 rss: 68Mb L: 82/88 MS: 1 InsertByte- 00:08:47.209 [2024-12-16 10:55:45.587537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.210 [2024-12-16 10:55:45.587564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.587620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.210 [2024-12-16 10:55:45.587637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.587694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.210 [2024-12-16 10:55:45.587709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.587766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:47.210 [2024-12-16 10:55:45.587782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.210 #54 NEW cov: 11891 ft: 15374 corp: 32/1586b lim: 90 exec/s: 54 rss: 68Mb L: 83/88 MS: 1 InsertRepeatedBytes- 00:08:47.210 [2024-12-16 10:55:45.627396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.210 [2024-12-16 10:55:45.627424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.627483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.210 [2024-12-16 10:55:45.627500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:47.210 #55 NEW cov: 11891 ft: 15384 corp: 33/1634b lim: 90 exec/s: 55 rss: 68Mb L: 48/88 MS: 1 EraseBytes- 00:08:47.210 [2024-12-16 10:55:45.667814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.210 [2024-12-16 10:55:45.667842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.667890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.210 [2024-12-16 10:55:45.667906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.667963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.210 [2024-12-16 10:55:45.667978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.668037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:47.210 [2024-12-16 10:55:45.668052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.210 #56 NEW cov: 11891 ft: 15393 corp: 34/1722b lim: 90 exec/s: 56 rss: 68Mb L: 88/88 MS: 1 ChangeByte- 00:08:47.210 [2024-12-16 10:55:45.707950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.210 [2024-12-16 10:55:45.707979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.708028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.210 [2024-12-16 10:55:45.708044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.708117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.210 [2024-12-16 10:55:45.708134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.708192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:47.210 [2024-12-16 10:55:45.708210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.210 #57 NEW cov: 11891 ft: 15411 corp: 35/1806b lim: 90 exec/s: 57 rss: 68Mb L: 84/88 MS: 1 InsertByte- 00:08:47.210 [2024-12-16 10:55:45.747585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.210 [2024-12-16 10:55:45.747618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.210 #58 NEW cov: 11891 ft: 15455 corp: 36/1834b lim: 90 exec/s: 58 rss: 68Mb L: 28/88 MS: 1 ChangeBinInt- 00:08:47.210 [2024-12-16 10:55:45.787873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.210 [2024-12-16 10:55:45.787901] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.787973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.210 [2024-12-16 10:55:45.787989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.210 #59 NEW cov: 11891 ft: 15460 corp: 37/1875b lim: 90 exec/s: 59 rss: 68Mb L: 41/88 MS: 1 ChangeBinInt- 00:08:47.210 [2024-12-16 10:55:45.828328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.210 [2024-12-16 10:55:45.828356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.828401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.210 [2024-12-16 10:55:45.828417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.828472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.210 [2024-12-16 10:55:45.828488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.210 [2024-12-16 10:55:45.828546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:47.210 [2024-12-16 10:55:45.828562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.469 #60 NEW cov: 11891 ft: 15472 corp: 38/1956b lim: 90 exec/s: 60 rss: 68Mb L: 81/88 MS: 1 ChangeBit- 00:08:47.469 [2024-12-16 10:55:45.867998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.469 [2024-12-16 10:55:45.868026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.469 #61 NEW cov: 11891 ft: 15530 corp: 39/1985b lim: 90 exec/s: 61 rss: 68Mb L: 29/88 MS: 1 InsertByte- 00:08:47.469 [2024-12-16 10:55:45.908076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.469 [2024-12-16 10:55:45.908103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.469 #62 NEW cov: 11891 ft: 15539 corp: 40/2013b lim: 90 exec/s: 62 rss: 69Mb L: 28/88 MS: 1 CopyPart- 00:08:47.469 [2024-12-16 10:55:45.948205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.469 [2024-12-16 10:55:45.948232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.469 #63 NEW cov: 11891 ft: 15577 corp: 41/2041b lim: 90 exec/s: 63 rss: 69Mb L: 28/88 MS: 1 ChangeBinInt- 00:08:47.470 [2024-12-16 10:55:45.988825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.470 [2024-12-16 10:55:45.988853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.470 [2024-12-16 10:55:45.988920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.470 [2024-12-16 10:55:45.988937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.470 [2024-12-16 10:55:45.988992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.470 [2024-12-16 10:55:45.989008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.470 [2024-12-16 10:55:45.989064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:47.470 [2024-12-16 10:55:45.989080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.470 #64 NEW cov: 11891 ft: 15585 corp: 42/2130b lim: 90 exec/s: 64 rss: 69Mb L: 89/89 MS: 1 CopyPart- 00:08:47.470 [2024-12-16 10:55:46.028431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.470 [2024-12-16 10:55:46.028457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.470 #65 NEW cov: 11891 ft: 15596 corp: 43/2158b lim: 90 exec/s: 65 rss: 69Mb L: 28/89 MS: 1 ChangeBit- 00:08:47.470 [2024-12-16 10:55:46.069047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.470 [2024-12-16 10:55:46.069073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.470 [2024-12-16 10:55:46.069122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.470 [2024-12-16 10:55:46.069138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.470 [2024-12-16 10:55:46.069192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.470 [2024-12-16 10:55:46.069208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.470 [2024-12-16 10:55:46.069262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:47.470 [2024-12-16 10:55:46.069277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.729 #66 NEW cov: 11891 ft: 15608 corp: 44/2244b lim: 90 exec/s: 33 rss: 69Mb L: 86/89 MS: 1 CMP- DE: "\377\377\377\013"- 00:08:47.729 #66 DONE cov: 11891 ft: 15608 corp: 44/2244b lim: 90 exec/s: 33 rss: 69Mb 00:08:47.729 ###### Recommended dictionary. ###### 00:08:47.729 "\274re\2754S\005\000" # Uses: 0 00:08:47.729 "\000\000\000\000\000\000\000\004" # Uses: 1 00:08:47.729 "\003i7\3754S\005\000" # Uses: 0 00:08:47.729 "\377\377\377\013" # Uses: 0 00:08:47.729 ###### End of recommended dictionary. 
###### 00:08:47.729 Done 66 runs in 2 second(s) 00:08:47.729 10:55:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:47.729 10:55:46 -- ../common.sh@72 -- # (( i++ )) 00:08:47.729 10:55:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.729 10:55:46 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:47.729 10:55:46 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:47.729 10:55:46 -- nvmf/run.sh@24 -- # local timen=1 00:08:47.729 10:55:46 -- nvmf/run.sh@25 -- # local core=0x1 00:08:47.729 10:55:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:47.729 10:55:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:47.729 10:55:46 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:47.729 10:55:46 -- nvmf/run.sh@29 -- # port=4421 00:08:47.729 10:55:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:47.729 10:55:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:47.729 10:55:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:47.729 10:55:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:47.729 [2024-12-16 10:55:46.253424] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:47.729 [2024-12-16 10:55:46.253515] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid660439 ] 00:08:47.729 EAL: No free 2048 kB hugepages reported on node 1 00:08:47.989 [2024-12-16 10:55:46.429051] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.989 [2024-12-16 10:55:46.448395] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:47.989 [2024-12-16 10:55:46.448536] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.989 [2024-12-16 10:55:46.499898] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:47.989 [2024-12-16 10:55:46.516229] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:47.989 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.989 INFO: Seed: 2395335593 00:08:47.989 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:47.989 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:47.989 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:47.989 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.989 #2 INITED exec/s: 0 rss: 59Mb 00:08:47.989 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:47.989 This may also happen if the target rejected all inputs we tried so far 00:08:47.989 [2024-12-16 10:55:46.561549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.989 [2024-12-16 10:55:46.561579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.989 [2024-12-16 10:55:46.561626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.989 [2024-12-16 10:55:46.561643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.989 [2024-12-16 10:55:46.561700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.989 [2024-12-16 10:55:46.561718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.248 NEW_FUNC[1/672]: 0x47d668 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:48.249 NEW_FUNC[2/672]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.249 #7 NEW cov: 11639 ft: 11640 corp: 2/40b lim: 50 exec/s: 0 rss: 66Mb L: 39/39 MS: 5 CrossOver-ChangeBit-ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:08:48.249 [2024-12-16 10:55:46.862368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.249 [2024-12-16 10:55:46.862401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.249 [2024-12-16 10:55:46.862453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.249 [2024-12-16 10:55:46.862469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.249 [2024-12-16 10:55:46.862527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.249 [2024-12-16 10:55:46.862559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.508 #8 NEW cov: 11752 ft: 12091 corp: 3/79b lim: 50 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeBinInt- 00:08:48.508 [2024-12-16 10:55:46.912457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.508 [2024-12-16 10:55:46.912485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.508 [2024-12-16 10:55:46.912525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.508 [2024-12-16 10:55:46.912540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.508 [2024-12-16 10:55:46.912598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.508 [2024-12-16 10:55:46.912617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.508 #14 NEW cov: 11758 ft: 12471 corp: 4/118b lim: 50 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeByte- 00:08:48.508 [2024-12-16 10:55:46.952741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.508 [2024-12-16 10:55:46.952771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.508 [2024-12-16 10:55:46.952810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.508 [2024-12-16 10:55:46.952825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.508 [2024-12-16 10:55:46.952880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.508 [2024-12-16 10:55:46.952895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:46.952952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:48.509 [2024-12-16 10:55:46.952967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.509 #15 NEW cov: 11843 ft: 13004 corp: 5/158b lim: 50 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 InsertByte- 00:08:48.509 [2024-12-16 10:55:46.992830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.509 [2024-12-16 10:55:46.992861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:46.992901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.509 [2024-12-16 10:55:46.992917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:46.992974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.509 [2024-12-16 10:55:46.992989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:46.993046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:48.509 [2024-12-16 10:55:46.993061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.509 #16 NEW cov: 11843 ft: 13186 corp: 6/203b lim: 50 exec/s: 0 rss: 67Mb L: 45/45 MS: 1 CrossOver- 00:08:48.509 [2024-12-16 10:55:47.032799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.509 [2024-12-16 10:55:47.032826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:47.032865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.509 [2024-12-16 10:55:47.032881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:47.032937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.509 [2024-12-16 10:55:47.032953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.509 #17 NEW cov: 11843 ft: 13294 corp: 7/242b lim: 50 exec/s: 0 rss: 67Mb L: 39/45 MS: 1 ChangeBinInt- 00:08:48.509 [2024-12-16 10:55:47.072929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.509 [2024-12-16 10:55:47.072956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:47.072995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.509 [2024-12-16 10:55:47.073010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:47.073067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.509 [2024-12-16 10:55:47.073082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.509 #18 NEW cov: 11843 ft: 13341 corp: 8/281b lim: 50 exec/s: 0 rss: 67Mb L: 39/45 MS: 1 ChangeBit- 00:08:48.509 [2024-12-16 10:55:47.113183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.509 [2024-12-16 10:55:47.113208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:47.113273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.509 [2024-12-16 10:55:47.113289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:47.113346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.509 [2024-12-16 10:55:47.113364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.509 [2024-12-16 10:55:47.113423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:48.509 [2024-12-16 10:55:47.113442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.769 #19 NEW cov: 11843 ft: 13371 corp: 9/326b lim: 50 exec/s: 0 rss: 67Mb L: 45/45 MS: 1 ShuffleBytes- 00:08:48.769 [2024-12-16 10:55:47.153338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.769 [2024-12-16 10:55:47.153366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.153417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.769 [2024-12-16 10:55:47.153433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.153491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.769 [2024-12-16 10:55:47.153506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.153563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:48.769 [2024-12-16 10:55:47.153579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.769 #20 NEW cov: 11843 ft: 13493 corp: 10/366b lim: 50 exec/s: 0 rss: 67Mb L: 40/45 MS: 1 CopyPart- 00:08:48.769 [2024-12-16 10:55:47.192970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.769 [2024-12-16 10:55:47.192997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.769 #21 NEW cov: 11843 ft: 14335 corp: 11/379b lim: 50 exec/s: 0 rss: 67Mb L: 13/45 MS: 1 CrossOver- 00:08:48.769 [2024-12-16 10:55:47.233428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.769 [2024-12-16 10:55:47.233455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.233505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.769 [2024-12-16 10:55:47.233519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.233577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.769 [2024-12-16 10:55:47.233593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.769 #22 NEW cov: 11843 ft: 14384 corp: 12/418b lim: 50 exec/s: 0 rss: 67Mb L: 39/45 MS: 1 CrossOver- 00:08:48.769 [2024-12-16 10:55:47.273675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.769 [2024-12-16 10:55:47.273703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.273743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.769 [2024-12-16 10:55:47.273759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.273817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.769 [2024-12-16 10:55:47.273832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.273887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:48.769 [2024-12-16 10:55:47.273903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:48.769 #23 NEW cov: 11843 ft: 14393 corp: 13/458b lim: 50 exec/s: 0 rss: 68Mb L: 40/45 MS: 1 ChangeBinInt- 00:08:48.769 [2024-12-16 10:55:47.313844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.769 [2024-12-16 10:55:47.313872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.313913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.769 [2024-12-16 10:55:47.313930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.313985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.769 [2024-12-16 10:55:47.314002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.314061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:48.769 [2024-12-16 10:55:47.314076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.769 #24 NEW cov: 11843 ft: 14491 corp: 14/502b lim: 50 exec/s: 0 rss: 68Mb L: 44/45 MS: 1 EraseBytes- 00:08:48.769 [2024-12-16 10:55:47.353902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.769 [2024-12-16 10:55:47.353929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.353978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.769 [2024-12-16 10:55:47.353995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.354050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.769 [2024-12-16 10:55:47.354066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.769 [2024-12-16 10:55:47.354124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:48.769 [2024-12-16 10:55:47.354139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.769 #25 NEW cov: 11843 ft: 14547 corp: 15/546b lim: 50 exec/s: 0 rss: 68Mb L: 44/45 MS: 1 ChangeBit- 00:08:49.029 [2024-12-16 10:55:47.393895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.029 [2024-12-16 10:55:47.393924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.029 [2024-12-16 10:55:47.393960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.029 [2024-12-16 10:55:47.393976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:49.029 [2024-12-16 10:55:47.394035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.029 [2024-12-16 10:55:47.394049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.029 #26 NEW cov: 11843 ft: 14584 corp: 16/585b lim: 50 exec/s: 0 rss: 68Mb L: 39/45 MS: 1 ChangeBit- 00:08:49.029 [2024-12-16 10:55:47.434151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.029 [2024-12-16 10:55:47.434177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.029 [2024-12-16 10:55:47.434227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.029 [2024-12-16 10:55:47.434249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.029 [2024-12-16 10:55:47.434306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.030 [2024-12-16 10:55:47.434321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.030 [2024-12-16 10:55:47.434378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.030 [2024-12-16 10:55:47.434394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.030 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:49.030 #27 NEW cov: 11866 ft: 14605 corp: 17/630b lim: 50 exec/s: 0 rss: 68Mb L: 45/45 MS: 1 ChangeByte- 00:08:49.030 [2024-12-16 10:55:47.474095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.030 [2024-12-16 10:55:47.474123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.030 [2024-12-16 10:55:47.474164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.030 [2024-12-16 10:55:47.474180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.030 [2024-12-16 10:55:47.474239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.030 [2024-12-16 10:55:47.474256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.030 #28 NEW cov: 11866 ft: 14636 corp: 18/669b lim: 50 exec/s: 0 rss: 68Mb L: 39/45 MS: 1 ChangeBit- 00:08:49.030 [2024-12-16 10:55:47.514183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.030 [2024-12-16 10:55:47.514210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.030 [2024-12-16 10:55:47.514251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.030 [2024-12-16 10:55:47.514265] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.030 [2024-12-16 10:55:47.514324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.030 [2024-12-16 10:55:47.514339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.030 #29 NEW cov: 11866 ft: 14664 corp: 19/708b lim: 50 exec/s: 0 rss: 68Mb L: 39/45 MS: 1 ChangeBinInt- 00:08:49.030 [2024-12-16 10:55:47.554510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.030 [2024-12-16 10:55:47.554538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.030 [2024-12-16 10:55:47.554581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.030 [2024-12-16 10:55:47.554596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.030 [2024-12-16 10:55:47.554662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.030 [2024-12-16 10:55:47.554678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.030 [2024-12-16 10:55:47.554733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.030 [2024-12-16 10:55:47.554749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.030 #30 NEW cov: 11866 ft: 14686 corp: 20/749b lim: 50 exec/s: 30 rss: 68Mb L: 41/45 MS: 1 InsertByte- 00:08:49.030 [2024-12-16 10:55:47.594127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.030 [2024-12-16 10:55:47.594153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.030 #31 NEW cov: 11866 ft: 14726 corp: 21/763b lim: 50 exec/s: 31 rss: 68Mb L: 14/45 MS: 1 InsertByte- 00:08:49.030 [2024-12-16 10:55:47.634251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.030 [2024-12-16 10:55:47.634279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.290 #32 NEW cov: 11866 ft: 14757 corp: 22/775b lim: 50 exec/s: 32 rss: 68Mb L: 12/45 MS: 1 EraseBytes- 00:08:49.290 [2024-12-16 10:55:47.674814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.290 [2024-12-16 10:55:47.674842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.674892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.290 [2024-12-16 10:55:47.674908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.674982] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.290 [2024-12-16 10:55:47.674999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.675057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.290 [2024-12-16 10:55:47.675072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.290 #33 NEW cov: 11866 ft: 14761 corp: 23/815b lim: 50 exec/s: 33 rss: 68Mb L: 40/45 MS: 1 CopyPart- 00:08:49.290 [2024-12-16 10:55:47.714619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.290 [2024-12-16 10:55:47.714648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.714700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.290 [2024-12-16 10:55:47.714716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.290 #34 NEW cov: 11866 ft: 15019 corp: 24/844b lim: 50 exec/s: 34 rss: 68Mb L: 29/45 MS: 1 EraseBytes- 00:08:49.290 [2024-12-16 10:55:47.755055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.290 [2024-12-16 10:55:47.755082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.755133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.290 [2024-12-16 10:55:47.755149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.755205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.290 [2024-12-16 10:55:47.755221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.755280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.290 [2024-12-16 10:55:47.755296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.290 #35 NEW cov: 11866 ft: 15028 corp: 25/884b lim: 50 exec/s: 35 rss: 68Mb L: 40/45 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:49.290 [2024-12-16 10:55:47.794702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.290 [2024-12-16 10:55:47.794729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.290 #36 NEW cov: 11866 ft: 15045 corp: 26/897b lim: 50 exec/s: 36 rss: 68Mb L: 13/45 MS: 1 ShuffleBytes- 00:08:49.290 [2024-12-16 10:55:47.834816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.290 [2024-12-16 10:55:47.834843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.290 #42 NEW cov: 11866 ft: 15072 corp: 27/914b lim: 50 exec/s: 42 rss: 68Mb L: 17/45 MS: 1 CrossOver- 00:08:49.290 [2024-12-16 10:55:47.875466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.290 [2024-12-16 10:55:47.875493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.875545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.290 [2024-12-16 10:55:47.875561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.875623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.290 [2024-12-16 10:55:47.875639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.290 [2024-12-16 10:55:47.875697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.290 [2024-12-16 10:55:47.875712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.290 #43 NEW cov: 11866 ft: 15085 corp: 28/954b lim: 50 exec/s: 43 rss: 68Mb L: 40/45 MS: 1 ChangeBit- 00:08:49.550 [2024-12-16 10:55:47.915097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.550 [2024-12-16 10:55:47.915125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.550 #44 NEW cov: 11866 ft: 15091 corp: 29/968b lim: 50 exec/s: 44 rss: 68Mb L: 14/45 MS: 1 ChangeBinInt- 00:08:49.550 [2024-12-16 10:55:47.955691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.550 [2024-12-16 10:55:47.955718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.550 [2024-12-16 10:55:47.955769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.550 [2024-12-16 10:55:47.955786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.550 [2024-12-16 10:55:47.955858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.550 [2024-12-16 10:55:47.955874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.550 [2024-12-16 10:55:47.955933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.550 [2024-12-16 10:55:47.955948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.550 #45 NEW cov: 11866 ft: 15105 corp: 30/1013b lim: 50 exec/s: 45 rss: 68Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:49.550 [2024-12-16 10:55:48.005861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.550 
[2024-12-16 10:55:48.005889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.550 [2024-12-16 10:55:48.005933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.550 [2024-12-16 10:55:48.005949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.550 [2024-12-16 10:55:48.005989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.550 [2024-12-16 10:55:48.006005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.550 [2024-12-16 10:55:48.006065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.550 [2024-12-16 10:55:48.006082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.550 #46 NEW cov: 11866 ft: 15125 corp: 31/1053b lim: 50 exec/s: 46 rss: 69Mb L: 40/45 MS: 1 InsertByte- 00:08:49.550 [2024-12-16 10:55:48.045805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.550 [2024-12-16 10:55:48.045831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.550 [2024-12-16 10:55:48.045871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.550 [2024-12-16 10:55:48.045887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.045944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.551 [2024-12-16 10:55:48.045959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.551 #47 NEW cov: 11866 ft: 15131 corp: 32/1089b lim: 50 exec/s: 47 rss: 69Mb L: 36/45 MS: 1 InsertRepeatedBytes- 00:08:49.551 [2024-12-16 10:55:48.085878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.551 [2024-12-16 10:55:48.085905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.085947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.551 [2024-12-16 10:55:48.085963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.086023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.551 [2024-12-16 10:55:48.086040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.551 #48 NEW cov: 11866 ft: 15212 corp: 33/1124b lim: 50 exec/s: 48 rss: 69Mb L: 35/45 MS: 1 EraseBytes- 00:08:49.551 [2024-12-16 10:55:48.126187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:49.551 [2024-12-16 10:55:48.126214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.126263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.551 [2024-12-16 10:55:48.126278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.126336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.551 [2024-12-16 10:55:48.126352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.126409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.551 [2024-12-16 10:55:48.126428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.551 #49 NEW cov: 11866 ft: 15218 corp: 34/1169b lim: 50 exec/s: 49 rss: 69Mb L: 45/45 MS: 1 ChangeBinInt- 00:08:49.551 [2024-12-16 10:55:48.166277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.551 [2024-12-16 10:55:48.166304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.166356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.551 [2024-12-16 10:55:48.166373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.166428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.551 [2024-12-16 10:55:48.166442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.551 [2024-12-16 10:55:48.166496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.551 [2024-12-16 10:55:48.166512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.811 #50 NEW cov: 11866 ft: 15221 corp: 35/1217b lim: 50 exec/s: 50 rss: 69Mb L: 48/48 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:49.811 [2024-12-16 10:55:48.206421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.811 [2024-12-16 10:55:48.206449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.206506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.811 [2024-12-16 10:55:48.206521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.206578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.811 [2024-12-16 10:55:48.206595] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.206659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.811 [2024-12-16 10:55:48.206676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.811 #51 NEW cov: 11866 ft: 15226 corp: 36/1261b lim: 50 exec/s: 51 rss: 69Mb L: 44/48 MS: 1 CopyPart- 00:08:49.811 [2024-12-16 10:55:48.246508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.811 [2024-12-16 10:55:48.246534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.246584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.811 [2024-12-16 10:55:48.246600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.246662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.811 [2024-12-16 10:55:48.246678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.246734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.811 [2024-12-16 10:55:48.246749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.811 #52 NEW cov: 11866 ft: 15235 corp: 37/1308b lim: 50 exec/s: 52 rss: 69Mb L: 47/48 MS: 1 InsertRepeatedBytes- 00:08:49.811 [2024-12-16 10:55:48.286503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.811 [2024-12-16 10:55:48.286532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.286588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.811 [2024-12-16 10:55:48.286605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.286669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.811 [2024-12-16 10:55:48.286686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.811 #53 NEW cov: 11866 ft: 15241 corp: 38/1347b lim: 50 exec/s: 53 rss: 69Mb L: 39/48 MS: 1 ChangeBinInt- 00:08:49.811 [2024-12-16 10:55:48.326769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.811 [2024-12-16 10:55:48.326795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.326843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.811 [2024-12-16 10:55:48.326859] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.326916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.811 [2024-12-16 10:55:48.326933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.326991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.811 [2024-12-16 10:55:48.327007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.811 #54 NEW cov: 11866 ft: 15303 corp: 39/1388b lim: 50 exec/s: 54 rss: 69Mb L: 41/48 MS: 1 ShuffleBytes- 00:08:49.811 [2024-12-16 10:55:48.366879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.811 [2024-12-16 10:55:48.366906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.366957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.811 [2024-12-16 10:55:48.366974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.367030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.811 [2024-12-16 10:55:48.367047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.367105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.811 [2024-12-16 10:55:48.367121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.811 #55 NEW cov: 11866 ft: 15304 corp: 40/1428b lim: 50 exec/s: 55 rss: 69Mb L: 40/48 MS: 1 InsertByte- 00:08:49.811 [2024-12-16 10:55:48.407016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.811 [2024-12-16 10:55:48.407042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.407108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.811 [2024-12-16 10:55:48.407124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.407185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.811 [2024-12-16 10:55:48.407212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.811 [2024-12-16 10:55:48.407268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.811 [2024-12-16 10:55:48.407283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:49.812 #56 NEW cov: 11866 ft: 15350 corp: 41/1469b lim: 50 exec/s: 56 rss: 69Mb L: 41/48 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:50.072 [2024-12-16 10:55:48.447112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:50.072 [2024-12-16 10:55:48.447140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.447182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:50.072 [2024-12-16 10:55:48.447198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.447258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:50.072 [2024-12-16 10:55:48.447274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.447335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:50.072 [2024-12-16 10:55:48.447351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.072 #57 NEW cov: 11866 ft: 15381 corp: 42/1513b lim: 50 exec/s: 57 rss: 69Mb L: 44/48 MS: 1 CMP- DE: "\013\000\000\000"- 00:08:50.072 [2024-12-16 10:55:48.487008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:50.072 [2024-12-16 10:55:48.487035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.487075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:50.072 [2024-12-16 10:55:48.487092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.487150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:50.072 [2024-12-16 10:55:48.487182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.072 #58 NEW cov: 11866 ft: 15389 corp: 43/1549b lim: 50 exec/s: 58 rss: 69Mb L: 36/48 MS: 1 EraseBytes- 00:08:50.072 [2024-12-16 10:55:48.527468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:50.072 [2024-12-16 10:55:48.527495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.527544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:50.072 [2024-12-16 10:55:48.527560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.527617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:50.072 [2024-12-16 10:55:48.527632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.527690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:50.072 [2024-12-16 10:55:48.527708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.072 [2024-12-16 10:55:48.527764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:50.072 [2024-12-16 10:55:48.527779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:50.072 #59 NEW cov: 11866 ft: 15435 corp: 44/1599b lim: 50 exec/s: 29 rss: 70Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:50.072 #59 DONE cov: 11866 ft: 15435 corp: 44/1599b lim: 50 exec/s: 29 rss: 70Mb 00:08:50.072 ###### Recommended dictionary. ###### 00:08:50.072 "\000\000\000\000\000\000\000\000" # Uses: 3 00:08:50.072 "\013\000\000\000" # Uses: 0 00:08:50.072 ###### End of recommended dictionary. ###### 00:08:50.072 Done 59 runs in 2 second(s) 00:08:50.072 10:55:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:50.072 10:55:48 -- ../common.sh@72 -- # (( i++ )) 00:08:50.072 10:55:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.072 10:55:48 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:50.072 10:55:48 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:50.072 10:55:48 -- nvmf/run.sh@24 -- # local timen=1 00:08:50.072 10:55:48 -- nvmf/run.sh@25 -- # local core=0x1 00:08:50.072 10:55:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:50.072 10:55:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:50.072 10:55:48 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:50.072 10:55:48 -- nvmf/run.sh@29 -- # port=4422 00:08:50.072 10:55:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:50.072 10:55:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:50.072 10:55:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:50.072 10:55:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:50.331 [2024-12-16 10:55:48.708954] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:50.331 [2024-12-16 10:55:48.709043] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid660744 ] 00:08:50.331 EAL: No free 2048 kB hugepages reported on node 1 00:08:50.331 [2024-12-16 10:55:48.888585] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.331 [2024-12-16 10:55:48.907705] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:50.331 [2024-12-16 10:55:48.907830] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.591 [2024-12-16 10:55:48.959227] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:50.591 [2024-12-16 10:55:48.975402] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:50.591 INFO: Running with entropic power schedule (0xFF, 100). 00:08:50.591 INFO: Seed: 559376436 00:08:50.591 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:50.591 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:50.591 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:50.591 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.591 #2 INITED exec/s: 0 rss: 59Mb 00:08:50.591 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:50.591 This may also happen if the target rejected all inputs we tried so far 00:08:50.591 [2024-12-16 10:55:49.041056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.591 [2024-12-16 10:55:49.041092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.591 [2024-12-16 10:55:49.041143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.591 [2024-12-16 10:55:49.041159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.591 [2024-12-16 10:55:49.041213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:50.591 [2024-12-16 10:55:49.041228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.591 [2024-12-16 10:55:49.041281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:50.591 [2024-12-16 10:55:49.041296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.851 NEW_FUNC[1/672]: 0x47f938 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:50.851 NEW_FUNC[2/672]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:50.851 #21 NEW cov: 11665 ft: 11666 corp: 2/81b lim: 85 exec/s: 0 rss: 66Mb L: 80/80 MS: 4 InsertByte-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:50.851 [2024-12-16 10:55:49.352101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 
cid:0 nsid:0 00:08:50.851 [2024-12-16 10:55:49.352147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.851 [2024-12-16 10:55:49.352218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.851 [2024-12-16 10:55:49.352242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.851 [2024-12-16 10:55:49.352318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:50.851 [2024-12-16 10:55:49.352341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.851 [2024-12-16 10:55:49.352411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:50.851 [2024-12-16 10:55:49.352435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.851 [2024-12-16 10:55:49.352503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:50.851 [2024-12-16 10:55:49.352528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:50.851 #27 NEW cov: 11778 ft: 12233 corp: 3/166b lim: 85 exec/s: 0 rss: 66Mb L: 85/85 MS: 1 CrossOver- 00:08:50.851 [2024-12-16 10:55:49.401477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.851 [2024-12-16 10:55:49.401505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.851 #30 NEW cov: 11784 ft: 13419 corp: 4/196b lim: 85 exec/s: 0 rss: 66Mb L: 30/85 MS: 3 CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:08:50.851 [2024-12-16 10:55:49.442142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.851 [2024-12-16 10:55:49.442170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.851 [2024-12-16 10:55:49.442234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.851 [2024-12-16 10:55:49.442250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.851 [2024-12-16 10:55:49.442306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:50.851 [2024-12-16 10:55:49.442322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.851 [2024-12-16 10:55:49.442374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:50.851 [2024-12-16 10:55:49.442390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.851 [2024-12-16 10:55:49.442445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:50.851 [2024-12-16 10:55:49.442460] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:50.851 #31 NEW cov: 11869 ft: 13741 corp: 5/281b lim: 85 exec/s: 0 rss: 66Mb L: 85/85 MS: 1 ShuffleBytes- 00:08:51.111 [2024-12-16 10:55:49.481656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.111 [2024-12-16 10:55:49.481684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.111 #32 NEW cov: 11869 ft: 13975 corp: 6/311b lim: 85 exec/s: 0 rss: 66Mb L: 30/85 MS: 1 ChangeByte- 00:08:51.111 [2024-12-16 10:55:49.522203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.111 [2024-12-16 10:55:49.522228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.111 [2024-12-16 10:55:49.522276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.111 [2024-12-16 10:55:49.522291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.111 [2024-12-16 10:55:49.522344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.111 [2024-12-16 10:55:49.522359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.111 [2024-12-16 10:55:49.522412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.111 [2024-12-16 10:55:49.522427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.111 #33 NEW cov: 11869 ft: 14036 corp: 7/394b lim: 85 exec/s: 0 rss: 66Mb L: 83/85 MS: 1 EraseBytes- 00:08:51.111 [2024-12-16 10:55:49.562166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.111 [2024-12-16 10:55:49.562192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.111 [2024-12-16 10:55:49.562233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.111 [2024-12-16 10:55:49.562248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.111 [2024-12-16 10:55:49.562300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.111 [2024-12-16 10:55:49.562316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.111 #34 NEW cov: 11869 ft: 14451 corp: 8/454b lim: 85 exec/s: 0 rss: 66Mb L: 60/85 MS: 1 CrossOver- 00:08:51.111 [2024-12-16 10:55:49.602569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.111 [2024-12-16 10:55:49.602595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.111 [2024-12-16 10:55:49.602654] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.111 [2024-12-16 10:55:49.602673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.111 [2024-12-16 10:55:49.602743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.111 [2024-12-16 10:55:49.602759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.602812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.112 [2024-12-16 10:55:49.602828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.602886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.112 [2024-12-16 10:55:49.602905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.112 #35 NEW cov: 11869 ft: 14582 corp: 9/539b lim: 85 exec/s: 0 rss: 66Mb L: 85/85 MS: 1 ChangeByte- 00:08:51.112 [2024-12-16 10:55:49.642561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.112 [2024-12-16 10:55:49.642587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.642628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.112 [2024-12-16 10:55:49.642645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.642697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.112 [2024-12-16 10:55:49.642712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.642768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.112 [2024-12-16 10:55:49.642783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.112 #36 NEW cov: 11869 ft: 14599 corp: 10/622b lim: 85 exec/s: 0 rss: 66Mb L: 83/85 MS: 1 ChangeBinInt- 00:08:51.112 [2024-12-16 10:55:49.682825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.112 [2024-12-16 10:55:49.682851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.682900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.112 [2024-12-16 10:55:49.682915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.682967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.112 [2024-12-16 10:55:49.682982] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.683037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.112 [2024-12-16 10:55:49.683051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.683103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.112 [2024-12-16 10:55:49.683119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.112 #37 NEW cov: 11869 ft: 14635 corp: 11/707b lim: 85 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 ChangeByte- 00:08:51.112 [2024-12-16 10:55:49.722833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.112 [2024-12-16 10:55:49.722863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.722901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.112 [2024-12-16 10:55:49.722917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.722971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.112 [2024-12-16 10:55:49.722986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.112 [2024-12-16 10:55:49.723041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.112 [2024-12-16 10:55:49.723056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 #38 NEW cov: 11869 ft: 14694 corp: 12/778b lim: 85 exec/s: 0 rss: 67Mb L: 71/85 MS: 1 InsertRepeatedBytes- 00:08:51.372 [2024-12-16 10:55:49.763021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-12-16 10:55:49.763047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.763101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.372 [2024-12-16 10:55:49.763117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.763170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.372 [2024-12-16 10:55:49.763185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.763238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.372 [2024-12-16 10:55:49.763253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.763307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.372 [2024-12-16 10:55:49.763321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.372 #39 NEW cov: 11869 ft: 14726 corp: 13/863b lim: 85 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 ChangeBit- 00:08:51.372 [2024-12-16 10:55:49.803167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-12-16 10:55:49.803194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.803243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.372 [2024-12-16 10:55:49.803259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.803314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.372 [2024-12-16 10:55:49.803329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.803381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.372 [2024-12-16 10:55:49.803397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.803450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.372 [2024-12-16 10:55:49.803468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.372 #40 NEW cov: 11869 ft: 14779 corp: 14/948b lim: 85 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 CopyPart- 00:08:51.372 [2024-12-16 10:55:49.843304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-12-16 10:55:49.843331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.843381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.372 [2024-12-16 10:55:49.843396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.843449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.372 [2024-12-16 10:55:49.843464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.843518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.372 [2024-12-16 10:55:49.843534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.843588] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.372 [2024-12-16 10:55:49.843603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.372 #41 NEW cov: 11869 ft: 14791 corp: 15/1033b lim: 85 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 CMP- DE: "\001\004\000\000\000\000\000\000"- 00:08:51.372 [2024-12-16 10:55:49.883358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-12-16 10:55:49.883385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.883452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.372 [2024-12-16 10:55:49.883468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.883523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.372 [2024-12-16 10:55:49.883539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.883595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.372 [2024-12-16 10:55:49.883615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.883670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.372 [2024-12-16 10:55:49.883686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.372 #42 NEW cov: 11869 ft: 14796 corp: 16/1118b lim: 85 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 ChangeByte- 00:08:51.372 [2024-12-16 10:55:49.923483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-12-16 10:55:49.923509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.923579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.372 [2024-12-16 10:55:49.923595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.923657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.372 [2024-12-16 10:55:49.923672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.923726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.372 [2024-12-16 10:55:49.923741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.923796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.372 
[2024-12-16 10:55:49.923810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.372 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:51.372 #43 NEW cov: 11892 ft: 14863 corp: 17/1203b lim: 85 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 PersAutoDict- DE: "\001\004\000\000\000\000\000\000"- 00:08:51.372 [2024-12-16 10:55:49.963464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-12-16 10:55:49.963491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.963529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.372 [2024-12-16 10:55:49.963544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.963598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.372 [2024-12-16 10:55:49.963615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.372 [2024-12-16 10:55:49.963672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.372 [2024-12-16 10:55:49.963687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 #49 NEW cov: 11892 ft: 14895 corp: 18/1286b lim: 85 exec/s: 0 rss: 67Mb L: 83/85 MS: 1 ChangeBinInt- 00:08:51.632 [2024-12-16 10:55:50.003477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.632 [2024-12-16 10:55:50.003505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.632 [2024-12-16 10:55:50.003543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.632 [2024-12-16 10:55:50.003559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.632 [2024-12-16 10:55:50.003619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.632 [2024-12-16 10:55:50.003634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.632 #50 NEW cov: 11892 ft: 14907 corp: 19/1353b lim: 85 exec/s: 50 rss: 67Mb L: 67/85 MS: 1 InsertRepeatedBytes- 00:08:51.632 [2024-12-16 10:55:50.043797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.632 [2024-12-16 10:55:50.043824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.632 [2024-12-16 10:55:50.043875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.632 [2024-12-16 10:55:50.043892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.632 [2024-12-16 10:55:50.043946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.632 [2024-12-16 10:55:50.043964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.632 [2024-12-16 10:55:50.044020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.632 [2024-12-16 10:55:50.044035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.632 #51 NEW cov: 11892 ft: 14920 corp: 20/1437b lim: 85 exec/s: 51 rss: 67Mb L: 84/85 MS: 1 InsertByte- 00:08:51.632 [2024-12-16 10:55:50.083801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.632 [2024-12-16 10:55:50.083830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.632 [2024-12-16 10:55:50.083868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.632 [2024-12-16 10:55:50.083885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.632 [2024-12-16 10:55:50.083938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.632 [2024-12-16 10:55:50.083954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.632 [2024-12-16 10:55:50.084008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.632 [2024-12-16 10:55:50.084023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.632 #52 NEW cov: 11892 ft: 14931 corp: 21/1520b lim: 85 exec/s: 52 rss: 67Mb L: 83/85 MS: 1 CopyPart- 00:08:51.632 [2024-12-16 10:55:50.124078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.633 [2024-12-16 10:55:50.124105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.124172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.633 [2024-12-16 10:55:50.124188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.124253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.633 [2024-12-16 10:55:50.124269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.124322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.633 [2024-12-16 10:55:50.124336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.124390] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.633 [2024-12-16 10:55:50.124405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.633 #53 NEW cov: 11892 ft: 14939 corp: 22/1605b lim: 85 exec/s: 53 rss: 67Mb L: 85/85 MS: 1 ChangeBinInt- 00:08:51.633 [2024-12-16 10:55:50.164210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.633 [2024-12-16 10:55:50.164236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.164289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.633 [2024-12-16 10:55:50.164305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.164358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.633 [2024-12-16 10:55:50.164394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.164446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.633 [2024-12-16 10:55:50.164462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.164515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.633 [2024-12-16 10:55:50.164531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.633 #54 NEW cov: 11892 ft: 14942 corp: 23/1690b lim: 85 exec/s: 54 rss: 67Mb L: 85/85 MS: 1 ChangeByte- 00:08:51.633 [2024-12-16 10:55:50.203717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.633 [2024-12-16 10:55:50.203744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.633 #57 NEW cov: 11892 ft: 15007 corp: 24/1713b lim: 85 exec/s: 57 rss: 67Mb L: 23/85 MS: 3 CrossOver-ChangeByte-PersAutoDict- DE: "\001\004\000\000\000\000\000\000"- 00:08:51.633 [2024-12-16 10:55:50.244201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.633 [2024-12-16 10:55:50.244228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.244279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.633 [2024-12-16 10:55:50.244295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.633 [2024-12-16 10:55:50.244350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.633 [2024-12-16 10:55:50.244366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:51.893 #58 NEW cov: 11892 ft: 15019 corp: 25/1780b lim: 85 exec/s: 58 rss: 67Mb L: 67/85 MS: 1 ChangeBit- 00:08:51.893 [2024-12-16 10:55:50.284398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.893 [2024-12-16 10:55:50.284425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.284461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.893 [2024-12-16 10:55:50.284476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.284531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.893 [2024-12-16 10:55:50.284546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.284601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.893 [2024-12-16 10:55:50.284622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.893 #59 NEW cov: 11892 ft: 15026 corp: 26/1851b lim: 85 exec/s: 59 rss: 67Mb L: 71/85 MS: 1 EraseBytes- 00:08:51.893 [2024-12-16 10:55:50.324111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.893 [2024-12-16 10:55:50.324138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.893 #60 NEW cov: 11892 ft: 15047 corp: 27/1881b lim: 85 exec/s: 60 rss: 67Mb L: 30/85 MS: 1 ChangeBinInt- 00:08:51.893 [2024-12-16 10:55:50.364825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.893 [2024-12-16 10:55:50.364852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.364903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.893 [2024-12-16 10:55:50.364919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.364974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.893 [2024-12-16 10:55:50.364990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.365044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.893 [2024-12-16 10:55:50.365059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.365115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.893 [2024-12-16 10:55:50.365131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 
sqhd:0006 p:0 m:0 dnr:1 00:08:51.893 #61 NEW cov: 11892 ft: 15067 corp: 28/1966b lim: 85 exec/s: 61 rss: 67Mb L: 85/85 MS: 1 CopyPart- 00:08:51.893 [2024-12-16 10:55:50.404765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.893 [2024-12-16 10:55:50.404793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.404830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.893 [2024-12-16 10:55:50.404846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.404902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.893 [2024-12-16 10:55:50.404917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.404973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.893 [2024-12-16 10:55:50.404989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.893 #62 NEW cov: 11892 ft: 15075 corp: 29/2049b lim: 85 exec/s: 62 rss: 67Mb L: 83/85 MS: 1 ChangeBit- 00:08:51.893 [2024-12-16 10:55:50.444428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.893 [2024-12-16 10:55:50.444454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.893 #63 NEW cov: 11892 ft: 15133 corp: 30/2079b lim: 85 exec/s: 63 rss: 67Mb L: 30/85 MS: 1 ChangeByte- 00:08:51.893 [2024-12-16 10:55:50.485136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.893 [2024-12-16 10:55:50.485163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.485226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.893 [2024-12-16 10:55:50.485243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.485294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.893 [2024-12-16 10:55:50.485310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.485366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.893 [2024-12-16 10:55:50.485382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.893 [2024-12-16 10:55:50.485426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:51.893 [2024-12-16 10:55:50.485441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 
cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.893 #64 NEW cov: 11892 ft: 15160 corp: 31/2164b lim: 85 exec/s: 64 rss: 67Mb L: 85/85 MS: 1 ChangeBinInt- 00:08:52.153 [2024-12-16 10:55:50.525135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.153 [2024-12-16 10:55:50.525161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.525213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.153 [2024-12-16 10:55:50.525229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.525281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.153 [2024-12-16 10:55:50.525297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.525351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.153 [2024-12-16 10:55:50.525365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.153 #65 NEW cov: 11892 ft: 15204 corp: 32/2244b lim: 85 exec/s: 65 rss: 68Mb L: 80/85 MS: 1 CopyPart- 00:08:52.153 [2024-12-16 10:55:50.565352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.153 [2024-12-16 10:55:50.565379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.565425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.153 [2024-12-16 10:55:50.565440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.565493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.153 [2024-12-16 10:55:50.565509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.565560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.153 [2024-12-16 10:55:50.565576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.565633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:52.153 [2024-12-16 10:55:50.565648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.153 #66 NEW cov: 11892 ft: 15270 corp: 33/2329b lim: 85 exec/s: 66 rss: 68Mb L: 85/85 MS: 1 ChangeBinInt- 00:08:52.153 [2024-12-16 10:55:50.605332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.153 [2024-12-16 10:55:50.605359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.605400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.153 [2024-12-16 10:55:50.605415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.605470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.153 [2024-12-16 10:55:50.605485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.605539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.153 [2024-12-16 10:55:50.605554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.153 #67 NEW cov: 11892 ft: 15296 corp: 34/2402b lim: 85 exec/s: 67 rss: 68Mb L: 73/85 MS: 1 InsertRepeatedBytes- 00:08:52.153 [2024-12-16 10:55:50.645446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.153 [2024-12-16 10:55:50.645473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.645512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.153 [2024-12-16 10:55:50.645528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.645581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.153 [2024-12-16 10:55:50.645595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.153 [2024-12-16 10:55:50.645652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.153 [2024-12-16 10:55:50.645667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.153 #68 NEW cov: 11892 ft: 15311 corp: 35/2485b lim: 85 exec/s: 68 rss: 68Mb L: 83/85 MS: 1 ChangeByte- 00:08:52.153 [2024-12-16 10:55:50.685557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.153 [2024-12-16 10:55:50.685585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.685648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.154 [2024-12-16 10:55:50.685665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.685717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.154 [2024-12-16 10:55:50.685732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.685786] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.154 [2024-12-16 10:55:50.685800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.154 #74 NEW cov: 11892 ft: 15323 corp: 36/2555b lim: 85 exec/s: 74 rss: 68Mb L: 70/85 MS: 1 EraseBytes- 00:08:52.154 [2024-12-16 10:55:50.725692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.154 [2024-12-16 10:55:50.725718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.725766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.154 [2024-12-16 10:55:50.725782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.725835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.154 [2024-12-16 10:55:50.725871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.725925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.154 [2024-12-16 10:55:50.725939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.154 #75 NEW cov: 11892 ft: 15339 corp: 37/2638b lim: 85 exec/s: 75 rss: 68Mb L: 83/85 MS: 1 ChangeBit- 00:08:52.154 [2024-12-16 10:55:50.765970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.154 [2024-12-16 10:55:50.765997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.766045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.154 [2024-12-16 10:55:50.766060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.766111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.154 [2024-12-16 10:55:50.766125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.766175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.154 [2024-12-16 10:55:50.766190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.154 [2024-12-16 10:55:50.766243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:52.154 [2024-12-16 10:55:50.766258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.414 #76 NEW cov: 11892 ft: 15349 corp: 38/2723b lim: 85 exec/s: 76 rss: 68Mb L: 85/85 MS: 1 CopyPart- 00:08:52.414 [2024-12-16 10:55:50.805479] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.414 [2024-12-16 10:55:50.805505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.414 #79 NEW cov: 11892 ft: 15355 corp: 39/2746b lim: 85 exec/s: 79 rss: 68Mb L: 23/85 MS: 3 InsertByte-ChangeASCIIInt-InsertRepeatedBytes- 00:08:52.414 [2024-12-16 10:55:50.846196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.414 [2024-12-16 10:55:50.846223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.846271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.414 [2024-12-16 10:55:50.846286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.846341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.414 [2024-12-16 10:55:50.846356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.846409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.414 [2024-12-16 10:55:50.846424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.846476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:52.414 [2024-12-16 10:55:50.846491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.414 #80 NEW cov: 11892 ft: 15372 corp: 40/2831b lim: 85 exec/s: 80 rss: 68Mb L: 85/85 MS: 1 ChangeByte- 00:08:52.414 [2024-12-16 10:55:50.886137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.414 [2024-12-16 10:55:50.886162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.886226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.414 [2024-12-16 10:55:50.886243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.886296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.414 [2024-12-16 10:55:50.886312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.886368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.414 [2024-12-16 10:55:50.886384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.414 #81 NEW cov: 11892 ft: 15386 corp: 41/2914b lim: 85 exec/s: 81 rss: 68Mb L: 83/85 MS: 1 ChangeBinInt- 
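Each "#N NEW cov:" record above is a libFuzzer status line: #N is the number of target executions so far, "cov" the count of instrumented code edges reached, "ft" the number of coverage features, "corp" the corpus size in units/bytes, "exec/s" the throughput, "L" the new input's length against the current limit, and "MS" the mutation sequence that produced it. A rough sketch for pulling the coverage curve out of a saved copy of this output (the fuzz.log filename is an assumption; the job itself only streams to the console):

    # sketch: print execution count, edge coverage, and feature count per find
    grep -oE '#[0-9]+ NEW cov: [0-9]+ ft: [0-9]+' fuzz.log |
        awk '{ sub("#", "", $1); print $1, $4, $6 }'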
00:08:52.414 [2024-12-16 10:55:50.925808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.414 [2024-12-16 10:55:50.925835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.414 #82 NEW cov: 11892 ft: 15414 corp: 42/2936b lim: 85 exec/s: 82 rss: 68Mb L: 22/85 MS: 1 EraseBytes- 00:08:52.414 [2024-12-16 10:55:50.966387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.414 [2024-12-16 10:55:50.966413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.966456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.414 [2024-12-16 10:55:50.966470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.966522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.414 [2024-12-16 10:55:50.966538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:50.966590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.414 [2024-12-16 10:55:50.966605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.414 #83 NEW cov: 11892 ft: 15421 corp: 43/3019b lim: 85 exec/s: 83 rss: 68Mb L: 83/85 MS: 1 ChangeByte- 00:08:52.414 [2024-12-16 10:55:51.006524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.414 [2024-12-16 10:55:51.006551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:51.006588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.414 [2024-12-16 10:55:51.006602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:51.006658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.414 [2024-12-16 10:55:51.006673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.414 [2024-12-16 10:55:51.006743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.414 [2024-12-16 10:55:51.006760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.414 #84 NEW cov: 11892 ft: 15427 corp: 44/3090b lim: 85 exec/s: 42 rss: 68Mb L: 71/85 MS: 1 ChangeByte- 00:08:52.414 #84 DONE cov: 11892 ft: 15427 corp: 44/3090b lim: 85 exec/s: 42 rss: 68Mb 00:08:52.414 ###### Recommended dictionary. ###### 00:08:52.414 "\001\004\000\000\000\000\000\000" # Uses: 3 00:08:52.414 ###### End of recommended dictionary. 
###### 00:08:52.414 Done 84 runs in 2 second(s) 00:08:52.674 10:55:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:52.674 10:55:51 -- ../common.sh@72 -- # (( i++ )) 00:08:52.674 10:55:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.674 10:55:51 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:52.674 10:55:51 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:52.674 10:55:51 -- nvmf/run.sh@24 -- # local timen=1 00:08:52.674 10:55:51 -- nvmf/run.sh@25 -- # local core=0x1 00:08:52.674 10:55:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:52.674 10:55:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:52.674 10:55:51 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:52.674 10:55:51 -- nvmf/run.sh@29 -- # port=4423 00:08:52.674 10:55:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:52.674 10:55:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:52.674 10:55:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:52.674 10:55:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:52.674 [2024-12-16 10:55:51.187532] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:52.674 [2024-12-16 10:55:51.187601] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid661277 ] 00:08:52.674 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.933 [2024-12-16 10:55:51.365152] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.933 [2024-12-16 10:55:51.385567] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:52.934 [2024-12-16 10:55:51.385714] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.934 [2024-12-16 10:55:51.436989] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:52.934 [2024-12-16 10:55:51.453325] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:52.934 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.934 INFO: Seed: 3036368526 00:08:52.934 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:52.934 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:52.934 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:52.934 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.934 #2 INITED exec/s: 0 rss: 59Mb 00:08:52.934 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:52.934 This may also happen if the target rejected all inputs we tried so far 00:08:52.934 [2024-12-16 10:55:51.498299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.934 [2024-12-16 10:55:51.498329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.193 NEW_FUNC[1/671]: 0x482b78 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:53.193 NEW_FUNC[2/671]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:53.193 #5 NEW cov: 11598 ft: 11599 corp: 2/6b lim: 25 exec/s: 0 rss: 65Mb L: 5/5 MS: 3 InsertRepeatedBytes-ChangeByte-CrossOver- 00:08:53.193 [2024-12-16 10:55:51.809363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.193 [2024-12-16 10:55:51.809394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.193 [2024-12-16 10:55:51.809468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.193 [2024-12-16 10:55:51.809484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.453 #8 NEW cov: 11711 ft: 12390 corp: 3/16b lim: 25 exec/s: 0 rss: 66Mb L: 10/10 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:53.453 [2024-12-16 10:55:51.849432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.453 [2024-12-16 10:55:51.849460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.453 [2024-12-16 10:55:51.849500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.453 [2024-12-16 10:55:51.849516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.453 #10 NEW cov: 11717 ft: 12660 corp: 4/29b lim: 25 exec/s: 0 rss: 67Mb L: 13/13 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:53.453 [2024-12-16 10:55:51.889432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.453 [2024-12-16 10:55:51.889459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.453 #16 NEW cov: 11802 ft: 12989 corp: 5/34b lim: 25 exec/s: 0 rss: 67Mb L: 5/13 MS: 1 EraseBytes- 00:08:53.453 [2024-12-16 10:55:51.929582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.453 [2024-12-16 10:55:51.929615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.453 #21 NEW cov: 11802 ft: 13213 corp: 6/40b lim: 25 exec/s: 0 rss: 67Mb L: 6/13 MS: 5 ChangeBit-ChangeBit-InsertByte-EraseBytes-CrossOver- 00:08:53.453 [2024-12-16 10:55:51.969765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.453 [2024-12-16 10:55:51.969792] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.453 [2024-12-16 10:55:51.969836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.453 [2024-12-16 10:55:51.969852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.453 #27 NEW cov: 11802 ft: 13319 corp: 7/53b lim: 25 exec/s: 0 rss: 67Mb L: 13/13 MS: 1 ShuffleBytes- 00:08:53.453 [2024-12-16 10:55:52.010006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.453 [2024-12-16 10:55:52.010033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.453 [2024-12-16 10:55:52.010072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.453 [2024-12-16 10:55:52.010087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.453 [2024-12-16 10:55:52.010142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.453 [2024-12-16 10:55:52.010156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.453 #28 NEW cov: 11802 ft: 13609 corp: 8/70b lim: 25 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 CopyPart- 00:08:53.453 [2024-12-16 10:55:52.049999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.453 [2024-12-16 10:55:52.050025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.453 [2024-12-16 10:55:52.050066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.453 [2024-12-16 10:55:52.050082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.453 #29 NEW cov: 11802 ft: 13622 corp: 9/83b lim: 25 exec/s: 0 rss: 67Mb L: 13/17 MS: 1 ChangeBinInt- 00:08:53.713 [2024-12-16 10:55:52.089951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.713 [2024-12-16 10:55:52.089978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.713 #34 NEW cov: 11802 ft: 13664 corp: 10/88b lim: 25 exec/s: 0 rss: 67Mb L: 5/17 MS: 5 EraseBytes-CrossOver-ChangeBinInt-CopyPart-InsertByte- 00:08:53.713 [2024-12-16 10:55:52.130364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.713 [2024-12-16 10:55:52.130391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.713 [2024-12-16 10:55:52.130430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.713 [2024-12-16 10:55:52.130446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.713 [2024-12-16 10:55:52.130500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.713 [2024-12-16 10:55:52.130516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.713 #35 NEW cov: 11802 ft: 13761 corp: 11/105b lim: 25 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 ChangeBit- 00:08:53.713 [2024-12-16 10:55:52.170359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.713 [2024-12-16 10:55:52.170386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.713 [2024-12-16 10:55:52.170425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.713 [2024-12-16 10:55:52.170440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.713 #36 NEW cov: 11802 ft: 13803 corp: 12/118b lim: 25 exec/s: 0 rss: 67Mb L: 13/17 MS: 1 ChangeByte- 00:08:53.713 [2024-12-16 10:55:52.200534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.713 [2024-12-16 10:55:52.200561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.713 [2024-12-16 10:55:52.200602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.713 [2024-12-16 10:55:52.200624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.713 [2024-12-16 10:55:52.200681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.713 [2024-12-16 10:55:52.200696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.713 #37 NEW cov: 11802 ft: 13847 corp: 13/135b lim: 25 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 ChangeBit- 00:08:53.713 [2024-12-16 10:55:52.240434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.713 [2024-12-16 10:55:52.240461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.713 #38 NEW cov: 11802 ft: 13866 corp: 14/140b lim: 25 exec/s: 0 rss: 67Mb L: 5/17 MS: 1 ChangeBinInt- 00:08:53.713 [2024-12-16 10:55:52.280904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.713 [2024-12-16 10:55:52.280931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.713 [2024-12-16 10:55:52.280982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.713 [2024-12-16 10:55:52.280998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.713 [2024-12-16 10:55:52.281053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.713 [2024-12-16 10:55:52.281069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
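The qpair notices above always come in pairs: nvme_qpair.c:256 logs the command the fuzzer submitted (here RESERVATION REPORT, opcode 0e, on submission queue sqid:1) and nvme_qpair.c:477 logs the controller's completion, where "(00/0b)" is status code type 00 (generic command status) with status code 0b (INVALID NAMESPACE OR FORMAT) and "dnr:1" sets the do-not-retry bit. A sketch for tallying which statuses the target handed back over a run, again assuming the output was saved to fuzz.log:

    # sketch: count completions per status string
    grep -oE '\*NOTICE\*: [A-Z ]+\([0-9a-f]{2}/[0-9a-f]{2}\)' fuzz.log |
        sort | uniq -c | sort -rn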
00:08:53.713 [2024-12-16 10:55:52.281123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:53.713 [2024-12-16 10:55:52.281139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.713 #44 NEW cov: 11802 ft: 14303 corp: 15/162b lim: 25 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:53.713 [2024-12-16 10:55:52.320666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.713 [2024-12-16 10:55:52.320692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.973 #45 NEW cov: 11802 ft: 14343 corp: 16/168b lim: 25 exec/s: 0 rss: 68Mb L: 6/22 MS: 1 ChangeASCIIInt- 00:08:53.973 [2024-12-16 10:55:52.360778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.973 [2024-12-16 10:55:52.360805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.973 #46 NEW cov: 11802 ft: 14443 corp: 17/175b lim: 25 exec/s: 0 rss: 68Mb L: 7/22 MS: 1 EraseBytes- 00:08:53.973 [2024-12-16 10:55:52.401011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.973 [2024-12-16 10:55:52.401038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.973 [2024-12-16 10:55:52.401078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.973 [2024-12-16 10:55:52.401094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.973 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:53.973 #47 NEW cov: 11825 ft: 14520 corp: 18/186b lim: 25 exec/s: 0 rss: 68Mb L: 11/22 MS: 1 EraseBytes- 00:08:53.973 [2024-12-16 10:55:52.451038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.973 [2024-12-16 10:55:52.451066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.973 #48 NEW cov: 11825 ft: 14534 corp: 19/192b lim: 25 exec/s: 0 rss: 68Mb L: 6/22 MS: 1 InsertByte- 00:08:53.973 [2024-12-16 10:55:52.491279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.973 [2024-12-16 10:55:52.491305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.973 [2024-12-16 10:55:52.491347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.973 [2024-12-16 10:55:52.491364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.973 #49 NEW cov: 11825 ft: 14634 corp: 20/203b lim: 25 exec/s: 49 rss: 68Mb L: 11/22 MS: 1 CrossOver- 00:08:53.973 [2024-12-16 10:55:52.531276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.973 [2024-12-16 10:55:52.531307] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.973 #50 NEW cov: 11825 ft: 14675 corp: 21/208b lim: 25 exec/s: 50 rss: 68Mb L: 5/22 MS: 1 ChangeASCIIInt- 00:08:53.973 [2024-12-16 10:55:52.571477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.973 [2024-12-16 10:55:52.571503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.973 [2024-12-16 10:55:52.571543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.973 [2024-12-16 10:55:52.571558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.973 #51 NEW cov: 11825 ft: 14692 corp: 22/218b lim: 25 exec/s: 51 rss: 68Mb L: 10/22 MS: 1 ChangeBinInt- 00:08:54.232 [2024-12-16 10:55:52.611576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.232 [2024-12-16 10:55:52.611604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.232 [2024-12-16 10:55:52.611650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.232 [2024-12-16 10:55:52.611666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.232 #52 NEW cov: 11825 ft: 14705 corp: 23/231b lim: 25 exec/s: 52 rss: 68Mb L: 13/22 MS: 1 ShuffleBytes- 00:08:54.232 [2024-12-16 10:55:52.651574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.232 [2024-12-16 10:55:52.651601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.232 #53 NEW cov: 11825 ft: 14739 corp: 24/238b lim: 25 exec/s: 53 rss: 68Mb L: 7/22 MS: 1 ChangeByte- 00:08:54.232 [2024-12-16 10:55:52.691741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.232 [2024-12-16 10:55:52.691768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.232 #54 NEW cov: 11825 ft: 14776 corp: 25/245b lim: 25 exec/s: 54 rss: 68Mb L: 7/22 MS: 1 InsertByte- 00:08:54.232 [2024-12-16 10:55:52.732233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.232 [2024-12-16 10:55:52.732260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.232 [2024-12-16 10:55:52.732297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.232 [2024-12-16 10:55:52.732312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.232 [2024-12-16 10:55:52.732364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.232 [2024-12-16 10:55:52.732380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.233 [2024-12-16 10:55:52.732436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.233 [2024-12-16 10:55:52.732451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.233 #55 NEW cov: 11825 ft: 14807 corp: 26/268b lim: 25 exec/s: 55 rss: 68Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:54.233 [2024-12-16 10:55:52.772086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.233 [2024-12-16 10:55:52.772112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.233 [2024-12-16 10:55:52.772173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.233 [2024-12-16 10:55:52.772188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.233 #56 NEW cov: 11825 ft: 14815 corp: 27/282b lim: 25 exec/s: 56 rss: 68Mb L: 14/23 MS: 1 InsertByte- 00:08:54.233 [2024-12-16 10:55:52.812065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.233 [2024-12-16 10:55:52.812093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.233 #57 NEW cov: 11825 ft: 14834 corp: 28/289b lim: 25 exec/s: 57 rss: 68Mb L: 7/23 MS: 1 InsertByte- 00:08:54.233 [2024-12-16 10:55:52.852158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.233 [2024-12-16 10:55:52.852185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.492 #58 NEW cov: 11825 ft: 14838 corp: 29/295b lim: 25 exec/s: 58 rss: 69Mb L: 6/23 MS: 1 EraseBytes- 00:08:54.492 [2024-12-16 10:55:52.892440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.492 [2024-12-16 10:55:52.892467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.492 [2024-12-16 10:55:52.892520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.492 [2024-12-16 10:55:52.892536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.492 #59 NEW cov: 11825 ft: 14879 corp: 30/307b lim: 25 exec/s: 59 rss: 69Mb L: 12/23 MS: 1 InsertByte- 00:08:54.492 [2024-12-16 10:55:52.932539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.492 [2024-12-16 10:55:52.932566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.492 [2024-12-16 10:55:52.932624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.492 [2024-12-16 10:55:52.932641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.492 #60 NEW cov: 11825 ft: 14890 corp: 31/321b lim: 25 exec/s: 60 
rss: 69Mb L: 14/23 MS: 1 InsertRepeatedBytes- 00:08:54.492 [2024-12-16 10:55:52.972780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.492 [2024-12-16 10:55:52.972806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.492 [2024-12-16 10:55:52.972853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.492 [2024-12-16 10:55:52.972868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.492 [2024-12-16 10:55:52.972923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.492 [2024-12-16 10:55:52.972938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.492 #61 NEW cov: 11825 ft: 14898 corp: 32/338b lim: 25 exec/s: 61 rss: 69Mb L: 17/23 MS: 1 ChangeBinInt- 00:08:54.492 [2024-12-16 10:55:53.012740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.492 [2024-12-16 10:55:53.012766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.493 [2024-12-16 10:55:53.012820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.493 [2024-12-16 10:55:53.012837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.493 #62 NEW cov: 11825 ft: 14930 corp: 33/348b lim: 25 exec/s: 62 rss: 69Mb L: 10/23 MS: 1 ChangeBinInt- 00:08:54.493 [2024-12-16 10:55:53.052901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.493 [2024-12-16 10:55:53.052926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.493 [2024-12-16 10:55:53.052982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.493 [2024-12-16 10:55:53.052998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.493 #63 NEW cov: 11825 ft: 14940 corp: 34/361b lim: 25 exec/s: 63 rss: 69Mb L: 13/23 MS: 1 CopyPart- 00:08:54.493 [2024-12-16 10:55:53.093109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.493 [2024-12-16 10:55:53.093136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.493 [2024-12-16 10:55:53.093185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.493 [2024-12-16 10:55:53.093200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.493 [2024-12-16 10:55:53.093256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.493 [2024-12-16 10:55:53.093271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.493 #64 NEW cov: 11825 ft: 14950 corp: 35/380b lim: 25 exec/s: 64 rss: 69Mb L: 19/23 MS: 1 InsertRepeatedBytes- 00:08:54.752 [2024-12-16 10:55:53.132974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.752 [2024-12-16 10:55:53.133001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.752 #65 NEW cov: 11825 ft: 14960 corp: 36/385b lim: 25 exec/s: 65 rss: 69Mb L: 5/23 MS: 1 ChangeBit- 00:08:54.752 [2024-12-16 10:55:53.173475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.752 [2024-12-16 10:55:53.173502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.752 [2024-12-16 10:55:53.173552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.752 [2024-12-16 10:55:53.173567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.752 [2024-12-16 10:55:53.173641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.752 [2024-12-16 10:55:53.173657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.752 [2024-12-16 10:55:53.173712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.752 [2024-12-16 10:55:53.173727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.752 #66 NEW cov: 11825 ft: 14973 corp: 37/405b lim: 25 exec/s: 66 rss: 69Mb L: 20/23 MS: 1 InsertRepeatedBytes- 00:08:54.752 [2024-12-16 10:55:53.213243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.752 [2024-12-16 10:55:53.213270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.752 #67 NEW cov: 11825 ft: 14998 corp: 38/412b lim: 25 exec/s: 67 rss: 69Mb L: 7/23 MS: 1 ChangeBinInt- 00:08:54.752 [2024-12-16 10:55:53.253338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.752 [2024-12-16 10:55:53.253365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.752 #68 NEW cov: 11825 ft: 15001 corp: 39/419b lim: 25 exec/s: 68 rss: 69Mb L: 7/23 MS: 1 ChangeByte- 00:08:54.752 [2024-12-16 10:55:53.293603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.752 [2024-12-16 10:55:53.293633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.752 [2024-12-16 10:55:53.293676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.752 [2024-12-16 10:55:53.293691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.752 #69 NEW cov: 11825 ft: 15080 corp: 40/431b lim: 25 
exec/s: 69 rss: 69Mb L: 12/23 MS: 1 ChangeBit- 00:08:54.752 [2024-12-16 10:55:53.333914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.752 [2024-12-16 10:55:53.333940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.752 [2024-12-16 10:55:53.333990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.752 [2024-12-16 10:55:53.334006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.752 [2024-12-16 10:55:53.334063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.752 [2024-12-16 10:55:53.334079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.752 [2024-12-16 10:55:53.334137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.752 [2024-12-16 10:55:53.334151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.752 #70 NEW cov: 11825 ft: 15094 corp: 41/455b lim: 25 exec/s: 70 rss: 69Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:54.752 [2024-12-16 10:55:53.373691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.752 [2024-12-16 10:55:53.373719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.012 #71 NEW cov: 11825 ft: 15117 corp: 42/462b lim: 25 exec/s: 71 rss: 69Mb L: 7/24 MS: 1 ChangeBit- 00:08:55.012 [2024-12-16 10:55:53.414009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.012 [2024-12-16 10:55:53.414036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.414076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.012 [2024-12-16 10:55:53.414093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.414148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.012 [2024-12-16 10:55:53.414163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.012 #72 NEW cov: 11825 ft: 15126 corp: 43/478b lim: 25 exec/s: 72 rss: 69Mb L: 16/24 MS: 1 EraseBytes- 00:08:55.012 [2024-12-16 10:55:53.454154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.012 [2024-12-16 10:55:53.454182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.454221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.012 [2024-12-16 10:55:53.454236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.454296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.012 [2024-12-16 10:55:53.454329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.012 #73 NEW cov: 11825 ft: 15201 corp: 44/495b lim: 25 exec/s: 73 rss: 70Mb L: 17/24 MS: 1 ChangeByte- 00:08:55.012 [2024-12-16 10:55:53.494340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.012 [2024-12-16 10:55:53.494368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.494414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.012 [2024-12-16 10:55:53.494431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.494487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.012 [2024-12-16 10:55:53.494502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.494560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.012 [2024-12-16 10:55:53.494577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.534466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.012 [2024-12-16 10:55:53.534493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.534560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.012 [2024-12-16 10:55:53.534576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.534639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.012 [2024-12-16 10:55:53.534654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.012 [2024-12-16 10:55:53.534712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.012 [2024-12-16 10:55:53.534727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.012 #75 NEW cov: 11825 ft: 15204 corp: 45/516b lim: 25 exec/s: 37 rss: 70Mb L: 21/24 MS: 2 InsertRepeatedBytes-ChangeByte- 00:08:55.012 #75 DONE cov: 11825 ft: 15204 corp: 45/516b lim: 25 exec/s: 37 rss: 70Mb 00:08:55.012 Done 75 runs in 2 second(s) 00:08:55.271 10:55:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:55.271 10:55:53 -- ../common.sh@72 -- # (( i++ )) 00:08:55.271 10:55:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.271 10:55:53 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 
00:08:55.271 10:55:53 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:55.271 10:55:53 -- nvmf/run.sh@24 -- # local timen=1 00:08:55.271 10:55:53 -- nvmf/run.sh@25 -- # local core=0x1 00:08:55.271 10:55:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:55.271 10:55:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:55.271 10:55:53 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:55.271 10:55:53 -- nvmf/run.sh@29 -- # port=4424 00:08:55.271 10:55:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:55.271 10:55:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:55.271 10:55:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:55.271 10:55:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:55.271 [2024-12-16 10:55:53.709286] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:55.271 [2024-12-16 10:55:53.709370] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid661669 ] 00:08:55.271 EAL: No free 2048 kB hugepages reported on node 1 00:08:55.530 [2024-12-16 10:55:53.895965] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.530 [2024-12-16 10:55:53.916202] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:55.530 [2024-12-16 10:55:53.916330] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.530 [2024-12-16 10:55:53.967843] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:55.530 [2024-12-16 10:55:53.984168] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:55.530 INFO: Running with entropic power schedule (0xFF, 100). 00:08:55.530 INFO: Seed: 1273417879 00:08:55.530 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x26bcd8c, 0x2710fd5), 00:08:55.530 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x2710fd8,0x2c53468), 00:08:55.530 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:55.530 INFO: A corpus is not provided, starting from an empty corpus 00:08:55.530 #2 INITED exec/s: 0 rss: 59Mb 00:08:55.530 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
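The xtrace records above show how run.sh provisions each fuzzer instance: the loop index (24 here) is padded into a unique NVMe/TCP port, the trsvcid in the shared JSON config template is rewritten to that port, and the resulting transport ID, corpus directory, and RPC socket are handed to llvm_nvme_fuzz. A minimal sketch of that per-run setup, with SPDK_DIR standing in for the long workspace path and the redirection of sed into the config file treated as an assumption (xtrace does not show redirections):

    i=24
    port="44$(printf %02d "$i")"                 # run 24 listens on 4424
    nvmf_cfg=/tmp/fuzz_json_$i.conf
    corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$i"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$corpus_dir"
    # retarget the JSON config template at this run's TCP port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$SPDK_DIR/../output/llvm/" -F "$trid" -c "$nvmf_cfg" \
        -t 1 -D "$corpus_dir" -Z "$i" -r "/var/tmp/spdk$i.sock"

Giving every run its own trsvcid (4423 for run 23, 4424 for run 24, as the listen notices above show) keeps successive fuzzer targets from colliding on the same TCP listener.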
00:08:55.530 This may also happen if the target rejected all inputs we tried so far 00:08:55.530 [2024-12-16 10:55:54.029557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.530 [2024-12-16 10:55:54.029588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.530 [2024-12-16 10:55:54.029649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.530 [2024-12-16 10:55:54.029666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.530 [2024-12-16 10:55:54.029716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.530 [2024-12-16 10:55:54.029732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.530 [2024-12-16 10:55:54.029784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.530 [2024-12-16 10:55:54.029800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.790 NEW_FUNC[1/672]: 0x483c68 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:55.790 NEW_FUNC[2/672]: 0x4948e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:55.790 #11 NEW cov: 11644 ft: 11671 corp: 2/85b lim: 100 exec/s: 0 rss: 66Mb L: 84/84 MS: 4 InsertByte-EraseBytes-CopyPart-InsertRepeatedBytes- 00:08:55.790 [2024-12-16 10:55:54.341261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:134217728 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.790 [2024-12-16 10:55:54.341327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.790 [2024-12-16 10:55:54.341483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.790 [2024-12-16 10:55:54.341518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.790 [2024-12-16 10:55:54.341666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.790 [2024-12-16 10:55:54.341712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.790 #15 NEW cov: 11783 ft: 12716 corp: 3/155b lim: 100 exec/s: 0 rss: 66Mb L: 70/84 MS: 4 ChangeBit-CopyPart-CMP-InsertRepeatedBytes- DE: "\001\000\000\000\000\000\000\000"- 00:08:55.790 [2024-12-16 10:55:54.391345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.790 [2024-12-16 10:55:54.391380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:55.790 [2024-12-16 10:55:54.391480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.790 [2024-12-16 10:55:54.391503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.790 [2024-12-16 10:55:54.391624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.790 [2024-12-16 10:55:54.391646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.790 [2024-12-16 10:55:54.391761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.790 [2024-12-16 10:55:54.391795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.049 #16 NEW cov: 11789 ft: 12976 corp: 4/254b lim: 100 exec/s: 0 rss: 66Mb L: 99/99 MS: 1 CopyPart- 00:08:56.049 [2024-12-16 10:55:54.441523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.049 [2024-12-16 10:55:54.441555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.049 [2024-12-16 10:55:54.441680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.049 [2024-12-16 10:55:54.441701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.441818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.441839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.441927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.441948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.050 #17 NEW cov: 11874 ft: 13333 corp: 5/353b lim: 100 exec/s: 0 rss: 66Mb L: 99/99 MS: 1 ChangeBit- 00:08:56.050 [2024-12-16 10:55:54.491500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.491533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.491662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.491686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.491802] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 
len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.491821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.050 #18 NEW cov: 11874 ft: 13429 corp: 6/431b lim: 100 exec/s: 0 rss: 66Mb L: 78/99 MS: 1 EraseBytes- 00:08:56.050 [2024-12-16 10:55:54.531252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.531279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.531395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.531417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.050 #19 NEW cov: 11874 ft: 13819 corp: 7/482b lim: 100 exec/s: 0 rss: 66Mb L: 51/99 MS: 1 CrossOver- 00:08:56.050 [2024-12-16 10:55:54.571850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.571878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.571960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.571985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.572091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.572113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.572233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.572253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.050 #20 NEW cov: 11874 ft: 13877 corp: 8/570b lim: 100 exec/s: 0 rss: 66Mb L: 88/99 MS: 1 EraseBytes- 00:08:56.050 [2024-12-16 10:55:54.611792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:134217728 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.611821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.611941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.611974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.612090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.612110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.050 #21 NEW cov: 11874 ft: 13912 corp: 9/640b lim: 100 exec/s: 0 rss: 66Mb L: 70/99 MS: 1 ChangeBit- 00:08:56.050 [2024-12-16 10:55:54.652146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.652177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.652296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.652321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.652434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.652453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.050 [2024-12-16 10:55:54.652579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.050 [2024-12-16 10:55:54.652601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.310 #22 NEW cov: 11874 ft: 13964 corp: 10/728b lim: 100 exec/s: 0 rss: 66Mb L: 88/99 MS: 1 ShuffleBytes- 00:08:56.310 [2024-12-16 10:55:54.692459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.692489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.692577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.692596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.692717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.692740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.692851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.692873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.692988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.693007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:56.310 #23 NEW cov: 11874 ft: 14106 corp: 11/828b lim: 100 exec/s: 0 rss: 66Mb L: 100/100 MS: 1 CrossOver- 00:08:56.310 [2024-12-16 10:55:54.732371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.732402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.732516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.732536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.732669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.732689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.732811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.732839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.310 #24 NEW cov: 11874 ft: 14145 corp: 12/916b lim: 100 exec/s: 0 rss: 66Mb L: 88/100 MS: 1 CrossOver- 00:08:56.310 [2024-12-16 10:55:54.772582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.772615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.772736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.772759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.772875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.772899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.773016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.773037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.310 #25 NEW cov: 11874 ft: 14162 corp: 13/1015b lim: 100 exec/s: 0 rss: 66Mb L: 99/100 MS: 1 ChangeByte- 00:08:56.310 [2024-12-16 10:55:54.812604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.812637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:56.310 [2024-12-16 10:55:54.812716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.812737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.812849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.812867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.812984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:88 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.813004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.310 #26 NEW cov: 11874 ft: 14174 corp: 14/1103b lim: 100 exec/s: 0 rss: 66Mb L: 88/100 MS: 1 ChangeBinInt- 00:08:56.310 [2024-12-16 10:55:54.852749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.852780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.852874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.852891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.853003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:128 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.853023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.853146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.853169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.310 #27 NEW cov: 11874 ft: 14268 corp: 15/1202b lim: 100 exec/s: 0 rss: 66Mb L: 99/100 MS: 1 ChangeBit- 00:08:56.310 [2024-12-16 10:55:54.893156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.893189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.893267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.893287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.893384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.893406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.893527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.893550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.310 [2024-12-16 10:55:54.893678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:16318464 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.310 [2024-12-16 10:55:54.893699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:56.310 NEW_FUNC[1/1]: 0x196c168 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:56.310 #28 NEW cov: 11897 ft: 14336 corp: 16/1302b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 ChangeBinInt- 00:08:56.570 [2024-12-16 10:55:54.943380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.943411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:54.943478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2048 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.943500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:54.943623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.943644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:54.943753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.943775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:54.943900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.943925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:56.570 #29 NEW cov: 11897 ft: 14353 corp: 17/1402b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 ChangeBit- 00:08:56.570 [2024-12-16 10:55:54.983363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.983402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:54.983488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.983509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:54.983630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.983659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:54.983778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:72057594037927936 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.983797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:54.983918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:54.983935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:56.570 #30 NEW cov: 11897 ft: 14362 corp: 18/1502b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:56.570 [2024-12-16 10:55:55.023323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.023354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.023445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8589934592 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.023470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.023585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.023605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.023728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.023754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.570 #31 NEW cov: 11897 ft: 14397 corp: 19/1601b lim: 100 exec/s: 31 rss: 67Mb L: 99/100 MS: 1 ChangeBit- 00:08:56.570 [2024-12-16 10:55:55.073412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.073448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.073550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.073573] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.073696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.073721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.073845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.073868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.570 #32 NEW cov: 11897 ft: 14435 corp: 20/1690b lim: 100 exec/s: 32 rss: 67Mb L: 89/100 MS: 1 InsertByte- 00:08:56.570 [2024-12-16 10:55:55.123915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.123947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.124044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2048 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.124066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.124187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.124213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.124335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:50 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.124355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.124475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.124496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:56.570 #33 NEW cov: 11897 ft: 14501 corp: 21/1790b lim: 100 exec/s: 33 rss: 67Mb L: 100/100 MS: 1 ChangeByte- 00:08:56.570 [2024-12-16 10:55:55.174102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.174140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.174254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8589934592 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.174293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:56.570 [2024-12-16 10:55:55.174413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.174437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.174556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.174580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.570 [2024-12-16 10:55:55.174704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.570 [2024-12-16 10:55:55.174725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:56.829 #34 NEW cov: 11897 ft: 14541 corp: 22/1890b lim: 100 exec/s: 34 rss: 67Mb L: 100/100 MS: 1 InsertByte- 00:08:56.829 [2024-12-16 10:55:55.224261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.224296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.224382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.224402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.224513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.224533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.224649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16777216 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.224672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.224798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.224822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:56.829 #35 NEW cov: 11897 ft: 14547 corp: 23/1990b lim: 100 exec/s: 35 rss: 67Mb L: 100/100 MS: 1 CopyPart- 00:08:56.829 [2024-12-16 10:55:55.273675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.273702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.273822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 
len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.273845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.829 #36 NEW cov: 11897 ft: 14562 corp: 24/2040b lim: 100 exec/s: 36 rss: 67Mb L: 50/100 MS: 1 EraseBytes- 00:08:56.829 [2024-12-16 10:55:55.324028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:134217728 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.324076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.324170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.324193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.324312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.324332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.829 #37 NEW cov: 11897 ft: 14584 corp: 25/2110b lim: 100 exec/s: 37 rss: 67Mb L: 70/100 MS: 1 CopyPart- 00:08:56.829 [2024-12-16 10:55:55.373936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.373968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.374080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.374102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.829 #38 NEW cov: 11897 ft: 14595 corp: 26/2160b lim: 100 exec/s: 38 rss: 67Mb L: 50/100 MS: 1 ChangeByte- 00:08:56.829 [2024-12-16 10:55:55.424601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.424642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.424765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.424789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.424908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.424932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.829 [2024-12-16 10:55:55.425057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8192 len:1 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:56.829 [2024-12-16 10:55:55.425079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.829 #39 NEW cov: 11897 ft: 14605 corp: 27/2254b lim: 100 exec/s: 39 rss: 67Mb L: 94/100 MS: 1 CopyPart- 00:08:57.088 [2024-12-16 10:55:55.464745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.088 [2024-12-16 10:55:55.464776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.464878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.464901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.465014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.465037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.465161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5997452019571425028 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.465184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.089 #40 NEW cov: 11897 ft: 14619 corp: 28/2353b lim: 100 exec/s: 40 rss: 67Mb L: 99/100 MS: 1 CMP- DE: "\377\004S;:\326\272D"- 00:08:57.089 [2024-12-16 10:55:55.504875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.504907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.504995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.505018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.505132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:131072 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.505155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.505290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5997452019571425028 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.505315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.089 #41 NEW cov: 11897 ft: 14630 corp: 29/2452b lim: 100 exec/s: 41 rss: 67Mb L: 99/100 MS: 1 CMP- DE: "\002\000"- 00:08:57.089 [2024-12-16 10:55:55.554873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 
cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.554906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.555029] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.555049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.555169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:128 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.555194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.555313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.555336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.089 #42 NEW cov: 11897 ft: 14650 corp: 30/2551b lim: 100 exec/s: 42 rss: 68Mb L: 99/100 MS: 1 ChangeBit- 00:08:57.089 [2024-12-16 10:55:55.594987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.595021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.595128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.595149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.595265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.595287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.595401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:88 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.595423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.089 #43 NEW cov: 11897 ft: 14704 corp: 31/2647b lim: 100 exec/s: 43 rss: 68Mb L: 96/100 MS: 1 PersAutoDict- DE: "\377\004S;:\326\272D"- 00:08:57.089 [2024-12-16 10:55:55.635128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.635163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.635279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.635303] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.635421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967296 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.635444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.635566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.635585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.089 #44 NEW cov: 11897 ft: 14727 corp: 32/2734b lim: 100 exec/s: 44 rss: 68Mb L: 87/100 MS: 1 EraseBytes- 00:08:57.089 [2024-12-16 10:55:55.675103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.675132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.675245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:85761906966528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.675269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.089 [2024-12-16 10:55:55.675385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.089 [2024-12-16 10:55:55.675409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.089 #45 NEW cov: 11897 ft: 14746 corp: 33/2812b lim: 100 exec/s: 45 rss: 68Mb L: 78/100 MS: 1 ChangeBinInt- 00:08:57.348 [2024-12-16 10:55:55.715692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.348 [2024-12-16 10:55:55.715726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.348 [2024-12-16 10:55:55.715827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.348 [2024-12-16 10:55:55.715853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.348 [2024-12-16 10:55:55.715971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.348 [2024-12-16 10:55:55.715993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.348 [2024-12-16 10:55:55.716120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5997452019571425028 len:17409 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.348 [2024-12-16 10:55:55.716145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.348 [2024-12-16 10:55:55.716270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.348 [2024-12-16 10:55:55.716294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:57.348 #51 NEW cov: 11897 ft: 14763 corp: 34/2912b lim: 100 exec/s: 51 rss: 68Mb L: 100/100 MS: 1 InsertByte- 00:08:57.348 [2024-12-16 10:55:55.755562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.348 [2024-12-16 10:55:55.755595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.348 [2024-12-16 10:55:55.755691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.348 [2024-12-16 10:55:55.755712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.348 [2024-12-16 10:55:55.755824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.755851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.755966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:88 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.755989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.349 #52 NEW cov: 11897 ft: 14784 corp: 35/3000b lim: 100 exec/s: 52 rss: 68Mb L: 88/100 MS: 1 ShuffleBytes- 00:08:57.349 [2024-12-16 10:55:55.795798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.795827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.795912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.795933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.796045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.796066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.796178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.796200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.796314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE 
sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.796337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:57.349 #53 NEW cov: 11897 ft: 14788 corp: 36/3100b lim: 100 exec/s: 53 rss: 68Mb L: 100/100 MS: 1 InsertByte- 00:08:57.349 [2024-12-16 10:55:55.835512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:134217728 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.835542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.835641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.835675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.835791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.835810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.349 #54 NEW cov: 11897 ft: 14799 corp: 37/3170b lim: 100 exec/s: 54 rss: 68Mb L: 70/100 MS: 1 ShuffleBytes- 00:08:57.349 [2024-12-16 10:55:55.876122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.876151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.876241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.876265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.876389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.876407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.876520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5997452019571425028 len:17409 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.876541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.876674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.876694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:57.349 #55 NEW cov: 11897 ft: 14879 corp: 38/3270b lim: 100 exec/s: 55 rss: 68Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:57.349 [2024-12-16 10:55:55.916203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.916235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.916325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2048 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.916347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.916468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.916489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.916607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9007199254740992 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.916627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.916750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.916775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:57.349 #56 NEW cov: 11897 ft: 14907 corp: 39/3370b lim: 100 exec/s: 56 rss: 68Mb L: 100/100 MS: 1 CrossOver- 00:08:57.349 [2024-12-16 10:55:55.955857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:134217728 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.955888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.956002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.956035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.349 [2024-12-16 10:55:55.956151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.349 [2024-12-16 10:55:55.956175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.607 #57 NEW cov: 11897 ft: 14911 corp: 40/3440b lim: 100 exec/s: 57 rss: 68Mb L: 70/100 MS: 1 ChangeBit- 00:08:57.607 [2024-12-16 10:55:55.995755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.607 [2024-12-16 10:55:55.995790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.607 [2024-12-16 10:55:55.995912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.607 [2024-12-16 10:55:55.995938] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.607 #58 NEW cov: 11897 ft: 14917 corp: 41/3490b lim: 100 exec/s: 29 rss: 68Mb L: 50/100 MS: 1 ChangeBit- 00:08:57.607 #58 DONE cov: 11897 ft: 14917 corp: 41/3490b lim: 100 exec/s: 29 rss: 68Mb 00:08:57.608 ###### Recommended dictionary. ###### 00:08:57.608 "\001\000\000\000\000\000\000\000" # Uses: 2 00:08:57.608 "\377\004S;:\326\272D" # Uses: 2 00:08:57.608 "\002\000" # Uses: 0 00:08:57.608 ###### End of recommended dictionary. ###### 00:08:57.608 Done 58 runs in 2 second(s) 00:08:57.608 10:55:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:57.608 10:55:56 -- ../common.sh@72 -- # (( i++ )) 00:08:57.608 10:55:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:57.608 10:55:56 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:57.608 00:08:57.608 real 1m2.366s 00:08:57.608 user 1m39.124s 00:08:57.608 sys 0m6.828s 00:08:57.608 10:55:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:57.608 10:55:56 -- common/autotest_common.sh@10 -- # set +x 00:08:57.608 ************************************ 00:08:57.608 END TEST nvmf_fuzz 00:08:57.608 ************************************ 00:08:57.608 10:55:56 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:57.608 10:55:56 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:57.608 10:55:56 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:57.608 10:55:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:57.608 10:55:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:57.608 10:55:56 -- common/autotest_common.sh@10 -- # set +x 00:08:57.608 ************************************ 00:08:57.608 START TEST vfio_fuzz 00:08:57.608 ************************************ 00:08:57.608 10:55:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:57.869 * Looking for test storage... 00:08:57.869 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:57.869 10:55:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:57.869 10:55:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:57.869 10:55:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:57.869 10:55:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:57.869 10:55:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:57.869 10:55:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:57.869 10:55:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:57.869 10:55:56 -- scripts/common.sh@335 -- # IFS=.-: 00:08:57.869 10:55:56 -- scripts/common.sh@335 -- # read -ra ver1 00:08:57.869 10:55:56 -- scripts/common.sh@336 -- # IFS=.-: 00:08:57.869 10:55:56 -- scripts/common.sh@336 -- # read -ra ver2 00:08:57.869 10:55:56 -- scripts/common.sh@337 -- # local 'op=<' 00:08:57.869 10:55:56 -- scripts/common.sh@339 -- # ver1_l=2 00:08:57.869 10:55:56 -- scripts/common.sh@340 -- # ver2_l=1 00:08:57.869 10:55:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:57.869 10:55:56 -- scripts/common.sh@343 -- # case "$op" in 00:08:57.869 10:55:56 -- scripts/common.sh@344 -- # : 1 00:08:57.869 10:55:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:57.869 10:55:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:57.869 10:55:56 -- scripts/common.sh@364 -- # decimal 1 00:08:57.869 10:55:56 -- scripts/common.sh@352 -- # local d=1 00:08:57.869 10:55:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:57.869 10:55:56 -- scripts/common.sh@354 -- # echo 1 00:08:57.870 10:55:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:57.870 10:55:56 -- scripts/common.sh@365 -- # decimal 2 00:08:57.870 10:55:56 -- scripts/common.sh@352 -- # local d=2 00:08:57.870 10:55:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:57.870 10:55:56 -- scripts/common.sh@354 -- # echo 2 00:08:57.870 10:55:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:57.870 10:55:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:57.870 10:55:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:57.870 10:55:56 -- scripts/common.sh@367 -- # return 0 00:08:57.870 10:55:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:57.870 10:55:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:57.870 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:57.870 --rc genhtml_branch_coverage=1 00:08:57.870 --rc genhtml_function_coverage=1 00:08:57.870 --rc genhtml_legend=1 00:08:57.870 --rc geninfo_all_blocks=1 00:08:57.870 --rc geninfo_unexecuted_blocks=1 00:08:57.870 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:57.870 ' 00:08:57.870 10:55:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:57.870 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:57.870 --rc genhtml_branch_coverage=1 00:08:57.870 --rc genhtml_function_coverage=1 00:08:57.870 --rc genhtml_legend=1 00:08:57.870 --rc geninfo_all_blocks=1 00:08:57.870 --rc geninfo_unexecuted_blocks=1 00:08:57.870 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:57.870 ' 00:08:57.870 10:55:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:57.870 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:57.870 --rc genhtml_branch_coverage=1 00:08:57.870 --rc genhtml_function_coverage=1 00:08:57.870 --rc genhtml_legend=1 00:08:57.870 --rc geninfo_all_blocks=1 00:08:57.870 --rc geninfo_unexecuted_blocks=1 00:08:57.870 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:57.870 ' 00:08:57.870 10:55:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:57.870 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:57.870 --rc genhtml_branch_coverage=1 00:08:57.870 --rc genhtml_function_coverage=1 00:08:57.870 --rc genhtml_legend=1 00:08:57.870 --rc geninfo_all_blocks=1 00:08:57.870 --rc geninfo_unexecuted_blocks=1 00:08:57.870 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:57.870 ' 00:08:57.870 10:55:56 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:57.870 10:55:56 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:57.870 10:55:56 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:57.870 10:55:56 -- common/autotest_common.sh@34 -- # set -e 00:08:57.870 10:55:56 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:57.870 10:55:56 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:57.870 10:55:56 -- common/autotest_common.sh@38 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:57.870 10:55:56 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:57.870 10:55:56 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:57.870 10:55:56 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:57.870 10:55:56 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:57.870 10:55:56 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:57.870 10:55:56 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:57.870 10:55:56 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:57.870 10:55:56 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:57.870 10:55:56 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:57.870 10:55:56 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:57.870 10:55:56 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:57.870 10:55:56 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:57.870 10:55:56 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:57.870 10:55:56 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:57.870 10:55:56 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:57.870 10:55:56 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:57.870 10:55:56 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:57.870 10:55:56 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:57.870 10:55:56 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:57.870 10:55:56 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:57.870 10:55:56 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:57.870 10:55:56 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:57.870 10:55:56 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:57.870 10:55:56 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:57.870 10:55:56 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:57.870 10:55:56 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:57.870 10:55:56 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:57.870 10:55:56 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:57.870 10:55:56 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:57.870 10:55:56 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:57.870 10:55:56 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:57.870 10:55:56 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:57.870 10:55:56 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:57.870 10:55:56 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:57.870 10:55:56 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:57.870 10:55:56 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:57.870 10:55:56 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:57.870 10:55:56 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:57.870 10:55:56 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:57.870 10:55:56 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:57.870 10:55:56 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:57.870 10:55:56 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:57.870 
10:55:56 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:57.870 10:55:56 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:57.870 10:55:56 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:57.870 10:55:56 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:57.870 10:55:56 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:57.870 10:55:56 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:57.870 10:55:56 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:57.870 10:55:56 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:57.870 10:55:56 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:57.870 10:55:56 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:57.870 10:55:56 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:57.870 10:55:56 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:57.870 10:55:56 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:57.870 10:55:56 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:57.870 10:55:56 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:57.870 10:55:56 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:57.870 10:55:56 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:57.870 10:55:56 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:57.870 10:55:56 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:57.870 10:55:56 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:57.870 10:55:56 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:57.870 10:55:56 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:57.870 10:55:56 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:57.870 10:55:56 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:57.870 10:55:56 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:57.870 10:55:56 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:57.870 10:55:56 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:57.870 10:55:56 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:57.870 10:55:56 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:57.870 10:55:56 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:57.870 10:55:56 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:57.870 10:55:56 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:57.870 10:55:56 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:57.870 10:55:56 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:57.870 10:55:56 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:57.870 10:55:56 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:57.870 10:55:56 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:57.870 10:55:56 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:57.870 10:55:56 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:57.870 10:55:56 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:57.870 10:55:56 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:57.870 10:55:56 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:57.870 10:55:56 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 
00:08:57.870 10:55:56 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:57.870 10:55:56 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:57.870 10:55:56 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:57.870 10:55:56 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:57.870 10:55:56 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:57.870 10:55:56 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:57.870 10:55:56 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:57.870 10:55:56 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:57.870 10:55:56 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:57.870 10:55:56 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:57.870 10:55:56 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:57.870 #define SPDK_CONFIG_H 00:08:57.870 #define SPDK_CONFIG_APPS 1 00:08:57.870 #define SPDK_CONFIG_ARCH native 00:08:57.870 #undef SPDK_CONFIG_ASAN 00:08:57.870 #undef SPDK_CONFIG_AVAHI 00:08:57.870 #undef SPDK_CONFIG_CET 00:08:57.871 #define SPDK_CONFIG_COVERAGE 1 00:08:57.871 #define SPDK_CONFIG_CROSS_PREFIX 00:08:57.871 #undef SPDK_CONFIG_CRYPTO 00:08:57.871 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:57.871 #undef SPDK_CONFIG_CUSTOMOCF 00:08:57.871 #undef SPDK_CONFIG_DAOS 00:08:57.871 #define SPDK_CONFIG_DAOS_DIR 00:08:57.871 #define SPDK_CONFIG_DEBUG 1 00:08:57.871 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:57.871 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:57.871 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:57.871 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:57.871 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:57.871 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:57.871 #define SPDK_CONFIG_EXAMPLES 1 00:08:57.871 #undef SPDK_CONFIG_FC 00:08:57.871 #define SPDK_CONFIG_FC_PATH 00:08:57.871 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:57.871 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:57.871 #undef SPDK_CONFIG_FUSE 00:08:57.871 #define SPDK_CONFIG_FUZZER 1 00:08:57.871 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:57.871 #undef SPDK_CONFIG_GOLANG 00:08:57.871 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:57.871 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:57.871 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:57.871 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:57.871 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:57.871 #define SPDK_CONFIG_IDXD 1 00:08:57.871 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:57.871 #undef SPDK_CONFIG_IPSEC_MB 00:08:57.871 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:57.871 #define SPDK_CONFIG_ISAL 1 00:08:57.871 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:57.871 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:57.871 #define SPDK_CONFIG_LIBDIR 00:08:57.871 #undef SPDK_CONFIG_LTO 00:08:57.871 #define SPDK_CONFIG_MAX_LCORES 00:08:57.871 #define SPDK_CONFIG_NVME_CUSE 1 00:08:57.871 #undef SPDK_CONFIG_OCF 00:08:57.871 #define SPDK_CONFIG_OCF_PATH 00:08:57.871 #define 
SPDK_CONFIG_OPENSSL_PATH 00:08:57.871 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:57.871 #undef SPDK_CONFIG_PGO_USE 00:08:57.871 #define SPDK_CONFIG_PREFIX /usr/local 00:08:57.871 #undef SPDK_CONFIG_RAID5F 00:08:57.871 #undef SPDK_CONFIG_RBD 00:08:57.871 #define SPDK_CONFIG_RDMA 1 00:08:57.871 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:57.871 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:57.871 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:57.871 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:57.871 #undef SPDK_CONFIG_SHARED 00:08:57.871 #undef SPDK_CONFIG_SMA 00:08:57.871 #define SPDK_CONFIG_TESTS 1 00:08:57.871 #undef SPDK_CONFIG_TSAN 00:08:57.871 #define SPDK_CONFIG_UBLK 1 00:08:57.871 #define SPDK_CONFIG_UBSAN 1 00:08:57.871 #undef SPDK_CONFIG_UNIT_TESTS 00:08:57.871 #undef SPDK_CONFIG_URING 00:08:57.871 #define SPDK_CONFIG_URING_PATH 00:08:57.871 #undef SPDK_CONFIG_URING_ZNS 00:08:57.871 #undef SPDK_CONFIG_USDT 00:08:57.871 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:57.871 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:57.871 #define SPDK_CONFIG_VFIO_USER 1 00:08:57.871 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:57.871 #define SPDK_CONFIG_VHOST 1 00:08:57.871 #define SPDK_CONFIG_VIRTIO 1 00:08:57.871 #undef SPDK_CONFIG_VTUNE 00:08:57.871 #define SPDK_CONFIG_VTUNE_DIR 00:08:57.871 #define SPDK_CONFIG_WERROR 1 00:08:57.871 #define SPDK_CONFIG_WPDK_DIR 00:08:57.871 #undef SPDK_CONFIG_XNVME 00:08:57.871 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:57.871 10:55:56 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:57.871 10:55:56 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:57.871 10:55:56 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:57.871 10:55:56 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:57.871 10:55:56 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:57.871 10:55:56 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:57.871 10:55:56 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:57.871 10:55:56 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:57.871 10:55:56 -- paths/export.sh@5 
-- # export PATH 00:08:57.871 10:55:56 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:57.871 10:55:56 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:57.871 10:55:56 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:57.871 10:55:56 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:57.871 10:55:56 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:57.871 10:55:56 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:57.871 10:55:56 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:57.871 10:55:56 -- pm/common@16 -- # TEST_TAG=N/A 00:08:57.871 10:55:56 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:57.871 10:55:56 -- common/autotest_common.sh@52 -- # : 1 00:08:57.871 10:55:56 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:57.871 10:55:56 -- common/autotest_common.sh@56 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:57.871 10:55:56 -- common/autotest_common.sh@58 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:57.871 10:55:56 -- common/autotest_common.sh@60 -- # : 1 00:08:57.871 10:55:56 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:57.871 10:55:56 -- common/autotest_common.sh@62 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:57.871 10:55:56 -- common/autotest_common.sh@64 -- # : 00:08:57.871 10:55:56 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:57.871 10:55:56 -- common/autotest_common.sh@66 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:57.871 10:55:56 -- common/autotest_common.sh@68 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:57.871 10:55:56 -- common/autotest_common.sh@70 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:57.871 10:55:56 -- common/autotest_common.sh@72 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:57.871 10:55:56 -- common/autotest_common.sh@74 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:57.871 10:55:56 -- common/autotest_common.sh@76 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:57.871 10:55:56 -- common/autotest_common.sh@78 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:57.871 10:55:56 -- common/autotest_common.sh@80 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:57.871 
10:55:56 -- common/autotest_common.sh@82 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:57.871 10:55:56 -- common/autotest_common.sh@84 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:57.871 10:55:56 -- common/autotest_common.sh@86 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:57.871 10:55:56 -- common/autotest_common.sh@88 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:57.871 10:55:56 -- common/autotest_common.sh@90 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:57.871 10:55:56 -- common/autotest_common.sh@92 -- # : 1 00:08:57.871 10:55:56 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:57.871 10:55:56 -- common/autotest_common.sh@94 -- # : 1 00:08:57.871 10:55:56 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:57.871 10:55:56 -- common/autotest_common.sh@96 -- # : rdma 00:08:57.871 10:55:56 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:57.871 10:55:56 -- common/autotest_common.sh@98 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:57.871 10:55:56 -- common/autotest_common.sh@100 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:57.871 10:55:56 -- common/autotest_common.sh@102 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:57.871 10:55:56 -- common/autotest_common.sh@104 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:57.871 10:55:56 -- common/autotest_common.sh@106 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:57.871 10:55:56 -- common/autotest_common.sh@108 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:57.871 10:55:56 -- common/autotest_common.sh@110 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:57.871 10:55:56 -- common/autotest_common.sh@112 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:57.871 10:55:56 -- common/autotest_common.sh@114 -- # : 0 00:08:57.871 10:55:56 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:57.871 10:55:56 -- common/autotest_common.sh@116 -- # : 1 00:08:57.871 10:55:56 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:57.871 10:55:56 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:57.871 10:55:56 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:57.872 10:55:56 -- common/autotest_common.sh@120 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:57.872 10:55:56 -- common/autotest_common.sh@122 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:57.872 10:55:56 -- common/autotest_common.sh@124 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:57.872 10:55:56 -- common/autotest_common.sh@126 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:57.872 10:55:56 -- common/autotest_common.sh@128 -- # : 0 00:08:57.872 10:55:56 -- 
common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:57.872 10:55:56 -- common/autotest_common.sh@130 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:57.872 10:55:56 -- common/autotest_common.sh@132 -- # : v23.11 00:08:57.872 10:55:56 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:57.872 10:55:56 -- common/autotest_common.sh@134 -- # : true 00:08:57.872 10:55:56 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:57.872 10:55:56 -- common/autotest_common.sh@136 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:57.872 10:55:56 -- common/autotest_common.sh@138 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:57.872 10:55:56 -- common/autotest_common.sh@140 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:57.872 10:55:56 -- common/autotest_common.sh@142 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:57.872 10:55:56 -- common/autotest_common.sh@144 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:57.872 10:55:56 -- common/autotest_common.sh@146 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:57.872 10:55:56 -- common/autotest_common.sh@148 -- # : 00:08:57.872 10:55:56 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:57.872 10:55:56 -- common/autotest_common.sh@150 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:57.872 10:55:56 -- common/autotest_common.sh@152 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:57.872 10:55:56 -- common/autotest_common.sh@154 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:57.872 10:55:56 -- common/autotest_common.sh@156 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:57.872 10:55:56 -- common/autotest_common.sh@158 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:57.872 10:55:56 -- common/autotest_common.sh@160 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:57.872 10:55:56 -- common/autotest_common.sh@163 -- # : 00:08:57.872 10:55:56 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:57.872 10:55:56 -- common/autotest_common.sh@165 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:57.872 10:55:56 -- common/autotest_common.sh@167 -- # : 0 00:08:57.872 10:55:56 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:57.872 10:55:56 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:57.872 10:55:56 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:57.872 10:55:56 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:57.872 10:55:56 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:57.872 10:55:56 -- common/autotest_common.sh@173 -- # export 
VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:57.872 10:55:56 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:57.872 10:55:56 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:57.872 10:55:56 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:57.872 10:55:56 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:57.872 10:55:56 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:57.872 10:55:56 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:57.872 10:55:56 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:57.872 10:55:56 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:57.872 10:55:56 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:57.872 10:55:56 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:57.872 10:55:56 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:57.872 10:55:56 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:57.872 10:55:56 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:57.872 10:55:56 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:57.872 10:55:56 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:57.872 10:55:56 -- common/autotest_common.sh@196 -- # cat 00:08:57.872 10:55:56 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:57.872 10:55:56 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:57.872 10:55:56 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:57.872 10:55:56 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:57.872 10:55:56 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:57.872 10:55:56 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:57.872 10:55:56 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:57.872 10:55:56 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:57.872 10:55:56 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:57.872 10:55:56 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:57.872 10:55:56 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:57.872 10:55:56 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:57.872 10:55:56 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:57.872 10:55:56 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:57.872 10:55:56 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:57.872 10:55:56 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:57.872 10:55:56 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:57.872 10:55:56 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:57.872 10:55:56 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:57.872 10:55:56 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:57.872 10:55:56 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:57.872 10:55:56 -- common/autotest_common.sh@249 -- # _LCOV= 00:08:57.872 10:55:56 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:57.872 10:55:56 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:57.872 10:55:56 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:57.872 10:55:56 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:57.872 10:55:56 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:57.872 10:55:56 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:57.872 10:55:56 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:57.872 10:55:56 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:57.872 10:55:56 -- common/autotest_common.sh@259 -- # valgrind= 00:08:57.872 10:55:56 -- common/autotest_common.sh@265 -- # uname -s 00:08:57.872 10:55:56 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:57.872 10:55:56 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:57.872 10:55:56 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:57.872 10:55:56 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:57.872 10:55:56 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:57.872 10:55:56 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:57.872 10:55:56 -- common/autotest_common.sh@275 -- # MAKE=make 00:08:57.872 10:55:56 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:57.872 10:55:56 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:57.872 10:55:56 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:57.873 10:55:56 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:57.873 10:55:56 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:57.873 10:55:56 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:08:57.873 10:55:56 -- common/autotest_common.sh@319 -- # [[ -z 662139 ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@319 -- # kill -0 662139 00:08:57.873 10:55:56 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:57.873 10:55:56 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:57.873 10:55:56 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:57.873 10:55:56 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:57.873 10:55:56 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:57.873 10:55:56 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:57.873 10:55:56 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:57.873 10:55:56 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.qNaiSu 00:08:57.873 10:55:56 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" 
"$storage_fallback") 00:08:57.873 10:55:56 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.qNaiSu/tests/vfio /tmp/spdk.qNaiSu 00:08:57.873 10:55:56 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:57.873 10:55:56 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:57.873 10:55:56 -- common/autotest_common.sh@328 -- # df -T 00:08:57.873 10:55:56 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:57.873 10:55:56 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:57.873 10:55:56 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # avails["$mount"]=785162240 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:57.873 10:55:56 -- common/autotest_common.sh@364 -- # uses["$mount"]=4499267584 00:08:57.873 10:55:56 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # avails["$mount"]=52796571648 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730586624 00:08:57.873 10:55:56 -- common/autotest_common.sh@364 -- # uses["$mount"]=8934014976 00:08:57.873 10:55:56 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864035840 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865293312 00:08:57.873 10:55:56 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:08:57.873 10:55:56 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340113408 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:08:57.873 10:55:56 -- common/autotest_common.sh@364 -- # uses["$mount"]=6004736 00:08:57.873 10:55:56 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # 
avails["$mount"]=30865104896 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865293312 00:08:57.873 10:55:56 -- common/autotest_common.sh@364 -- # uses["$mount"]=188416 00:08:57.873 10:55:56 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:57.873 10:55:56 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:57.873 10:55:56 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:57.873 10:55:56 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:57.873 10:55:56 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:57.873 * Looking for test storage... 00:08:57.873 10:55:56 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:57.873 10:55:56 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:57.873 10:55:56 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:57.873 10:55:56 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:57.873 10:55:56 -- common/autotest_common.sh@373 -- # mount=/ 00:08:57.873 10:55:56 -- common/autotest_common.sh@375 -- # target_space=52796571648 00:08:57.873 10:55:56 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:57.873 10:55:56 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:57.873 10:55:56 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@382 -- # new_size=11148607488 00:08:57.873 10:55:56 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:57.873 10:55:56 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:57.873 10:55:56 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:57.873 10:55:56 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:57.873 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:57.873 10:55:56 -- common/autotest_common.sh@390 -- # return 0 00:08:57.873 10:55:56 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:57.873 10:55:56 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:08:57.873 10:55:56 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:57.873 10:55:56 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:57.873 10:55:56 -- common/autotest_common.sh@1682 -- # true 00:08:57.873 10:55:56 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:57.873 10:55:56 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:57.873 10:55:56 -- 
common/autotest_common.sh@27 -- # exec 00:08:57.873 10:55:56 -- common/autotest_common.sh@29 -- # exec 00:08:57.873 10:55:56 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:57.873 10:55:56 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:57.873 10:55:56 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:57.873 10:55:56 -- common/autotest_common.sh@18 -- # set -x 00:08:57.873 10:55:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:57.873 10:55:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:57.873 10:55:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:58.133 10:55:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:58.133 10:55:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:58.133 10:55:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:58.133 10:55:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:58.133 10:55:56 -- scripts/common.sh@335 -- # IFS=.-: 00:08:58.133 10:55:56 -- scripts/common.sh@335 -- # read -ra ver1 00:08:58.133 10:55:56 -- scripts/common.sh@336 -- # IFS=.-: 00:08:58.133 10:55:56 -- scripts/common.sh@336 -- # read -ra ver2 00:08:58.133 10:55:56 -- scripts/common.sh@337 -- # local 'op=<' 00:08:58.133 10:55:56 -- scripts/common.sh@339 -- # ver1_l=2 00:08:58.133 10:55:56 -- scripts/common.sh@340 -- # ver2_l=1 00:08:58.133 10:55:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:58.133 10:55:56 -- scripts/common.sh@343 -- # case "$op" in 00:08:58.133 10:55:56 -- scripts/common.sh@344 -- # : 1 00:08:58.133 10:55:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:58.133 10:55:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:58.133 10:55:56 -- scripts/common.sh@364 -- # decimal 1 00:08:58.133 10:55:56 -- scripts/common.sh@352 -- # local d=1 00:08:58.133 10:55:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:58.133 10:55:56 -- scripts/common.sh@354 -- # echo 1 00:08:58.133 10:55:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:58.133 10:55:56 -- scripts/common.sh@365 -- # decimal 2 00:08:58.133 10:55:56 -- scripts/common.sh@352 -- # local d=2 00:08:58.133 10:55:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:58.133 10:55:56 -- scripts/common.sh@354 -- # echo 2 00:08:58.133 10:55:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:58.133 10:55:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:58.133 10:55:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:58.133 10:55:56 -- scripts/common.sh@367 -- # return 0 00:08:58.133 10:55:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:58.133 10:55:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:58.133 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.133 --rc genhtml_branch_coverage=1 00:08:58.133 --rc genhtml_function_coverage=1 00:08:58.133 --rc genhtml_legend=1 00:08:58.133 --rc geninfo_all_blocks=1 00:08:58.133 --rc geninfo_unexecuted_blocks=1 00:08:58.133 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.133 ' 00:08:58.133 10:55:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:58.133 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.133 --rc genhtml_branch_coverage=1 00:08:58.133 --rc genhtml_function_coverage=1 00:08:58.133 --rc genhtml_legend=1 00:08:58.133 --rc geninfo_all_blocks=1 00:08:58.133 --rc geninfo_unexecuted_blocks=1 
00:08:58.133 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.133 ' 00:08:58.133 10:55:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:58.133 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.133 --rc genhtml_branch_coverage=1 00:08:58.133 --rc genhtml_function_coverage=1 00:08:58.133 --rc genhtml_legend=1 00:08:58.133 --rc geninfo_all_blocks=1 00:08:58.133 --rc geninfo_unexecuted_blocks=1 00:08:58.133 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.133 ' 00:08:58.133 10:55:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:58.133 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.133 --rc genhtml_branch_coverage=1 00:08:58.133 --rc genhtml_function_coverage=1 00:08:58.133 --rc genhtml_legend=1 00:08:58.133 --rc geninfo_all_blocks=1 00:08:58.133 --rc geninfo_unexecuted_blocks=1 00:08:58.133 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.133 ' 00:08:58.133 10:55:56 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:58.133 10:55:56 -- ../common.sh@8 -- # pids=() 00:08:58.133 10:55:56 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:58.133 10:55:56 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:58.133 10:55:56 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:58.133 10:55:56 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:58.133 10:55:56 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:58.133 10:55:56 -- vfio/run.sh@65 -- # mem_size=0 00:08:58.133 10:55:56 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:58.133 10:55:56 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:58.133 10:55:56 -- ../common.sh@69 -- # local fuzz_num=7 00:08:58.133 10:55:56 -- ../common.sh@70 -- # local time=1 00:08:58.133 10:55:56 -- ../common.sh@72 -- # (( i = 0 )) 00:08:58.133 10:55:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.133 10:55:56 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:58.133 10:55:56 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:58.133 10:55:56 -- vfio/run.sh@23 -- # local timen=1 00:08:58.133 10:55:56 -- vfio/run.sh@24 -- # local core=0x1 00:08:58.133 10:55:56 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:58.133 10:55:56 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:58.133 10:55:56 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:58.133 10:55:56 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:58.133 10:55:56 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:58.134 10:55:56 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:58.134 10:55:56 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:58.134 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.134 10:55:56 -- vfio/run.sh@38 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:58.134 [2024-12-16 10:55:56.591787] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:58.134 [2024-12-16 10:55:56.591845] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid662234 ] 00:08:58.134 EAL: No free 2048 kB hugepages reported on node 1 00:08:58.134 [2024-12-16 10:55:56.660187] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.134 [2024-12-16 10:55:56.695945] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:58.134 [2024-12-16 10:55:56.696095] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.393 INFO: Running with entropic power schedule (0xFF, 100). 00:08:58.393 INFO: Seed: 4150418936 00:08:58.393 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x267d60c, 0x26d0d8f), 00:08:58.393 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x26d0d90,0x2c085c0), 00:08:58.393 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:58.393 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.393 #2 INITED exec/s: 0 rss: 60Mb 00:08:58.393 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:58.393 This may also happen if the target rejected all inputs we tried so far 00:08:58.912 NEW_FUNC[1/631]: 0x457c78 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:58.912 NEW_FUNC[2/631]: 0x45d818 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:58.912 #5 NEW cov: 10763 ft: 10733 corp: 2/50b lim: 60 exec/s: 0 rss: 66Mb L: 49/49 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:59.171 #9 NEW cov: 10789 ft: 14394 corp: 3/104b lim: 60 exec/s: 0 rss: 67Mb L: 54/54 MS: 4 ChangeBit-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:59.431 NEW_FUNC[1/1]: 0x1938838 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:59.431 #10 NEW cov: 10806 ft: 15321 corp: 4/133b lim: 60 exec/s: 0 rss: 68Mb L: 29/54 MS: 1 EraseBytes- 00:08:59.690 #11 NEW cov: 10806 ft: 15382 corp: 5/163b lim: 60 exec/s: 11 rss: 68Mb L: 30/54 MS: 1 InsertByte- 00:08:59.690 #12 NEW cov: 10806 ft: 16035 corp: 6/212b lim: 60 exec/s: 12 rss: 68Mb L: 49/54 MS: 1 ChangeBit- 00:08:59.950 #13 NEW cov: 10806 ft: 16390 corp: 7/241b lim: 60 exec/s: 13 rss: 69Mb L: 29/54 MS: 1 ShuffleBytes- 00:09:00.210 #14 NEW cov: 10806 ft: 16715 corp: 8/289b lim: 60 exec/s: 14 rss: 69Mb L: 48/54 MS: 1 EraseBytes- 00:09:00.469 #20 NEW cov: 10813 ft: 17029 corp: 9/318b lim: 60 exec/s: 20 rss: 69Mb L: 29/54 MS: 1 CopyPart- 00:09:00.469 #21 NEW cov: 10813 ft: 17173 corp: 10/367b lim: 60 exec/s: 10 rss: 69Mb L: 49/54 MS: 1 ChangeBinInt- 00:09:00.469 #21 DONE cov: 10813 ft: 17173 corp: 10/367b lim: 60 exec/s: 10 rss: 69Mb 00:09:00.469 Done 21 runs in 2 second(s) 00:09:00.728 10:55:59 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:09:00.728 10:55:59 -- ../common.sh@72 -- # (( i++ )) 00:09:00.728 10:55:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.728 10:55:59 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:00.728 10:55:59 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:09:00.729 10:55:59 -- vfio/run.sh@23 -- # local timen=1 00:09:00.729 10:55:59 -- vfio/run.sh@24 -- # local core=0x1 00:09:00.729 10:55:59 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:00.729 10:55:59 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:09:00.729 10:55:59 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:09:00.729 10:55:59 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:09:00.729 10:55:59 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:09:00.729 10:55:59 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:00.729 10:55:59 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:09:00.729 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:00.729 10:55:59 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r 
/tmp/vfio-user-1/spdk1.sock -Z 1 00:09:00.729 [2024-12-16 10:55:59.348056] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:09:00.729 [2024-12-16 10:55:59.348121] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid662746 ] 00:09:00.988 EAL: No free 2048 kB hugepages reported on node 1 00:09:00.988 [2024-12-16 10:55:59.417780] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.988 [2024-12-16 10:55:59.452883] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:00.988 [2024-12-16 10:55:59.453037] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.247 INFO: Running with entropic power schedule (0xFF, 100). 00:09:01.247 INFO: Seed: 2608460152 00:09:01.247 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x267d60c, 0x26d0d8f), 00:09:01.247 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x26d0d90,0x2c085c0), 00:09:01.247 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:01.247 INFO: A corpus is not provided, starting from an empty corpus 00:09:01.248 #2 INITED exec/s: 0 rss: 59Mb 00:09:01.248 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:01.248 This may also happen if the target rejected all inputs we tried so far 00:09:01.248 [2024-12-16 10:55:59.744646] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:01.248 [2024-12-16 10:55:59.744679] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:01.248 [2024-12-16 10:55:59.744714] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:01.507 NEW_FUNC[1/638]: 0x458218 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:09:01.507 NEW_FUNC[2/638]: 0x45d818 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:01.507 #4 NEW cov: 10782 ft: 10550 corp: 2/29b lim: 40 exec/s: 0 rss: 66Mb L: 28/28 MS: 2 CrossOver-InsertRepeatedBytes- 00:09:01.766 [2024-12-16 10:56:00.208700] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:01.766 [2024-12-16 10:56:00.208738] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:01.766 [2024-12-16 10:56:00.208756] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:01.766 #10 NEW cov: 10796 ft: 12716 corp: 3/65b lim: 40 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:09:02.025 [2024-12-16 10:56:00.403105] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.025 [2024-12-16 10:56:00.403128] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.025 [2024-12-16 10:56:00.403165] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.025 NEW_FUNC[1/1]: 0x1938838 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:02.025 #11 NEW cov: 10813 ft: 14310 corp: 4/105b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:09:02.025 [2024-12-16 10:56:00.597150] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.025 [2024-12-16 10:56:00.597172] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.025 [2024-12-16 10:56:00.597188] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.284 #13 NEW cov: 10813 ft: 15405 corp: 5/110b lim: 40 exec/s: 13 rss: 68Mb L: 5/40 MS: 2 ChangeBit-CrossOver- 00:09:02.284 [2024-12-16 10:56:00.802118] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.284 [2024-12-16 10:56:00.802141] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.285 [2024-12-16 10:56:00.802159] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.545 #19 NEW cov: 10813 ft: 16167 corp: 6/130b lim: 40 exec/s: 19 rss: 68Mb L: 20/40 MS: 1 EraseBytes- 00:09:02.545 [2024-12-16 10:56:00.996057] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.545 [2024-12-16 10:56:00.996078] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.545 [2024-12-16 10:56:00.996096] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.545 #20 NEW cov: 10813 ft: 16489 corp: 7/150b lim: 40 exec/s: 20 rss: 68Mb L: 20/40 MS: 1 ChangeBinInt- 00:09:02.805 [2024-12-16 10:56:01.188225] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.805 [2024-12-16 10:56:01.188247] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.805 [2024-12-16 10:56:01.188265] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.805 #21 NEW cov: 10813 ft: 16787 corp: 8/190b lim: 40 exec/s: 21 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:09:02.805 [2024-12-16 10:56:01.379949] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.805 [2024-12-16 10:56:01.379972] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.805 [2024-12-16 10:56:01.379990] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:03.063 #22 NEW cov: 10820 ft: 16954 corp: 9/219b lim: 40 exec/s: 22 rss: 68Mb L: 29/40 MS: 1 InsertByte- 00:09:03.063 [2024-12-16 10:56:01.569707] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:03.063 [2024-12-16 10:56:01.569729] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:03.063 [2024-12-16 10:56:01.569745] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:03.063 #23 NEW cov: 10820 ft: 16996 corp: 10/240b lim: 40 exec/s: 11 rss: 68Mb L: 21/40 MS: 1 InsertByte- 00:09:03.063 #23 DONE cov: 10820 ft: 16996 corp: 10/240b lim: 40 exec/s: 11 rss: 68Mb 00:09:03.063 Done 23 runs in 2 second(s) 00:09:03.322 10:56:01 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:09:03.322 10:56:01 -- ../common.sh@72 -- # (( i++ )) 00:09:03.322 10:56:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:03.322 10:56:01 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:03.322 10:56:01 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:03.322 10:56:01 -- vfio/run.sh@23 -- # local timen=1 00:09:03.322 10:56:01 -- vfio/run.sh@24 -- # local core=0x1 00:09:03.322 10:56:01 -- vfio/run.sh@25 -- # local 
corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:03.322 10:56:01 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:03.322 10:56:01 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:03.322 10:56:01 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:09:03.322 10:56:01 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:03.322 10:56:01 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:03.581 10:56:01 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:03.581 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:03.581 10:56:01 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:03.581 [2024-12-16 10:56:01.979168] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:09:03.581 [2024-12-16 10:56:01.979245] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663291 ] 00:09:03.581 EAL: No free 2048 kB hugepages reported on node 1 00:09:03.581 [2024-12-16 10:56:02.050362] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.581 [2024-12-16 10:56:02.085974] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:03.581 [2024-12-16 10:56:02.086108] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.840 INFO: Running with entropic power schedule (0xFF, 100). 00:09:03.840 INFO: Seed: 939476812 00:09:03.840 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x267d60c, 0x26d0d8f), 00:09:03.840 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x26d0d90,0x2c085c0), 00:09:03.840 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:03.840 INFO: A corpus is not provided, starting from an empty corpus 00:09:03.840 #2 INITED exec/s: 0 rss: 60Mb 00:09:03.840 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:03.840 This may also happen if the target rejected all inputs we tried so far 00:09:03.840 [2024-12-16 10:56:02.367649] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:03.840 [2024-12-16 10:56:02.367694] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:04.359 NEW_FUNC[1/638]: 0x458c08 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:09:04.359 NEW_FUNC[2/638]: 0x45d818 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:04.359 #3 NEW cov: 10770 ft: 10709 corp: 2/44b lim: 80 exec/s: 0 rss: 66Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:09:04.359 [2024-12-16 10:56:02.830902] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:04.359 [2024-12-16 10:56:02.830942] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:04.359 #5 NEW cov: 10784 ft: 13343 corp: 3/60b lim: 80 exec/s: 0 rss: 68Mb L: 16/43 MS: 2 InsertByte-InsertRepeatedBytes- 00:09:04.618 [2024-12-16 10:56:03.034769] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:04.618 [2024-12-16 10:56:03.034816] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:04.618 NEW_FUNC[1/1]: 0x1938838 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:04.618 #8 NEW cov: 10801 ft: 14016 corp: 4/108b lim: 80 exec/s: 0 rss: 69Mb L: 48/48 MS: 3 CrossOver-CopyPart-CrossOver- 00:09:04.618 [2024-12-16 10:56:03.229753] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:04.618 [2024-12-16 10:56:03.229783] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:04.877 #9 NEW cov: 10801 ft: 14746 corp: 5/124b lim: 80 exec/s: 9 rss: 69Mb L: 16/48 MS: 1 CrossOver- 00:09:04.877 [2024-12-16 10:56:03.420316] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:04.877 [2024-12-16 10:56:03.420347] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:05.136 #10 NEW cov: 10801 ft: 15168 corp: 6/180b lim: 80 exec/s: 10 rss: 69Mb L: 56/56 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\200"- 00:09:05.136 [2024-12-16 10:56:03.604196] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:05.136 #13 NEW cov: 10802 ft: 15361 corp: 7/188b lim: 80 exec/s: 13 rss: 69Mb L: 8/56 MS: 3 CrossOver-ChangeBit-InsertByte- 00:09:05.396 [2024-12-16 10:56:03.790082] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:05.396 [2024-12-16 10:56:03.790112] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:05.396 #14 NEW cov: 10802 ft: 15592 corp: 8/239b lim: 80 exec/s: 14 rss: 69Mb L: 51/56 MS: 1 CMP- DE: "3>\002\000\000\000\000\000"- 00:09:05.396 [2024-12-16 10:56:03.975256] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:05.396 [2024-12-16 10:56:03.975284] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:05.655 #17 NEW cov: 10802 ft: 15794 corp: 9/248b lim: 80 exec/s: 17 rss: 69Mb L: 9/56 MS: 3 ShuffleBytes-ChangeBit-PersAutoDict- DE: "\001\000\000\000\000\000\000\200"- 00:09:05.655 [2024-12-16 10:56:04.162526] vfio_user.c:3096:vfio_user_log: *ERROR*: 
/tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:05.655 [2024-12-16 10:56:04.162555] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:05.655 #18 NEW cov: 10809 ft: 16028 corp: 10/300b lim: 80 exec/s: 9 rss: 69Mb L: 52/56 MS: 1 InsertByte- 00:09:05.655 #18 DONE cov: 10809 ft: 16028 corp: 10/300b lim: 80 exec/s: 9 rss: 69Mb 00:09:05.655 ###### Recommended dictionary. ###### 00:09:05.655 "\001\000\000\000\000\000\000\200" # Uses: 1 00:09:05.655 "3>\002\000\000\000\000\000" # Uses: 0 00:09:05.655 ###### End of recommended dictionary. ###### 00:09:05.655 Done 18 runs in 2 second(s) 00:09:05.914 10:56:04 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:09:05.914 10:56:04 -- ../common.sh@72 -- # (( i++ )) 00:09:05.914 10:56:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:05.914 10:56:04 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:05.914 10:56:04 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:05.914 10:56:04 -- vfio/run.sh@23 -- # local timen=1 00:09:05.914 10:56:04 -- vfio/run.sh@24 -- # local core=0x1 00:09:05.914 10:56:04 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:05.914 10:56:04 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:05.914 10:56:04 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:05.914 10:56:04 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:05.914 10:56:04 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:05.914 10:56:04 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:06.174 10:56:04 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:06.174 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:06.174 10:56:04 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:06.174 [2024-12-16 10:56:04.570530] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:09:06.174 [2024-12-16 10:56:04.570628] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663715 ] 00:09:06.174 EAL: No free 2048 kB hugepages reported on node 1 00:09:06.174 [2024-12-16 10:56:04.641398] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.174 [2024-12-16 10:56:04.677907] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:06.174 [2024-12-16 10:56:04.678061] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.434 INFO: Running with entropic power schedule (0xFF, 100). 
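(A note on reading the libFuzzer status lines that recur throughout these runs: in a line such as "#18 NEW cov: 10809 ft: 16028 corp: 10/300b lim: 80 exec/s: 9 rss: 69Mb L: 52/56 MS: 1 InsertByte-" above, cov counts covered coverage points, ft counts features, corp gives the corpus size as inputs/bytes, lim is the current input-length cap, exec/s is executions per second, rss is resident memory, L is the new input's length versus the longest in the corpus, and MS names the mutation sequence that produced it. "DONE" repeats the final totals and "Done N runs in 2 second(s)" is libFuzzer's end-of-run summary for each target.)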
00:09:06.434 INFO: Seed: 3537505135 00:09:06.434 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x267d60c, 0x26d0d8f), 00:09:06.434 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x26d0d90,0x2c085c0), 00:09:06.434 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:06.434 INFO: A corpus is not provided, starting from an empty corpus 00:09:06.434 #2 INITED exec/s: 0 rss: 60Mb 00:09:06.434 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:06.434 This may also happen if the target rejected all inputs we tried so far 00:09:06.434 [2024-12-16 10:56:04.971641] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:09:06.434 [2024-12-16 10:56:04.971685] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:06.434 [2024-12-16 10:56:04.971696] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:06.434 [2024-12-16 10:56:04.971729] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:06.951 NEW_FUNC[1/638]: 0x4592f8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:09:06.951 NEW_FUNC[2/638]: 0x45d818 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:06.951 #3 NEW cov: 10780 ft: 10624 corp: 2/125b lim: 320 exec/s: 0 rss: 66Mb L: 124/124 MS: 1 InsertRepeatedBytes- 00:09:06.951 [2024-12-16 10:56:05.437623] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:06.951 [2024-12-16 10:56:05.437672] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:06.951 [2024-12-16 10:56:05.437683] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:06.951 [2024-12-16 10:56:05.437701] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:06.951 #4 NEW cov: 10794 ft: 13329 corp: 3/217b lim: 320 exec/s: 0 rss: 68Mb L: 92/124 MS: 1 EraseBytes- 00:09:07.211 [2024-12-16 10:56:05.636617] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:07.211 [2024-12-16 10:56:05.636641] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:07.211 [2024-12-16 10:56:05.636652] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:07.211 [2024-12-16 10:56:05.636668] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:07.211 NEW_FUNC[1/1]: 0x1938838 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:07.211 #5 NEW cov: 10811 ft: 14180 corp: 4/343b lim: 320 exec/s: 0 rss: 69Mb L: 126/126 MS: 1 CMP- DE: "\007\000"- 00:09:07.470 [2024-12-16 10:56:05.835626] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x4545454500000000 prot=0x3: Invalid 
argument 00:09:07.470 [2024-12-16 10:56:05.835650] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x4545454500000000 flags=0x3: Invalid argument 00:09:07.470 [2024-12-16 10:56:05.835660] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:07.470 [2024-12-16 10:56:05.835678] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:07.470 #6 NEW cov: 10811 ft: 14494 corp: 5/490b lim: 320 exec/s: 6 rss: 69Mb L: 147/147 MS: 1 InsertRepeatedBytes- 00:09:07.470 [2024-12-16 10:56:06.034682] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), 0x27000000) fd=325 offset=0 prot=0x3: Permission denied 00:09:07.470 [2024-12-16 10:56:06.034705] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0x27000000) offset=0 flags=0x3: Permission denied 00:09:07.470 [2024-12-16 10:56:06.034717] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Permission denied 00:09:07.470 [2024-12-16 10:56:06.034737] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:07.730 #7 NEW cov: 10811 ft: 15054 corp: 6/616b lim: 320 exec/s: 7 rss: 69Mb L: 126/147 MS: 1 ChangeByte- 00:09:07.730 [2024-12-16 10:56:06.233566] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:09:07.730 [2024-12-16 10:56:06.233591] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:09:07.730 [2024-12-16 10:56:06.233601] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:07.730 [2024-12-16 10:56:06.233626] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:07.730 #8 NEW cov: 10811 ft: 15133 corp: 7/740b lim: 320 exec/s: 8 rss: 69Mb L: 124/147 MS: 1 ChangeByte- 00:09:07.989 [2024-12-16 10:56:06.430915] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 360287970189639680 > max 8796093022208 00:09:07.989 [2024-12-16 10:56:06.430941] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0x500000000000000) offset=0 flags=0x3: No space left on device 00:09:07.989 [2024-12-16 10:56:06.430952] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:09:07.989 [2024-12-16 10:56:06.430969] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:07.989 #9 NEW cov: 10811 ft: 15223 corp: 8/832b lim: 320 exec/s: 9 rss: 69Mb L: 92/147 MS: 1 ChangeBinInt- 00:09:08.248 [2024-12-16 10:56:06.624355] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), 0x700) fd=325 offset=0x27000000000000 prot=0x3: Permission denied 00:09:08.248 [2024-12-16 10:56:06.624379] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0x700) offset=0x27000000000000 flags=0x3: Permission denied 00:09:08.248 [2024-12-16 10:56:06.624391] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Permission denied 00:09:08.248 [2024-12-16 10:56:06.624407] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:08.248 #10 NEW cov: 10818 ft: 15431 
corp: 9/1063b lim: 320 exec/s: 10 rss: 69Mb L: 231/231 MS: 1 CopyPart- 00:09:08.248 [2024-12-16 10:56:06.817636] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x4545454500000000 prot=0x3: Invalid argument 00:09:08.248 [2024-12-16 10:56:06.817660] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0x4545454500000000 flags=0x3: Invalid argument 00:09:08.248 [2024-12-16 10:56:06.817670] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:08.248 [2024-12-16 10:56:06.817687] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:08.507 #11 NEW cov: 10818 ft: 15653 corp: 10/1210b lim: 320 exec/s: 5 rss: 69Mb L: 147/231 MS: 1 ChangeBinInt- 00:09:08.508 #11 DONE cov: 10818 ft: 15653 corp: 10/1210b lim: 320 exec/s: 5 rss: 69Mb 00:09:08.508 ###### Recommended dictionary. ###### 00:09:08.508 "\007\000" # Uses: 0 00:09:08.508 ###### End of recommended dictionary. ###### 00:09:08.508 Done 11 runs in 2 second(s) 00:09:08.767 10:56:07 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:09:08.767 10:56:07 -- ../common.sh@72 -- # (( i++ )) 00:09:08.767 10:56:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:08.767 10:56:07 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:08.767 10:56:07 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:08.767 10:56:07 -- vfio/run.sh@23 -- # local timen=1 00:09:08.767 10:56:07 -- vfio/run.sh@24 -- # local core=0x1 00:09:08.767 10:56:07 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:08.767 10:56:07 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:08.767 10:56:07 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:08.767 10:56:07 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:08.767 10:56:07 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:08.767 10:56:07 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:08.767 10:56:07 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:08.767 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:08.767 10:56:07 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:08.767 [2024-12-16 10:56:07.229508] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
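(The same sequence of xtrace lines — mkdir, sed over the template config, then the llvm_vfio_fuzz invocation — repeats once per fuzzer type in this log. A minimal bash sketch of the driver loop they imply follows; the flag values are copied from the trace, but fuzz_num, rootdir, and the sed redirection target are assumptions, since common.sh and the full run.sh bodies are not shown here.)

#!/usr/bin/env bash
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # path as seen in the trace
fuzz_num=7        # assumption: fuzzer types 0..6 appear in this log
i=0
while (( i < fuzz_num )); do
  fuzzer_dir=/tmp/vfio-user-$i
  vfiouser_dir=$fuzzer_dir/domain/1
  vfiouser_io_dir=$fuzzer_dir/domain/2
  vfiouser_cfg=$fuzzer_dir/fuzz_vfio_json.conf
  corpus_dir=$rootdir/../corpus/llvm_vfio_$i
  mkdir -p $fuzzer_dir $vfiouser_dir $vfiouser_io_dir $corpus_dir
  # give each instance its own socket paths inside the shared template config
  # (the trace shows this sed but not its redirection target; > is assumed)
  sed -e "s%/tmp/vfio-user/domain/1%$vfiouser_dir%; s%/tmp/vfio-user/domain/2%$vfiouser_io_dir%" \
    $rootdir/test/fuzz/llvm/vfio/fuzz_vfio_json.conf > $vfiouser_cfg
  $rootdir/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 \
    -P $rootdir/../output/llvm/ -F $vfiouser_dir -c $vfiouser_cfg -t 1 \
    -D $corpus_dir -Y $vfiouser_io_dir -r $fuzzer_dir/spdk$i.sock -Z $i
  rm -rf $fuzzer_dir   # cleanup step at run.sh@49 in the trace
  (( i++ ))
done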
00:09:08.767 [2024-12-16 10:56:07.229599] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid664124 ] 00:09:08.767 EAL: No free 2048 kB hugepages reported on node 1 00:09:08.767 [2024-12-16 10:56:07.300654] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.767 [2024-12-16 10:56:07.336047] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:08.767 [2024-12-16 10:56:07.336181] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.026 INFO: Running with entropic power schedule (0xFF, 100). 00:09:09.026 INFO: Seed: 1900535385 00:09:09.026 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x267d60c, 0x26d0d8f), 00:09:09.026 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x26d0d90,0x2c085c0), 00:09:09.026 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:09.026 INFO: A corpus is not provided, starting from an empty corpus 00:09:09.026 #2 INITED exec/s: 0 rss: 60Mb 00:09:09.026 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:09.026 This may also happen if the target rejected all inputs we tried so far 00:09:09.545 NEW_FUNC[1/631]: 0x459b78 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:09:09.545 NEW_FUNC[2/631]: 0x45d818 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:09.545 #15 NEW cov: 10725 ft: 10639 corp: 2/82b lim: 320 exec/s: 0 rss: 67Mb L: 81/81 MS: 3 CrossOver-EraseBytes-InsertRepeatedBytes- 00:09:09.804 NEW_FUNC[1/1]: 0x46c518 in malloc_completion_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/bdev/malloc/bdev_malloc.c:849 00:09:09.804 #16 NEW cov: 10768 ft: 13628 corp: 3/164b lim: 320 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 CrossOver- 00:09:09.804 NEW_FUNC[1/1]: 0x1938838 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:09.804 #17 NEW cov: 10785 ft: 14456 corp: 4/245b lim: 320 exec/s: 0 rss: 69Mb L: 81/82 MS: 1 ChangeBinInt- 00:09:10.063 #18 NEW cov: 10785 ft: 14794 corp: 5/327b lim: 320 exec/s: 18 rss: 69Mb L: 82/82 MS: 1 InsertByte- 00:09:10.322 #19 NEW cov: 10785 ft: 15094 corp: 6/409b lim: 320 exec/s: 19 rss: 69Mb L: 82/82 MS: 1 InsertByte- 00:09:10.322 #20 NEW cov: 10785 ft: 15623 corp: 7/491b lim: 320 exec/s: 20 rss: 69Mb L: 82/82 MS: 1 ChangeBinInt- 00:09:10.582 #21 NEW cov: 10785 ft: 15820 corp: 8/573b lim: 320 exec/s: 21 rss: 69Mb L: 82/82 MS: 1 ChangeByte- 00:09:10.841 #22 NEW cov: 10785 ft: 15839 corp: 9/742b lim: 320 exec/s: 22 rss: 69Mb L: 169/169 MS: 1 InsertRepeatedBytes- 00:09:11.100 #23 NEW cov: 10792 ft: 16115 corp: 10/846b lim: 320 exec/s: 23 rss: 69Mb L: 104/169 MS: 1 InsertRepeatedBytes- 00:09:11.100 #24 NEW cov: 10792 ft: 16218 corp: 11/927b lim: 320 exec/s: 12 rss: 69Mb L: 81/169 MS: 1 ChangeBit- 00:09:11.100 #24 DONE cov: 10792 ft: 16218 corp: 11/927b lim: 320 exec/s: 12 rss: 69Mb 00:09:11.100 Done 24 runs in 2 second(s) 00:09:11.360 10:56:09 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:09:11.360 10:56:09 -- ../common.sh@72 -- # (( i++ )) 00:09:11.360 10:56:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:11.360 10:56:09 -- ../common.sh@73 -- # start_llvm_fuzz 
5 1 0x1 00:09:11.360 10:56:09 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:11.360 10:56:09 -- vfio/run.sh@23 -- # local timen=1 00:09:11.360 10:56:09 -- vfio/run.sh@24 -- # local core=0x1 00:09:11.360 10:56:09 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:11.360 10:56:09 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:11.360 10:56:09 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:11.360 10:56:09 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:11.360 10:56:09 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:11.360 10:56:09 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:11.360 10:56:09 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:11.360 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:11.360 10:56:09 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:11.360 [2024-12-16 10:56:09.975087] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:09:11.360 [2024-12-16 10:56:09.975163] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid664670 ] 00:09:11.619 EAL: No free 2048 kB hugepages reported on node 1 00:09:11.619 [2024-12-16 10:56:10.046286] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.619 [2024-12-16 10:56:10.084407] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:11.619 [2024-12-16 10:56:10.084546] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.878 INFO: Running with entropic power schedule (0xFF, 100). 00:09:11.878 INFO: Seed: 359566428 00:09:11.878 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x267d60c, 0x26d0d8f), 00:09:11.878 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x26d0d90,0x2c085c0), 00:09:11.878 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:11.878 INFO: A corpus is not provided, starting from an empty corpus 00:09:11.878 #2 INITED exec/s: 0 rss: 60Mb 00:09:11.878 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:11.878 This may also happen if the target rejected all inputs we tried so far 00:09:11.879 [2024-12-16 10:56:10.350649] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.879 [2024-12-16 10:56:10.350695] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.138 NEW_FUNC[1/638]: 0x45a578 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:09:12.138 NEW_FUNC[2/638]: 0x45d818 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:12.138 #12 NEW cov: 10784 ft: 10459 corp: 2/110b lim: 120 exec/s: 0 rss: 66Mb L: 109/109 MS: 5 ShuffleBytes-CMP-EraseBytes-CrossOver-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:09:12.138 [2024-12-16 10:56:10.753432] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.138 [2024-12-16 10:56:10.753478] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.398 #13 NEW cov: 10798 ft: 13203 corp: 3/219b lim: 120 exec/s: 0 rss: 68Mb L: 109/109 MS: 1 ChangeByte- 00:09:12.398 [2024-12-16 10:56:10.868422] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.398 [2024-12-16 10:56:10.868458] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.398 #14 NEW cov: 10798 ft: 14016 corp: 4/329b lim: 120 exec/s: 0 rss: 69Mb L: 110/110 MS: 1 InsertByte- 00:09:12.398 [2024-12-16 10:56:10.983437] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.398 [2024-12-16 10:56:10.983471] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.657 #15 NEW cov: 10798 ft: 14602 corp: 5/438b lim: 120 exec/s: 0 rss: 69Mb L: 109/110 MS: 1 ChangeBinInt- 00:09:12.657 [2024-12-16 10:56:11.098390] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.657 [2024-12-16 10:56:11.098423] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.657 NEW_FUNC[1/1]: 0x1938838 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:12.657 #16 NEW cov: 10815 ft: 14688 corp: 6/557b lim: 120 exec/s: 0 rss: 69Mb L: 119/119 MS: 1 InsertRepeatedBytes- 00:09:12.657 [2024-12-16 10:56:11.213223] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.657 [2024-12-16 10:56:11.213255] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.657 #17 NEW cov: 10815 ft: 14818 corp: 7/666b lim: 120 exec/s: 0 rss: 69Mb L: 109/119 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:09:12.917 [2024-12-16 10:56:11.318180] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.917 [2024-12-16 10:56:11.318212] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.917 #18 NEW cov: 10815 ft: 15142 corp: 8/781b lim: 120 exec/s: 18 rss: 69Mb L: 115/119 MS: 1 CopyPart- 00:09:12.917 [2024-12-16 10:56:11.432153] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.917 [2024-12-16 10:56:11.432186] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.917 #19 NEW cov: 10815 ft: 15211 corp: 9/900b lim: 120 
exec/s: 19 rss: 69Mb L: 119/119 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:09:13.176 [2024-12-16 10:56:11.546995] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.176 [2024-12-16 10:56:11.547028] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.176 #20 NEW cov: 10815 ft: 15216 corp: 10/1009b lim: 120 exec/s: 20 rss: 69Mb L: 109/119 MS: 1 ChangeBinInt- 00:09:13.176 [2024-12-16 10:56:11.651708] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.176 [2024-12-16 10:56:11.651742] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.176 #21 NEW cov: 10815 ft: 15447 corp: 11/1122b lim: 120 exec/s: 21 rss: 69Mb L: 113/119 MS: 1 InsertRepeatedBytes- 00:09:13.176 [2024-12-16 10:56:11.765535] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.176 [2024-12-16 10:56:11.765568] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.436 #22 NEW cov: 10815 ft: 16144 corp: 12/1186b lim: 120 exec/s: 22 rss: 69Mb L: 64/119 MS: 1 InsertRepeatedBytes- 00:09:13.436 [2024-12-16 10:56:11.890451] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.436 [2024-12-16 10:56:11.890484] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.436 #23 NEW cov: 10815 ft: 16163 corp: 13/1266b lim: 120 exec/s: 23 rss: 69Mb L: 80/119 MS: 1 EraseBytes- 00:09:13.436 [2024-12-16 10:56:12.005457] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.436 [2024-12-16 10:56:12.005490] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.695 #24 NEW cov: 10815 ft: 16288 corp: 14/1386b lim: 120 exec/s: 24 rss: 69Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:09:13.695 [2024-12-16 10:56:12.120574] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.695 [2024-12-16 10:56:12.120607] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.695 #25 NEW cov: 10822 ft: 16326 corp: 15/1495b lim: 120 exec/s: 25 rss: 69Mb L: 109/120 MS: 1 ChangeBit- 00:09:13.695 [2024-12-16 10:56:12.225525] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.695 [2024-12-16 10:56:12.225564] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.695 #26 NEW cov: 10822 ft: 16586 corp: 16/1608b lim: 120 exec/s: 13 rss: 69Mb L: 113/120 MS: 1 CMP- DE: "\000\000\000\366"- 00:09:13.695 #26 DONE cov: 10822 ft: 16586 corp: 16/1608b lim: 120 exec/s: 13 rss: 69Mb 00:09:13.695 ###### Recommended dictionary. ###### 00:09:13.695 "\000\000\000\000" # Uses: 1 00:09:13.695 "\001\000\000\000\000\000\000\000" # Uses: 0 00:09:13.695 "\000\000\000\366" # Uses: 0 00:09:13.695 ###### End of recommended dictionary. 
###### 00:09:13.695 Done 26 runs in 2 second(s) 00:09:13.954 10:56:12 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:09:13.954 10:56:12 -- ../common.sh@72 -- # (( i++ )) 00:09:13.954 10:56:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:13.954 10:56:12 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:13.954 10:56:12 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:13.954 10:56:12 -- vfio/run.sh@23 -- # local timen=1 00:09:13.954 10:56:12 -- vfio/run.sh@24 -- # local core=0x1 00:09:13.954 10:56:12 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:13.954 10:56:12 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:13.954 10:56:12 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:13.954 10:56:12 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:13.954 10:56:12 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:13.954 10:56:12 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:13.954 10:56:12 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:13.954 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:13.954 10:56:12 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:14.213 [2024-12-16 10:56:12.594350] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:09:14.213 [2024-12-16 10:56:12.594432] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid665211 ] 00:09:14.213 EAL: No free 2048 kB hugepages reported on node 1 00:09:14.213 [2024-12-16 10:56:12.664807] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.213 [2024-12-16 10:56:12.700759] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:14.213 [2024-12-16 10:56:12.700898] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.473 INFO: Running with entropic power schedule (0xFF, 100). 00:09:14.473 INFO: Seed: 2968562363 00:09:14.473 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x267d60c, 0x26d0d8f), 00:09:14.473 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x26d0d90,0x2c085c0), 00:09:14.473 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:14.473 INFO: A corpus is not provided, starting from an empty corpus 00:09:14.473 #2 INITED exec/s: 0 rss: 60Mb 00:09:14.473 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:14.473 This may also happen if the target rejected all inputs we tried so far 00:09:14.473 [2024-12-16 10:56:12.983665] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.473 [2024-12-16 10:56:12.983704] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.048 NEW_FUNC[1/625]: 0x45b268 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:15.048 NEW_FUNC[2/625]: 0x45d818 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:15.048 #8 NEW cov: 10604 ft: 10740 corp: 2/10b lim: 90 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 CMP- DE: "\377\377\377\377\377\377\006m"- 00:09:15.048 [2024-12-16 10:56:13.447683] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.048 [2024-12-16 10:56:13.447724] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.048 NEW_FUNC[1/13]: 0x10da928 in nvmf_check_subsystem_active /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:4511 00:09:15.048 NEW_FUNC[2/13]: 0x1371228 in handle_cmd_rsp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:2498 00:09:15.048 #9 NEW cov: 10790 ft: 13364 corp: 3/20b lim: 90 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CopyPart- 00:09:15.048 [2024-12-16 10:56:13.656738] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.048 [2024-12-16 10:56:13.656775] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.378 NEW_FUNC[1/1]: 0x1938838 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:15.378 #10 NEW cov: 10807 ft: 14464 corp: 4/30b lim: 90 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:09:15.378 [2024-12-16 10:56:13.850485] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.378 [2024-12-16 10:56:13.850515] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.378 #11 NEW cov: 10807 ft: 14905 corp: 5/39b lim: 90 exec/s: 11 rss: 68Mb L: 9/10 MS: 1 ShuffleBytes- 00:09:15.655 [2024-12-16 10:56:14.047589] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.655 [2024-12-16 10:56:14.047628] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.655 #12 NEW cov: 10807 ft: 15308 corp: 6/49b lim: 90 exec/s: 12 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:09:15.655 [2024-12-16 10:56:14.246896] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.655 [2024-12-16 10:56:14.246924] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.915 #13 NEW cov: 10807 ft: 15400 corp: 7/59b lim: 90 exec/s: 13 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:09:15.915 [2024-12-16 10:56:14.444983] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.915 [2024-12-16 10:56:14.445012] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:16.174 #14 NEW cov: 10807 ft: 15997 corp: 8/114b lim: 90 exec/s: 14 rss: 68Mb L: 55/55 MS: 1 InsertRepeatedBytes- 00:09:16.174 [2024-12-16 10:56:14.642386] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 
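(The repeated *ERROR* pairs in this run are the expected negative path, not test failures: fuzz_vfio_user_set_msix above sends malformed IRQ-set messages, the vfio-user server logs the rejection via vfio_user_log, and vfio_user_read reports the command-8 failure — SET_IRQS in the vfio-user protocol — back to the target, which libFuzzer records as a survivable input rather than a crash.)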
00:09:16.174 [2024-12-16 10:56:14.642417] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:16.174 #15 NEW cov: 10814 ft: 16390 corp: 9/177b lim: 90 exec/s: 15 rss: 68Mb L: 63/63 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\006m"- 00:09:16.434 [2024-12-16 10:56:14.843293] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:16.434 [2024-12-16 10:56:14.843322] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:16.434 #16 pulse cov: 10814 ft: 16474 corp: 9/177b lim: 90 exec/s: 8 rss: 68Mb 00:09:16.434 #16 NEW cov: 10814 ft: 16474 corp: 10/233b lim: 90 exec/s: 8 rss: 68Mb L: 56/63 MS: 1 InsertByte- 00:09:16.434 #16 DONE cov: 10814 ft: 16474 corp: 10/233b lim: 90 exec/s: 8 rss: 68Mb 00:09:16.434 ###### Recommended dictionary. ###### 00:09:16.434 "\377\377\377\377\377\377\006m" # Uses: 1 00:09:16.434 ###### End of recommended dictionary. ###### 00:09:16.434 Done 16 runs in 2 second(s) 00:09:16.692 10:56:15 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:09:16.692 10:56:15 -- ../common.sh@72 -- # (( i++ )) 00:09:16.692 10:56:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:16.692 10:56:15 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:09:16.692 00:09:16.692 real 0m19.039s 00:09:16.692 user 0m26.783s 00:09:16.692 sys 0m1.854s 00:09:16.692 10:56:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:09:16.692 10:56:15 -- common/autotest_common.sh@10 -- # set +x 00:09:16.692 ************************************ 00:09:16.692 END TEST vfio_fuzz 00:09:16.692 ************************************ 00:09:16.692 00:09:16.692 real 1m21.673s 00:09:16.692 user 2m6.010s 00:09:16.692 sys 0m8.877s 00:09:16.692 10:56:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:09:16.692 10:56:15 -- common/autotest_common.sh@10 -- # set +x 00:09:16.692 ************************************ 00:09:16.692 END TEST llvm_fuzz 00:09:16.692 ************************************ 00:09:16.692 10:56:15 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:09:16.693 10:56:15 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:09:16.693 10:56:15 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:09:16.693 10:56:15 -- common/autotest_common.sh@722 -- # xtrace_disable 00:09:16.693 10:56:15 -- common/autotest_common.sh@10 -- # set +x 00:09:16.693 10:56:15 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:09:16.693 10:56:15 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:09:16.693 10:56:15 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:09:16.693 10:56:15 -- common/autotest_common.sh@10 -- # set +x 00:09:23.265 INFO: APP EXITING 00:09:23.265 INFO: killing all VMs 00:09:23.265 INFO: killing vhost app 00:09:23.265 INFO: EXIT DONE 00:09:26.560 Waiting for block devices as requested 00:09:26.560 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:26.560 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:26.560 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:26.560 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:26.560 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:26.560 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:26.819 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:26.819 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:26.819 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:27.078 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:27.078 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:27.078 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 
00:09:27.338 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:27.338 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:27.338 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:27.597 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:27.597 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:31.792 Cleaning 00:09:31.792 Removing: /dev/shm/spdk_tgt_trace.pid628059 00:09:31.792 Removing: /var/run/dpdk/spdk_pid625549 00:09:31.792 Removing: /var/run/dpdk/spdk_pid626846 00:09:31.792 Removing: /var/run/dpdk/spdk_pid628059 00:09:31.792 Removing: /var/run/dpdk/spdk_pid628855 00:09:31.792 Removing: /var/run/dpdk/spdk_pid629183 00:09:31.792 Removing: /var/run/dpdk/spdk_pid629515 00:09:31.792 Removing: /var/run/dpdk/spdk_pid629856 00:09:31.792 Removing: /var/run/dpdk/spdk_pid630196 00:09:31.792 Removing: /var/run/dpdk/spdk_pid630483 00:09:31.792 Removing: /var/run/dpdk/spdk_pid630765 00:09:31.792 Removing: /var/run/dpdk/spdk_pid631085 00:09:31.792 Removing: /var/run/dpdk/spdk_pid631811 00:09:31.792 Removing: /var/run/dpdk/spdk_pid634976 00:09:31.792 Removing: /var/run/dpdk/spdk_pid635446 00:09:31.792 Removing: /var/run/dpdk/spdk_pid635745 00:09:31.792 Removing: /var/run/dpdk/spdk_pid635762 00:09:31.792 Removing: /var/run/dpdk/spdk_pid636339 00:09:31.792 Removing: /var/run/dpdk/spdk_pid636544 00:09:31.792 Removing: /var/run/dpdk/spdk_pid636918 00:09:31.792 Removing: /var/run/dpdk/spdk_pid637181 00:09:31.792 Removing: /var/run/dpdk/spdk_pid637486 00:09:31.792 Removing: /var/run/dpdk/spdk_pid637598 00:09:31.792 Removing: /var/run/dpdk/spdk_pid637794 00:09:31.792 Removing: /var/run/dpdk/spdk_pid638061 00:09:31.792 Removing: /var/run/dpdk/spdk_pid638444 00:09:31.792 Removing: /var/run/dpdk/spdk_pid638730 00:09:31.792 Removing: /var/run/dpdk/spdk_pid639015 00:09:31.792 Removing: /var/run/dpdk/spdk_pid639244 00:09:31.792 Removing: /var/run/dpdk/spdk_pid639470 00:09:31.792 Removing: /var/run/dpdk/spdk_pid639665 00:09:31.792 Removing: /var/run/dpdk/spdk_pid639727 00:09:31.792 Removing: /var/run/dpdk/spdk_pid639995 00:09:31.792 Removing: /var/run/dpdk/spdk_pid640303 00:09:31.792 Removing: /var/run/dpdk/spdk_pid640558 00:09:31.792 Removing: /var/run/dpdk/spdk_pid640775 00:09:31.792 Removing: /var/run/dpdk/spdk_pid641102 00:09:31.792 Removing: /var/run/dpdk/spdk_pid641663 00:09:31.792 Removing: /var/run/dpdk/spdk_pid641982 00:09:31.792 Removing: /var/run/dpdk/spdk_pid642265 00:09:31.792 Removing: /var/run/dpdk/spdk_pid642535 00:09:31.792 Removing: /var/run/dpdk/spdk_pid642750 00:09:31.792 Removing: /var/run/dpdk/spdk_pid642901 00:09:31.792 Removing: /var/run/dpdk/spdk_pid643125 00:09:31.792 Removing: /var/run/dpdk/spdk_pid643393 00:09:31.792 Removing: /var/run/dpdk/spdk_pid643680 00:09:31.793 Removing: /var/run/dpdk/spdk_pid643948 00:09:31.793 Removing: /var/run/dpdk/spdk_pid644234 00:09:31.793 Removing: /var/run/dpdk/spdk_pid644386 00:09:31.793 Removing: /var/run/dpdk/spdk_pid644561 00:09:31.793 Removing: /var/run/dpdk/spdk_pid644810 00:09:31.793 Removing: /var/run/dpdk/spdk_pid645095 00:09:31.793 Removing: /var/run/dpdk/spdk_pid645367 00:09:31.793 Removing: /var/run/dpdk/spdk_pid645652 00:09:31.793 Removing: /var/run/dpdk/spdk_pid645852 00:09:31.793 Removing: /var/run/dpdk/spdk_pid646037 00:09:31.793 Removing: /var/run/dpdk/spdk_pid646229 00:09:31.793 Removing: /var/run/dpdk/spdk_pid646512 00:09:31.793 Removing: /var/run/dpdk/spdk_pid646778 00:09:31.793 Removing: /var/run/dpdk/spdk_pid647065 00:09:31.793 Removing: /var/run/dpdk/spdk_pid647333 00:09:31.793 Removing: /var/run/dpdk/spdk_pid647526 
00:09:31.793 Removing: /var/run/dpdk/spdk_pid647671 00:09:31.793 Removing: /var/run/dpdk/spdk_pid647928 00:09:31.793 Removing: /var/run/dpdk/spdk_pid648201 00:09:31.793 Removing: /var/run/dpdk/spdk_pid648488 00:09:31.793 Removing: /var/run/dpdk/spdk_pid648760 00:09:31.793 Removing: /var/run/dpdk/spdk_pid649053 00:09:31.793 Removing: /var/run/dpdk/spdk_pid649197 00:09:31.793 Removing: /var/run/dpdk/spdk_pid649375 00:09:31.793 Removing: /var/run/dpdk/spdk_pid649628 00:09:31.793 Removing: /var/run/dpdk/spdk_pid649912 00:09:31.793 Removing: /var/run/dpdk/spdk_pid650079 00:09:31.793 Removing: /var/run/dpdk/spdk_pid650328 00:09:31.793 Removing: /var/run/dpdk/spdk_pid651076 00:09:31.793 Removing: /var/run/dpdk/spdk_pid651456 00:09:31.793 Removing: /var/run/dpdk/spdk_pid651910 00:09:31.793 Removing: /var/run/dpdk/spdk_pid652451 00:09:31.793 Removing: /var/run/dpdk/spdk_pid652746 00:09:31.793 Removing: /var/run/dpdk/spdk_pid653283 00:09:31.793 Removing: /var/run/dpdk/spdk_pid653707 00:09:31.793 Removing: /var/run/dpdk/spdk_pid654115 00:09:31.793 Removing: /var/run/dpdk/spdk_pid654648 00:09:31.793 Removing: /var/run/dpdk/spdk_pid654946 00:09:31.793 Removing: /var/run/dpdk/spdk_pid655484 00:09:31.793 Removing: /var/run/dpdk/spdk_pid655917 00:09:31.793 Removing: /var/run/dpdk/spdk_pid656317 00:09:31.793 Removing: /var/run/dpdk/spdk_pid656854 00:09:31.793 Removing: /var/run/dpdk/spdk_pid657165 00:09:31.793 Removing: /var/run/dpdk/spdk_pid657690 00:09:31.793 Removing: /var/run/dpdk/spdk_pid658184 00:09:31.793 Removing: /var/run/dpdk/spdk_pid658525 00:09:31.793 Removing: /var/run/dpdk/spdk_pid659062 00:09:31.793 Removing: /var/run/dpdk/spdk_pid659442 00:09:31.793 Removing: /var/run/dpdk/spdk_pid659899 00:09:31.793 Removing: /var/run/dpdk/spdk_pid660439 00:09:31.793 Removing: /var/run/dpdk/spdk_pid660744 00:09:31.793 Removing: /var/run/dpdk/spdk_pid661277 00:09:31.793 Removing: /var/run/dpdk/spdk_pid661669 00:09:31.793 Removing: /var/run/dpdk/spdk_pid662234 00:09:31.793 Removing: /var/run/dpdk/spdk_pid662746 00:09:31.793 Removing: /var/run/dpdk/spdk_pid663291 00:09:31.793 Removing: /var/run/dpdk/spdk_pid663715 00:09:31.793 Removing: /var/run/dpdk/spdk_pid664124 00:09:31.793 Removing: /var/run/dpdk/spdk_pid664670 00:09:31.793 Removing: /var/run/dpdk/spdk_pid665211 00:09:31.793 Clean 00:09:31.793 killing process with pid 577658 00:09:35.085 killing process with pid 577655 00:09:35.085 killing process with pid 577657 00:09:35.085 killing process with pid 577656 00:09:35.085 10:56:33 -- common/autotest_common.sh@1446 -- # return 0 00:09:35.085 10:56:33 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:09:35.085 10:56:33 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:35.085 10:56:33 -- common/autotest_common.sh@10 -- # set +x 00:09:35.085 10:56:33 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:09:35.085 10:56:33 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:35.085 10:56:33 -- common/autotest_common.sh@10 -- # set +x 00:09:35.085 10:56:33 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:35.085 10:56:33 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:35.085 10:56:33 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:35.085 10:56:33 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:09:35.085 10:56:33 -- spdk/autotest.sh@383 -- # hostname 00:09:35.085 10:56:33 -- spdk/autotest.sh@383 -- # 
lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:35.344 geninfo: WARNING: invalid characters removed from testname! 00:09:36.283 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:09:36.283 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:09:36.283 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:09:46.272 10:56:44 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:54.399 10:56:51 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:57.692 10:56:56 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:02.972 10:57:00 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:07.169 10:57:05 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:12.509 10:57:10 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:16.727 10:57:15 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:10:16.727 10:57:15 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:10:16.727 10:57:15 -- common/autotest_common.sh@1690 -- $ lcov --version 00:10:16.727 10:57:15 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:10:16.727 10:57:15 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:10:16.727 10:57:15 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:10:16.727 10:57:15 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:10:16.727 10:57:15 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:10:16.727 10:57:15 -- scripts/common.sh@335 -- $ IFS=.-: 00:10:16.727 10:57:15 -- scripts/common.sh@335 -- $ read -ra ver1 00:10:16.728 10:57:15 -- scripts/common.sh@336 -- $ IFS=.-: 00:10:16.728 10:57:15 -- scripts/common.sh@336 -- $ read -ra ver2 00:10:16.728 10:57:15 -- scripts/common.sh@337 -- $ local 'op=<' 00:10:16.728 10:57:15 -- scripts/common.sh@339 -- $ ver1_l=2 00:10:16.728 10:57:15 -- scripts/common.sh@340 -- $ ver2_l=1 00:10:16.728 10:57:15 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:10:16.728 10:57:15 -- scripts/common.sh@343 -- $ case "$op" in 00:10:16.728 10:57:15 -- scripts/common.sh@344 -- $ : 1 00:10:16.728 10:57:15 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:10:16.728 10:57:15 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:10:16.728 10:57:15 -- scripts/common.sh@364 -- $ decimal 1 00:10:16.728 10:57:15 -- scripts/common.sh@352 -- $ local d=1 00:10:16.728 10:57:15 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:10:16.728 10:57:15 -- scripts/common.sh@354 -- $ echo 1 00:10:16.728 10:57:15 -- scripts/common.sh@364 -- $ ver1[v]=1 00:10:16.728 10:57:15 -- scripts/common.sh@365 -- $ decimal 2 00:10:16.728 10:57:15 -- scripts/common.sh@352 -- $ local d=2 00:10:16.728 10:57:15 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:10:16.728 10:57:15 -- scripts/common.sh@354 -- $ echo 2 00:10:16.728 10:57:15 -- scripts/common.sh@365 -- $ ver2[v]=2 00:10:16.728 10:57:15 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:10:16.728 10:57:15 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:10:16.728 10:57:15 -- scripts/common.sh@367 -- $ return 0 00:10:16.728 10:57:15 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:10:16.728 10:57:15 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:10:16.728 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:16.728 --rc genhtml_branch_coverage=1 00:10:16.728 --rc genhtml_function_coverage=1 00:10:16.728 --rc genhtml_legend=1 00:10:16.728 --rc geninfo_all_blocks=1 00:10:16.728 --rc geninfo_unexecuted_blocks=1 00:10:16.728 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:16.728 ' 00:10:16.728 10:57:15 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:10:16.728 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:16.728 --rc genhtml_branch_coverage=1 00:10:16.728 --rc genhtml_function_coverage=1 00:10:16.728 --rc genhtml_legend=1 00:10:16.728 --rc geninfo_all_blocks=1 00:10:16.728 --rc geninfo_unexecuted_blocks=1 00:10:16.728 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:16.728 ' 00:10:16.728 10:57:15 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:10:16.728 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:16.728 --rc genhtml_branch_coverage=1 00:10:16.728 --rc genhtml_function_coverage=1 00:10:16.728 --rc genhtml_legend=1 00:10:16.728 --rc geninfo_all_blocks=1 00:10:16.728 --rc geninfo_unexecuted_blocks=1 00:10:16.728 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:16.728 ' 00:10:16.728 10:57:15 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:10:16.728 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:16.728 --rc genhtml_branch_coverage=1 00:10:16.728 --rc genhtml_function_coverage=1 00:10:16.728 --rc genhtml_legend=1 00:10:16.728 --rc geninfo_all_blocks=1 00:10:16.728 --rc geninfo_unexecuted_blocks=1 00:10:16.728 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:16.728 ' 00:10:16.728 10:57:15 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:10:16.728 10:57:15 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:10:16.728 10:57:15 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:16.728 10:57:15 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:16.728 10:57:15 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:16.728 10:57:15 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:16.728 10:57:15 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:16.728 10:57:15 -- paths/export.sh@5 -- $ export PATH 00:10:16.728 10:57:15 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:16.728 10:57:15 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:10:16.729 10:57:15 -- common/autobuild_common.sh@440 -- $ date +%s 00:10:16.729 10:57:15 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1734343035.XXXXXX 00:10:16.729 10:57:15 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1734343035.Ie9G8L 00:10:16.729 10:57:15 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:10:16.729 10:57:15 -- common/autobuild_common.sh@446 -- $ '[' -n v23.11 ']' 00:10:16.729 10:57:15 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:16.729 10:57:15 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:10:16.729 10:57:15 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:10:16.729 10:57:15 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:10:16.729 10:57:15 -- common/autobuild_common.sh@456 -- $ get_config_params 00:10:16.729 10:57:15 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:10:16.729 10:57:15 -- common/autotest_common.sh@10 -- $ set +x 00:10:16.729 10:57:15 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user' 00:10:16.729 10:57:15 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:10:16.729 10:57:15 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:16.729 10:57:15 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:10:16.729 10:57:15 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:10:16.729 10:57:15 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:10:16.729 10:57:15 -- spdk/autopackage.sh@19 -- $ timing_finish 00:10:16.729 10:57:15 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:10:16.729 10:57:15 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:10:16.729 10:57:15 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:10:16.729 10:57:15 -- spdk/autopackage.sh@20 -- $ exit 0 00:10:16.729 + [[ -n 522042 ]] 00:10:16.729 + sudo kill 522042 00:10:16.748 [Pipeline] } 00:10:16.762 [Pipeline] // stage 00:10:16.765 [Pipeline] } 00:10:16.773 [Pipeline] // timeout 00:10:16.776 [Pipeline] } 00:10:16.784 [Pipeline] // catchError 00:10:16.787 [Pipeline] } 00:10:16.796 [Pipeline] // wrap 00:10:16.799 [Pipeline] } 00:10:16.806 [Pipeline] // catchError 00:10:16.811 [Pipeline] stage 00:10:16.812 [Pipeline] { (Epilogue) 00:10:16.820 [Pipeline] catchError 00:10:16.821 [Pipeline] { 00:10:16.829 [Pipeline] echo 00:10:16.830 Cleanup processes 00:10:16.833 [Pipeline] sh 00:10:17.114 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:17.114 675272 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:17.129 [Pipeline] sh 00:10:17.418 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:17.418 ++ grep -v 'sudo pgrep' 00:10:17.418 ++ awk '{print $1}' 00:10:17.418 + sudo kill -9 00:10:17.418 + true 00:10:17.430 [Pipeline] sh 00:10:17.717 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:10:17.717 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:17.717 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:19.096 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:29.091 [Pipeline] sh 00:10:29.377 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:29.377 Artifacts sizes are good 00:10:29.391 [Pipeline] archiveArtifacts 00:10:29.398 Archiving artifacts 00:10:29.548 [Pipeline] sh 00:10:29.895 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:29.924 [Pipeline] cleanWs 00:10:29.935 [WS-CLEANUP] Deleting project workspace... 00:10:29.935 [WS-CLEANUP] Deferred wipeout is used... 00:10:29.942 [WS-CLEANUP] done 00:10:29.944 [Pipeline] } 00:10:29.961 [Pipeline] // catchError 00:10:29.973 [Pipeline] sh 00:10:30.256 + logger -p user.info -t JENKINS-CI 00:10:30.266 [Pipeline] } 00:10:30.279 [Pipeline] // stage 00:10:30.284 [Pipeline] } 00:10:30.298 [Pipeline] // node 00:10:30.303 [Pipeline] End of Pipeline 00:10:30.362 Finished: SUCCESS
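[Editor's note] The coverage-report assembly traced above (the lcov capture at 10:56:44 through the filter passes at 10:57:15) condenses to the following standalone sketch. The paths, the llvm-gcov.sh wrapper, the test name, and the filter patterns are taken verbatim from the trace; the script itself is an illustrative reconstruction, not the exact autotest.sh source.

  #!/usr/bin/env bash
  set -euo pipefail

  WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest
  OUT=$WORKSPACE/spdk/../output
  # Option string is intentionally left unquoted below so it word-splits
  # into individual lcov arguments.
  LCOV_OPTS="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 \
    --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 \
    --rc genhtml_legend=1 --rc geninfo_all_blocks=1 \
    --rc geninfo_unexecuted_blocks=1 \
    --gcov-tool $WORKSPACE/spdk/test/fuzz/llvm/llvm-gcov.sh"

  # 1. Capture the counters produced by the test run.
  lcov $LCOV_OPTS -q -c --no-external -d "$WORKSPACE/spdk" \
    -t spdk-wfp-20 -o "$OUT/cov_test.info"

  # 2. Merge the pre-test baseline with the test capture.
  lcov $LCOV_OPTS -q -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" \
    -o "$OUT/cov_total.info"

  # 3. Strip paths that should not count toward SPDK coverage.
  #    (The real run additionally passes --ignore-errors unused on the
  #    '/usr/*' pass; omitted here for brevity.)
  for pat in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' \
             '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
    lcov $LCOV_OPTS -q -r "$OUT/cov_total.info" "$pat" \
      -o "$OUT/cov_total.info"
  done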
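[Editor's note] The xtrace block at 10:57:15 (cmp_versions in scripts/common.sh) gates the lcov option spelling on the installed lcov version: the run detects lcov 1.15, confirms 1.15 < 2, and therefore keeps the 1.x-style --rc lcov_branch_coverage / lcov_function_coverage switches. A minimal re-derivation of that gate follows; the helper name lt and the version probe match the trace, while the loop body is a simplified sketch that assumes purely numeric version components (the real script routes each component through a decimal helper).

  # lt A B -> exit 0 when version A sorts strictly before version B
  lt() {
    local -a ver1 ver2
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$2"
    local v max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < max; v++ )); do
      # Missing components compare as 0, so 1.15 vs 2 compares 1 vs 2 first.
      if (( ${ver1[v]:-0} > ${ver2[v]:-0} )); then return 1; fi
      if (( ${ver1[v]:-0} < ${ver2[v]:-0} )); then return 0; fi
    done
    return 1  # versions are equal
  }

  if lt "$(lcov --version | awk '{print $NF}')" 2; then
    # lcov 1.x spelling of the branch/function-coverage switches,
    # exactly as exported into LCOV_OPTS in the trace above.
    lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
  fi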