00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v23.11" build number 620 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3286 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.001 Started by timer 00:00:00.030 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.031 The recommended git tool is: git 00:00:00.031 using credential 00000000-0000-0000-0000-000000000002 00:00:00.033 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.048 Fetching changes from the remote Git repository 00:00:00.051 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.065 Using shallow fetch with depth 1 00:00:00.065 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.065 > git --version # timeout=10 00:00:00.095 > git --version # 'git version 2.39.2' 00:00:00.095 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.141 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.141 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:03.199 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.209 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.220 Checking out Revision 1c6ed56008363df82da0fcec030d6d5a1f7bd340 (FETCH_HEAD) 00:00:03.220 > git config core.sparsecheckout # timeout=10 00:00:03.229 > git read-tree -mu HEAD # timeout=10 00:00:03.246 > git checkout -f 1c6ed56008363df82da0fcec030d6d5a1f7bd340 # timeout=5 00:00:03.269 Commit message: "spdk-abi-per-patch: pass revision to subbuild" 00:00:03.269 > git rev-list --no-walk 1c6ed56008363df82da0fcec030d6d5a1f7bd340 # timeout=10 00:00:03.382 [Pipeline] Start of Pipeline 00:00:03.398 [Pipeline] library 00:00:03.400 Loading library shm_lib@master 00:00:03.400 Library shm_lib@master is cached. Copying from home. 00:00:03.417 [Pipeline] node 00:00:03.446 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.448 [Pipeline] { 00:00:03.460 [Pipeline] catchError 00:00:03.461 [Pipeline] { 00:00:03.471 [Pipeline] wrap 00:00:03.478 [Pipeline] { 00:00:03.485 [Pipeline] stage 00:00:03.486 [Pipeline] { (Prologue) 00:00:03.661 [Pipeline] sh 00:00:03.942 + logger -p user.info -t JENKINS-CI 00:00:03.964 [Pipeline] echo 00:00:03.965 Node: WFP20 00:00:03.985 [Pipeline] sh 00:00:04.277 [Pipeline] setCustomBuildProperty 00:00:04.289 [Pipeline] echo 00:00:04.291 Cleanup processes 00:00:04.294 [Pipeline] sh 00:00:04.571 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.571 1932860 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.583 [Pipeline] sh 00:00:04.864 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.864 ++ grep -v 'sudo pgrep' 00:00:04.864 ++ awk '{print $1}' 00:00:04.864 + sudo kill -9 00:00:04.864 + true 00:00:04.878 [Pipeline] cleanWs 00:00:04.885 [WS-CLEANUP] Deleting project workspace... 00:00:04.886 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.892 [WS-CLEANUP] done 00:00:04.896 [Pipeline] setCustomBuildProperty 00:00:04.911 [Pipeline] sh 00:00:05.191 + sudo git config --global --replace-all safe.directory '*' 00:00:05.282 [Pipeline] httpRequest 00:00:05.299 [Pipeline] echo 00:00:05.300 Sorcerer 10.211.164.101 is alive 00:00:05.306 [Pipeline] httpRequest 00:00:05.310 HttpMethod: GET 00:00:05.311 URL: http://10.211.164.101/packages/jbp_1c6ed56008363df82da0fcec030d6d5a1f7bd340.tar.gz 00:00:05.311 Sending request to url: http://10.211.164.101/packages/jbp_1c6ed56008363df82da0fcec030d6d5a1f7bd340.tar.gz 00:00:05.324 Response Code: HTTP/1.1 200 OK 00:00:05.325 Success: Status code 200 is in the accepted range: 200,404 00:00:05.325 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_1c6ed56008363df82da0fcec030d6d5a1f7bd340.tar.gz 00:00:05.775 [Pipeline] sh 00:00:06.052 + tar --no-same-owner -xf jbp_1c6ed56008363df82da0fcec030d6d5a1f7bd340.tar.gz 00:00:06.067 [Pipeline] httpRequest 00:00:06.086 [Pipeline] echo 00:00:06.088 Sorcerer 10.211.164.101 is alive 00:00:06.099 [Pipeline] httpRequest 00:00:06.103 HttpMethod: GET 00:00:06.103 URL: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:06.104 Sending request to url: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:06.109 Response Code: HTTP/1.1 200 OK 00:00:06.110 Success: Status code 200 is in the accepted range: 200,404 00:00:06.110 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:26.473 [Pipeline] sh 00:00:26.754 + tar --no-same-owner -xf spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:29.301 [Pipeline] sh 00:00:29.584 + git -C spdk log --oneline -n5 00:00:29.584 4b94202c6 lib/event: Bug fix for framework_set_scheduler 00:00:29.584 507e9ba07 nvme: add lock_depth for ctrlr_lock 00:00:29.584 62fda7b5f nvme: check pthread_mutex_destroy() return value 00:00:29.584 e03c164a1 nvme: add nvme_ctrlr_lock 00:00:29.584 d61f89a86 nvme/cuse: Add ctrlr_lock for cuse register and unregister 00:00:29.602 [Pipeline] withCredentials 00:00:29.611 > git --version # timeout=10 00:00:29.622 > git --version # 'git version 2.39.2' 00:00:29.636 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:00:29.639 [Pipeline] { 00:00:29.649 [Pipeline] retry 00:00:29.651 [Pipeline] { 00:00:29.669 [Pipeline] sh 00:00:29.950 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11 00:00:30.220 [Pipeline] } 00:00:30.238 [Pipeline] // retry 00:00:30.243 [Pipeline] } 00:00:30.257 [Pipeline] // withCredentials 00:00:30.267 [Pipeline] httpRequest 00:00:30.278 [Pipeline] echo 00:00:30.280 Sorcerer 10.211.164.101 is alive 00:00:30.285 [Pipeline] httpRequest 00:00:30.289 HttpMethod: GET 00:00:30.289 URL: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:00:30.289 Sending request to url: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:00:30.291 Response Code: HTTP/1.1 200 OK 00:00:30.291 Success: Status code 200 is in the accepted range: 200,404 00:00:30.292 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:00:34.996 [Pipeline] sh 00:00:35.278 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:00:36.664 [Pipeline] sh 00:00:36.945 + git -C dpdk log --oneline -n5 00:00:36.945 eeb0605f11 version: 23.11.0 00:00:36.945 238778122a doc: 
update release notes for 23.11 00:00:36.945 46aa6b3cfc doc: fix description of RSS features 00:00:36.945 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:00:36.945 7e421ae345 devtools: support skipping forbid rule check 00:00:36.967 [Pipeline] } 00:00:36.984 [Pipeline] // stage 00:00:36.994 [Pipeline] stage 00:00:36.996 [Pipeline] { (Prepare) 00:00:37.025 [Pipeline] writeFile 00:00:37.044 [Pipeline] sh 00:00:37.326 + logger -p user.info -t JENKINS-CI 00:00:37.338 [Pipeline] sh 00:00:37.621 + logger -p user.info -t JENKINS-CI 00:00:37.633 [Pipeline] sh 00:00:37.920 + cat autorun-spdk.conf 00:00:37.920 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:37.920 SPDK_RUN_UBSAN=1 00:00:37.920 SPDK_TEST_FUZZER=1 00:00:37.920 SPDK_TEST_FUZZER_SHORT=1 00:00:37.920 SPDK_TEST_NATIVE_DPDK=v23.11 00:00:37.920 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:00:37.927 RUN_NIGHTLY=1 00:00:37.932 [Pipeline] readFile 00:00:37.960 [Pipeline] withEnv 00:00:37.963 [Pipeline] { 00:00:37.978 [Pipeline] sh 00:00:38.263 + set -ex 00:00:38.263 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:00:38.263 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:38.263 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:38.263 ++ SPDK_RUN_UBSAN=1 00:00:38.263 ++ SPDK_TEST_FUZZER=1 00:00:38.263 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:38.263 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:00:38.263 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:00:38.263 ++ RUN_NIGHTLY=1 00:00:38.263 + case $SPDK_TEST_NVMF_NICS in 00:00:38.263 + DRIVERS= 00:00:38.263 + [[ -n '' ]] 00:00:38.263 + exit 0 00:00:38.272 [Pipeline] } 00:00:38.292 [Pipeline] // withEnv 00:00:38.298 [Pipeline] } 00:00:38.316 [Pipeline] // stage 00:00:38.327 [Pipeline] catchError 00:00:38.329 [Pipeline] { 00:00:38.347 [Pipeline] timeout 00:00:38.347 Timeout set to expire in 30 min 00:00:38.349 [Pipeline] { 00:00:38.365 [Pipeline] stage 00:00:38.367 [Pipeline] { (Tests) 00:00:38.382 [Pipeline] sh 00:00:38.666 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:38.666 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:38.666 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:00:38.666 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:00:38.666 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:38.666 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:38.666 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:00:38.666 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:38.666 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:38.666 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:38.666 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:00:38.666 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:38.666 + source /etc/os-release 00:00:38.666 ++ NAME='Fedora Linux' 00:00:38.666 ++ VERSION='38 (Cloud Edition)' 00:00:38.667 ++ ID=fedora 00:00:38.667 ++ VERSION_ID=38 00:00:38.667 ++ VERSION_CODENAME= 00:00:38.667 ++ PLATFORM_ID=platform:f38 00:00:38.667 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:38.667 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:38.667 ++ LOGO=fedora-logo-icon 00:00:38.667 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:38.667 ++ HOME_URL=https://fedoraproject.org/ 00:00:38.667 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:38.667 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:38.667 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:38.667 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:38.667 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:38.667 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:38.667 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:38.667 ++ SUPPORT_END=2024-05-14 00:00:38.667 ++ VARIANT='Cloud Edition' 00:00:38.667 ++ VARIANT_ID=cloud 00:00:38.667 + uname -a 00:00:38.667 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:00:38.667 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:00:41.248 Hugepages 00:00:41.248 node hugesize free / total 00:00:41.248 node0 1048576kB 0 / 0 00:00:41.248 node0 2048kB 0 / 0 00:00:41.248 node1 1048576kB 0 / 0 00:00:41.248 node1 2048kB 0 / 0 00:00:41.248 00:00:41.248 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:41.248 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:41.248 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:41.248 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:41.248 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:00:41.248 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:41.248 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:41.248 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:41.248 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:41.248 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:41.248 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:00:41.248 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:41.248 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:41.248 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:00:41.248 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:41.248 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:41.248 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:41.248 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:00:41.248 + rm -f /tmp/spdk-ld-path 00:00:41.248 + source autorun-spdk.conf 00:00:41.248 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:41.248 ++ SPDK_RUN_UBSAN=1 00:00:41.248 ++ SPDK_TEST_FUZZER=1 00:00:41.248 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:41.248 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:00:41.248 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:00:41.248 ++ RUN_NIGHTLY=1 00:00:41.248 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:41.248 + [[ -n '' ]] 00:00:41.248 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:41.248 + for M in /var/spdk/build-*-manifest.txt 00:00:41.248 + [[ -f 
/var/spdk/build-pkg-manifest.txt ]] 00:00:41.248 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:41.248 + for M in /var/spdk/build-*-manifest.txt 00:00:41.248 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:41.248 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:41.248 ++ uname 00:00:41.248 + [[ Linux == \L\i\n\u\x ]] 00:00:41.248 + sudo dmesg -T 00:00:41.248 + sudo dmesg --clear 00:00:41.248 + dmesg_pid=1934331 00:00:41.248 + [[ Fedora Linux == FreeBSD ]] 00:00:41.248 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:41.248 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:41.248 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:41.248 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:00:41.248 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:00:41.248 + [[ -x /usr/src/fio-static/fio ]] 00:00:41.248 + export FIO_BIN=/usr/src/fio-static/fio 00:00:41.248 + FIO_BIN=/usr/src/fio-static/fio 00:00:41.248 + sudo dmesg -Tw 00:00:41.248 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:41.248 + [[ ! -v VFIO_QEMU_BIN ]] 00:00:41.248 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:41.248 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:41.248 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:41.248 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:41.248 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:41.248 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:41.248 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:41.248 Test configuration: 00:00:41.248 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:41.248 SPDK_RUN_UBSAN=1 00:00:41.248 SPDK_TEST_FUZZER=1 00:00:41.248 SPDK_TEST_FUZZER_SHORT=1 00:00:41.248 SPDK_TEST_NATIVE_DPDK=v23.11 00:00:41.248 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:00:41.248 RUN_NIGHTLY=1 11:24:10 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:00:41.248 11:24:10 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:41.248 11:24:10 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:41.248 11:24:10 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:41.248 11:24:10 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:41.248 11:24:10 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:41.248 11:24:10 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:41.248 11:24:10 -- paths/export.sh@5 -- $ export PATH 00:00:41.248 11:24:10 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:41.248 11:24:10 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:00:41.248 11:24:10 -- common/autobuild_common.sh@435 -- $ date +%s 00:00:41.248 11:24:10 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1721553850.XXXXXX 00:00:41.248 11:24:10 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1721553850.1u71IQ 00:00:41.248 11:24:10 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:00:41.248 11:24:10 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']' 00:00:41.248 11:24:10 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:00:41.248 11:24:10 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:00:41.248 11:24:10 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:41.248 11:24:10 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:41.248 11:24:10 -- common/autobuild_common.sh@451 -- $ get_config_params 00:00:41.248 11:24:10 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:00:41.248 11:24:10 -- common/autotest_common.sh@10 -- $ set +x 00:00:41.508 11:24:10 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:00:41.508 11:24:10 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:41.508 11:24:10 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:41.508 11:24:10 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:41.508 11:24:10 -- spdk/autobuild.sh@16 -- $ date -u 00:00:41.508 Sun Jul 21 09:24:10 AM UTC 2024 00:00:41.508 11:24:10 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:41.508 LTS-59-g4b94202c6 00:00:41.508 11:24:10 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:41.508 11:24:10 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:41.508 11:24:10 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:41.508 11:24:10 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:00:41.508 11:24:10 -- common/autotest_common.sh@1083 -- $ 
xtrace_disable 00:00:41.508 11:24:10 -- common/autotest_common.sh@10 -- $ set +x 00:00:41.508 ************************************ 00:00:41.508 START TEST ubsan 00:00:41.508 ************************************ 00:00:41.508 11:24:10 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan' 00:00:41.508 using ubsan 00:00:41.508 00:00:41.508 real 0m0.000s 00:00:41.508 user 0m0.000s 00:00:41.508 sys 0m0.000s 00:00:41.508 11:24:10 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:00:41.508 11:24:10 -- common/autotest_common.sh@10 -- $ set +x 00:00:41.508 ************************************ 00:00:41.508 END TEST ubsan 00:00:41.508 ************************************ 00:00:41.508 11:24:10 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']' 00:00:41.508 11:24:10 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:00:41.508 11:24:10 -- common/autobuild_common.sh@427 -- $ run_test build_native_dpdk _build_native_dpdk 00:00:41.508 11:24:10 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:00:41.508 11:24:10 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:00:41.508 11:24:10 -- common/autotest_common.sh@10 -- $ set +x 00:00:41.508 ************************************ 00:00:41.508 START TEST build_native_dpdk 00:00:41.508 ************************************ 00:00:41.508 11:24:10 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk 00:00:41.508 11:24:10 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:00:41.508 11:24:10 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:00:41.508 11:24:10 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:00:41.508 11:24:10 -- common/autobuild_common.sh@51 -- $ local compiler 00:00:41.508 11:24:10 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:00:41.508 11:24:10 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:00:41.508 11:24:10 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:00:41.508 11:24:10 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:00:41.508 11:24:10 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:00:41.508 11:24:10 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:00:41.508 11:24:10 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:00:41.508 11:24:10 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:00:41.508 11:24:10 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:00:41.508 11:24:10 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:00:41.508 11:24:10 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:00:41.508 11:24:10 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:41.508 11:24:10 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:00:41.508 eeb0605f11 version: 23.11.0 00:00:41.508 238778122a doc: update release notes for 23.11 00:00:41.508 46aa6b3cfc doc: fix description of RSS features 00:00:41.508 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:00:41.508 7e421ae345 devtools: support skipping forbid rule check 00:00:41.508 11:24:10 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:00:41.508 11:24:10 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:00:41.508 11:24:10 -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0 00:00:41.508 11:24:10 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:00:41.508 11:24:10 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:00:41.508 11:24:10 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:00:41.508 11:24:10 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:00:41.508 11:24:10 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:00:41.508 11:24:10 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:00:41.508 11:24:10 -- common/autobuild_common.sh@168 -- $ uname -s 00:00:41.508 11:24:10 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:00:41.508 11:24:10 -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0 00:00:41.508 11:24:10 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 21.11.0 00:00:41.508 11:24:10 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:00:41.508 11:24:10 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:00:41.508 11:24:10 -- scripts/common.sh@335 -- $ IFS=.-: 00:00:41.508 11:24:10 -- scripts/common.sh@335 -- $ read -ra ver1 00:00:41.508 11:24:10 -- scripts/common.sh@336 -- $ IFS=.-: 00:00:41.508 11:24:10 -- scripts/common.sh@336 -- $ read -ra ver2 00:00:41.508 11:24:10 -- scripts/common.sh@337 -- $ local 'op=<' 00:00:41.508 11:24:10 -- scripts/common.sh@339 -- $ ver1_l=3 00:00:41.508 11:24:10 -- scripts/common.sh@340 -- $ ver2_l=3 00:00:41.508 11:24:10 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:00:41.508 11:24:10 -- scripts/common.sh@343 -- $ case "$op" in 00:00:41.508 11:24:10 -- scripts/common.sh@344 -- $ : 1 00:00:41.508 11:24:10 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:00:41.508 11:24:10 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:00:41.508 11:24:10 -- scripts/common.sh@364 -- $ decimal 23 00:00:41.508 11:24:10 -- scripts/common.sh@352 -- $ local d=23 00:00:41.508 11:24:10 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:00:41.508 11:24:10 -- scripts/common.sh@354 -- $ echo 23 00:00:41.508 11:24:10 -- scripts/common.sh@364 -- $ ver1[v]=23 00:00:41.508 11:24:10 -- scripts/common.sh@365 -- $ decimal 21 00:00:41.508 11:24:10 -- scripts/common.sh@352 -- $ local d=21 00:00:41.508 11:24:10 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:00:41.508 11:24:10 -- scripts/common.sh@354 -- $ echo 21 00:00:41.508 11:24:10 -- scripts/common.sh@365 -- $ ver2[v]=21 00:00:41.508 11:24:10 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:00:41.508 11:24:10 -- scripts/common.sh@366 -- $ return 1 00:00:41.508 11:24:10 -- common/autobuild_common.sh@173 -- $ patch -p1 00:00:41.508 patching file config/rte_config.h 00:00:41.508 Hunk #1 succeeded at 60 (offset 1 line). 00:00:41.508 11:24:10 -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false 00:00:41.508 11:24:10 -- common/autobuild_common.sh@178 -- $ uname -s 00:00:41.508 11:24:10 -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']' 00:00:41.508 11:24:10 -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:00:41.508 11:24:10 -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:00:46.776 The Meson build system 00:00:46.776 Version: 1.3.1 00:00:46.776 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:00:46.776 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:00:46.776 Build type: native build 00:00:46.776 Program cat found: YES (/usr/bin/cat) 00:00:46.776 Project name: DPDK 00:00:46.776 Project version: 23.11.0 00:00:46.776 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:00:46.776 C linker for the host machine: gcc ld.bfd 2.39-16 00:00:46.776 Host machine cpu family: x86_64 00:00:46.776 Host machine cpu: x86_64 00:00:46.776 Message: ## Building in Developer Mode ## 00:00:46.776 Program pkg-config found: YES (/usr/bin/pkg-config) 00:00:46.776 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:00:46.776 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:00:46.776 Program python3 found: YES (/usr/bin/python3) 00:00:46.776 Program cat found: YES (/usr/bin/cat) 00:00:46.776 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:00:46.776 Compiler for C supports arguments -march=native: YES 00:00:46.776 Checking for size of "void *" : 8 00:00:46.776 Checking for size of "void *" : 8 (cached) 00:00:46.776 Library m found: YES 00:00:46.776 Library numa found: YES 00:00:46.776 Has header "numaif.h" : YES 00:00:46.776 Library fdt found: NO 00:00:46.776 Library execinfo found: NO 00:00:46.776 Has header "execinfo.h" : YES 00:00:46.776 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:00:46.776 Run-time dependency libarchive found: NO (tried pkgconfig) 00:00:46.776 Run-time dependency libbsd found: NO (tried pkgconfig) 00:00:46.776 Run-time dependency jansson found: NO (tried pkgconfig) 00:00:46.776 Run-time dependency openssl found: YES 3.0.9 00:00:46.776 Run-time dependency libpcap found: YES 1.10.4 00:00:46.776 Has header "pcap.h" with dependency libpcap: YES 00:00:46.776 Compiler for C supports arguments -Wcast-qual: YES 00:00:46.776 Compiler for C supports arguments -Wdeprecated: YES 00:00:46.776 Compiler for C supports arguments -Wformat: YES 00:00:46.776 Compiler for C supports arguments -Wformat-nonliteral: NO 00:00:46.776 Compiler for C supports arguments -Wformat-security: NO 00:00:46.776 Compiler for C supports arguments -Wmissing-declarations: YES 00:00:46.776 Compiler for C supports arguments -Wmissing-prototypes: YES 00:00:46.776 Compiler for C supports arguments -Wnested-externs: YES 00:00:46.776 Compiler for C supports arguments -Wold-style-definition: YES 00:00:46.776 Compiler for C supports arguments -Wpointer-arith: YES 00:00:46.776 Compiler for C supports arguments -Wsign-compare: YES 00:00:46.776 Compiler for C supports arguments -Wstrict-prototypes: YES 00:00:46.776 Compiler for C supports arguments -Wundef: YES 00:00:46.776 Compiler for C supports arguments -Wwrite-strings: YES 00:00:46.776 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:00:46.776 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:00:46.776 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:00:46.776 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:00:46.776 Program objdump found: YES (/usr/bin/objdump) 00:00:46.776 Compiler for C supports arguments -mavx512f: YES 00:00:46.776 Checking if "AVX512 checking" compiles: YES 00:00:46.776 Fetching value of define "__SSE4_2__" : 1 00:00:46.776 Fetching value of define "__AES__" : 1 00:00:46.776 Fetching value of define "__AVX__" : 1 00:00:46.776 Fetching value of define "__AVX2__" : 1 00:00:46.776 Fetching value of define "__AVX512BW__" : 1 00:00:46.776 Fetching value of define "__AVX512CD__" : 1 00:00:46.776 Fetching value of define "__AVX512DQ__" : 1 00:00:46.776 Fetching value of define "__AVX512F__" : 1 00:00:46.776 Fetching value of define "__AVX512VL__" : 1 00:00:46.776 Fetching value of define "__PCLMUL__" : 1 00:00:46.776 Fetching value of define "__RDRND__" : 1 00:00:46.776 Fetching value of define "__RDSEED__" : 1 00:00:46.776 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:00:46.776 Fetching value of define "__znver1__" : (undefined) 00:00:46.776 Fetching value of define "__znver2__" : (undefined) 00:00:46.776 Fetching value of define "__znver3__" : (undefined) 00:00:46.776 Fetching value of define "__znver4__" : (undefined) 00:00:46.776 Compiler for C supports arguments -Wno-format-truncation: YES 00:00:46.776 Message: lib/log: Defining dependency "log" 00:00:46.776 Message: lib/kvargs: Defining dependency "kvargs" 00:00:46.776 Message: lib/telemetry: Defining dependency 
"telemetry" 00:00:46.776 Checking for function "getentropy" : NO 00:00:46.776 Message: lib/eal: Defining dependency "eal" 00:00:46.776 Message: lib/ring: Defining dependency "ring" 00:00:46.776 Message: lib/rcu: Defining dependency "rcu" 00:00:46.777 Message: lib/mempool: Defining dependency "mempool" 00:00:46.777 Message: lib/mbuf: Defining dependency "mbuf" 00:00:46.777 Fetching value of define "__PCLMUL__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512F__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512BW__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512VL__" : 1 (cached) 00:00:46.777 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:00:46.777 Compiler for C supports arguments -mpclmul: YES 00:00:46.777 Compiler for C supports arguments -maes: YES 00:00:46.777 Compiler for C supports arguments -mavx512f: YES (cached) 00:00:46.777 Compiler for C supports arguments -mavx512bw: YES 00:00:46.777 Compiler for C supports arguments -mavx512dq: YES 00:00:46.777 Compiler for C supports arguments -mavx512vl: YES 00:00:46.777 Compiler for C supports arguments -mvpclmulqdq: YES 00:00:46.777 Compiler for C supports arguments -mavx2: YES 00:00:46.777 Compiler for C supports arguments -mavx: YES 00:00:46.777 Message: lib/net: Defining dependency "net" 00:00:46.777 Message: lib/meter: Defining dependency "meter" 00:00:46.777 Message: lib/ethdev: Defining dependency "ethdev" 00:00:46.777 Message: lib/pci: Defining dependency "pci" 00:00:46.777 Message: lib/cmdline: Defining dependency "cmdline" 00:00:46.777 Message: lib/metrics: Defining dependency "metrics" 00:00:46.777 Message: lib/hash: Defining dependency "hash" 00:00:46.777 Message: lib/timer: Defining dependency "timer" 00:00:46.777 Fetching value of define "__AVX512F__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512VL__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512CD__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512BW__" : 1 (cached) 00:00:46.777 Message: lib/acl: Defining dependency "acl" 00:00:46.777 Message: lib/bbdev: Defining dependency "bbdev" 00:00:46.777 Message: lib/bitratestats: Defining dependency "bitratestats" 00:00:46.777 Run-time dependency libelf found: YES 0.190 00:00:46.777 Message: lib/bpf: Defining dependency "bpf" 00:00:46.777 Message: lib/cfgfile: Defining dependency "cfgfile" 00:00:46.777 Message: lib/compressdev: Defining dependency "compressdev" 00:00:46.777 Message: lib/cryptodev: Defining dependency "cryptodev" 00:00:46.777 Message: lib/distributor: Defining dependency "distributor" 00:00:46.777 Message: lib/dmadev: Defining dependency "dmadev" 00:00:46.777 Message: lib/efd: Defining dependency "efd" 00:00:46.777 Message: lib/eventdev: Defining dependency "eventdev" 00:00:46.777 Message: lib/dispatcher: Defining dependency "dispatcher" 00:00:46.777 Message: lib/gpudev: Defining dependency "gpudev" 00:00:46.777 Message: lib/gro: Defining dependency "gro" 00:00:46.777 Message: lib/gso: Defining dependency "gso" 00:00:46.777 Message: lib/ip_frag: Defining dependency "ip_frag" 00:00:46.777 Message: lib/jobstats: Defining dependency "jobstats" 00:00:46.777 Message: lib/latencystats: Defining dependency "latencystats" 00:00:46.777 Message: lib/lpm: Defining dependency "lpm" 00:00:46.777 Fetching value of define "__AVX512F__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512IFMA__" : 
(undefined) 00:00:46.777 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:00:46.777 Message: lib/member: Defining dependency "member" 00:00:46.777 Message: lib/pcapng: Defining dependency "pcapng" 00:00:46.777 Compiler for C supports arguments -Wno-cast-qual: YES 00:00:46.777 Message: lib/power: Defining dependency "power" 00:00:46.777 Message: lib/rawdev: Defining dependency "rawdev" 00:00:46.777 Message: lib/regexdev: Defining dependency "regexdev" 00:00:46.777 Message: lib/mldev: Defining dependency "mldev" 00:00:46.777 Message: lib/rib: Defining dependency "rib" 00:00:46.777 Message: lib/reorder: Defining dependency "reorder" 00:00:46.777 Message: lib/sched: Defining dependency "sched" 00:00:46.777 Message: lib/security: Defining dependency "security" 00:00:46.777 Message: lib/stack: Defining dependency "stack" 00:00:46.777 Has header "linux/userfaultfd.h" : YES 00:00:46.777 Has header "linux/vduse.h" : YES 00:00:46.777 Message: lib/vhost: Defining dependency "vhost" 00:00:46.777 Message: lib/ipsec: Defining dependency "ipsec" 00:00:46.777 Message: lib/pdcp: Defining dependency "pdcp" 00:00:46.777 Fetching value of define "__AVX512F__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:00:46.777 Fetching value of define "__AVX512BW__" : 1 (cached) 00:00:46.777 Message: lib/fib: Defining dependency "fib" 00:00:46.777 Message: lib/port: Defining dependency "port" 00:00:46.777 Message: lib/pdump: Defining dependency "pdump" 00:00:46.777 Message: lib/table: Defining dependency "table" 00:00:46.777 Message: lib/pipeline: Defining dependency "pipeline" 00:00:46.777 Message: lib/graph: Defining dependency "graph" 00:00:46.777 Message: lib/node: Defining dependency "node" 00:00:46.777 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:00:47.345 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:00:47.345 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:00:47.345 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:00:47.345 Compiler for C supports arguments -Wno-sign-compare: YES 00:00:47.345 Compiler for C supports arguments -Wno-unused-value: YES 00:00:47.345 Compiler for C supports arguments -Wno-format: YES 00:00:47.345 Compiler for C supports arguments -Wno-format-security: YES 00:00:47.345 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:00:47.345 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:00:47.345 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:00:47.345 Compiler for C supports arguments -Wno-unused-parameter: YES 00:00:47.345 Fetching value of define "__AVX512F__" : 1 (cached) 00:00:47.345 Fetching value of define "__AVX512BW__" : 1 (cached) 00:00:47.345 Compiler for C supports arguments -mavx512f: YES (cached) 00:00:47.345 Compiler for C supports arguments -mavx512bw: YES (cached) 00:00:47.345 Compiler for C supports arguments -march=skylake-avx512: YES 00:00:47.345 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:00:47.345 Has header "sys/epoll.h" : YES 00:00:47.345 Program doxygen found: YES (/usr/bin/doxygen) 00:00:47.345 Configuring doxy-api-html.conf using configuration 00:00:47.345 Configuring doxy-api-man.conf using configuration 00:00:47.345 Program mandb found: YES (/usr/bin/mandb) 00:00:47.345 Program sphinx-build found: NO 00:00:47.345 Configuring rte_build_config.h using configuration 00:00:47.345 Message: 00:00:47.345 ================= 00:00:47.345 Applications Enabled 00:00:47.345 
================= 00:00:47.345 00:00:47.345 apps: 00:00:47.345 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:00:47.345 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:00:47.345 test-pmd, test-regex, test-sad, test-security-perf, 00:00:47.345 00:00:47.345 Message: 00:00:47.345 ================= 00:00:47.345 Libraries Enabled 00:00:47.345 ================= 00:00:47.345 00:00:47.345 libs: 00:00:47.345 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:00:47.345 net, meter, ethdev, pci, cmdline, metrics, hash, timer, 00:00:47.345 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, 00:00:47.345 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag, 00:00:47.345 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev, 00:00:47.345 mldev, rib, reorder, sched, security, stack, vhost, ipsec, 00:00:47.345 pdcp, fib, port, pdump, table, pipeline, graph, node, 00:00:47.345 00:00:47.345 00:00:47.345 Message: 00:00:47.345 =============== 00:00:47.345 Drivers Enabled 00:00:47.345 =============== 00:00:47.345 00:00:47.345 common: 00:00:47.345 00:00:47.345 bus: 00:00:47.345 pci, vdev, 00:00:47.345 mempool: 00:00:47.345 ring, 00:00:47.345 dma: 00:00:47.345 00:00:47.345 net: 00:00:47.345 i40e, 00:00:47.345 raw: 00:00:47.345 00:00:47.345 crypto: 00:00:47.345 00:00:47.346 compress: 00:00:47.346 00:00:47.346 regex: 00:00:47.346 00:00:47.346 ml: 00:00:47.346 00:00:47.346 vdpa: 00:00:47.346 00:00:47.346 event: 00:00:47.346 00:00:47.346 baseband: 00:00:47.346 00:00:47.346 gpu: 00:00:47.346 00:00:47.346 00:00:47.346 Message: 00:00:47.346 ================= 00:00:47.346 Content Skipped 00:00:47.346 ================= 00:00:47.346 00:00:47.346 apps: 00:00:47.346 00:00:47.346 libs: 00:00:47.346 00:00:47.346 drivers: 00:00:47.346 common/cpt: not in enabled drivers build config 00:00:47.346 common/dpaax: not in enabled drivers build config 00:00:47.346 common/iavf: not in enabled drivers build config 00:00:47.346 common/idpf: not in enabled drivers build config 00:00:47.346 common/mvep: not in enabled drivers build config 00:00:47.346 common/octeontx: not in enabled drivers build config 00:00:47.346 bus/auxiliary: not in enabled drivers build config 00:00:47.346 bus/cdx: not in enabled drivers build config 00:00:47.346 bus/dpaa: not in enabled drivers build config 00:00:47.346 bus/fslmc: not in enabled drivers build config 00:00:47.346 bus/ifpga: not in enabled drivers build config 00:00:47.346 bus/platform: not in enabled drivers build config 00:00:47.346 bus/vmbus: not in enabled drivers build config 00:00:47.346 common/cnxk: not in enabled drivers build config 00:00:47.346 common/mlx5: not in enabled drivers build config 00:00:47.346 common/nfp: not in enabled drivers build config 00:00:47.346 common/qat: not in enabled drivers build config 00:00:47.346 common/sfc_efx: not in enabled drivers build config 00:00:47.346 mempool/bucket: not in enabled drivers build config 00:00:47.346 mempool/cnxk: not in enabled drivers build config 00:00:47.346 mempool/dpaa: not in enabled drivers build config 00:00:47.346 mempool/dpaa2: not in enabled drivers build config 00:00:47.346 mempool/octeontx: not in enabled drivers build config 00:00:47.346 mempool/stack: not in enabled drivers build config 00:00:47.346 dma/cnxk: not in enabled drivers build config 00:00:47.346 dma/dpaa: not in enabled drivers build config 00:00:47.346 dma/dpaa2: not in enabled drivers build 
config 00:00:47.346 dma/hisilicon: not in enabled drivers build config 00:00:47.346 dma/idxd: not in enabled drivers build config 00:00:47.346 dma/ioat: not in enabled drivers build config 00:00:47.346 dma/skeleton: not in enabled drivers build config 00:00:47.346 net/af_packet: not in enabled drivers build config 00:00:47.346 net/af_xdp: not in enabled drivers build config 00:00:47.346 net/ark: not in enabled drivers build config 00:00:47.346 net/atlantic: not in enabled drivers build config 00:00:47.346 net/avp: not in enabled drivers build config 00:00:47.346 net/axgbe: not in enabled drivers build config 00:00:47.346 net/bnx2x: not in enabled drivers build config 00:00:47.346 net/bnxt: not in enabled drivers build config 00:00:47.346 net/bonding: not in enabled drivers build config 00:00:47.346 net/cnxk: not in enabled drivers build config 00:00:47.346 net/cpfl: not in enabled drivers build config 00:00:47.346 net/cxgbe: not in enabled drivers build config 00:00:47.346 net/dpaa: not in enabled drivers build config 00:00:47.346 net/dpaa2: not in enabled drivers build config 00:00:47.346 net/e1000: not in enabled drivers build config 00:00:47.346 net/ena: not in enabled drivers build config 00:00:47.346 net/enetc: not in enabled drivers build config 00:00:47.346 net/enetfec: not in enabled drivers build config 00:00:47.346 net/enic: not in enabled drivers build config 00:00:47.346 net/failsafe: not in enabled drivers build config 00:00:47.346 net/fm10k: not in enabled drivers build config 00:00:47.346 net/gve: not in enabled drivers build config 00:00:47.346 net/hinic: not in enabled drivers build config 00:00:47.346 net/hns3: not in enabled drivers build config 00:00:47.346 net/iavf: not in enabled drivers build config 00:00:47.346 net/ice: not in enabled drivers build config 00:00:47.346 net/idpf: not in enabled drivers build config 00:00:47.346 net/igc: not in enabled drivers build config 00:00:47.346 net/ionic: not in enabled drivers build config 00:00:47.346 net/ipn3ke: not in enabled drivers build config 00:00:47.346 net/ixgbe: not in enabled drivers build config 00:00:47.346 net/mana: not in enabled drivers build config 00:00:47.346 net/memif: not in enabled drivers build config 00:00:47.346 net/mlx4: not in enabled drivers build config 00:00:47.346 net/mlx5: not in enabled drivers build config 00:00:47.346 net/mvneta: not in enabled drivers build config 00:00:47.346 net/mvpp2: not in enabled drivers build config 00:00:47.346 net/netvsc: not in enabled drivers build config 00:00:47.346 net/nfb: not in enabled drivers build config 00:00:47.346 net/nfp: not in enabled drivers build config 00:00:47.346 net/ngbe: not in enabled drivers build config 00:00:47.346 net/null: not in enabled drivers build config 00:00:47.346 net/octeontx: not in enabled drivers build config 00:00:47.346 net/octeon_ep: not in enabled drivers build config 00:00:47.346 net/pcap: not in enabled drivers build config 00:00:47.346 net/pfe: not in enabled drivers build config 00:00:47.346 net/qede: not in enabled drivers build config 00:00:47.346 net/ring: not in enabled drivers build config 00:00:47.346 net/sfc: not in enabled drivers build config 00:00:47.346 net/softnic: not in enabled drivers build config 00:00:47.346 net/tap: not in enabled drivers build config 00:00:47.346 net/thunderx: not in enabled drivers build config 00:00:47.346 net/txgbe: not in enabled drivers build config 00:00:47.346 net/vdev_netvsc: not in enabled drivers build config 00:00:47.346 net/vhost: not in enabled drivers build config 
00:00:47.346 net/virtio: not in enabled drivers build config 00:00:47.346 net/vmxnet3: not in enabled drivers build config 00:00:47.346 raw/cnxk_bphy: not in enabled drivers build config 00:00:47.346 raw/cnxk_gpio: not in enabled drivers build config 00:00:47.346 raw/dpaa2_cmdif: not in enabled drivers build config 00:00:47.346 raw/ifpga: not in enabled drivers build config 00:00:47.346 raw/ntb: not in enabled drivers build config 00:00:47.346 raw/skeleton: not in enabled drivers build config 00:00:47.346 crypto/armv8: not in enabled drivers build config 00:00:47.346 crypto/bcmfs: not in enabled drivers build config 00:00:47.346 crypto/caam_jr: not in enabled drivers build config 00:00:47.346 crypto/ccp: not in enabled drivers build config 00:00:47.346 crypto/cnxk: not in enabled drivers build config 00:00:47.346 crypto/dpaa_sec: not in enabled drivers build config 00:00:47.346 crypto/dpaa2_sec: not in enabled drivers build config 00:00:47.346 crypto/ipsec_mb: not in enabled drivers build config 00:00:47.346 crypto/mlx5: not in enabled drivers build config 00:00:47.346 crypto/mvsam: not in enabled drivers build config 00:00:47.346 crypto/nitrox: not in enabled drivers build config 00:00:47.346 crypto/null: not in enabled drivers build config 00:00:47.346 crypto/octeontx: not in enabled drivers build config 00:00:47.346 crypto/openssl: not in enabled drivers build config 00:00:47.346 crypto/scheduler: not in enabled drivers build config 00:00:47.346 crypto/uadk: not in enabled drivers build config 00:00:47.346 crypto/virtio: not in enabled drivers build config 00:00:47.346 compress/isal: not in enabled drivers build config 00:00:47.346 compress/mlx5: not in enabled drivers build config 00:00:47.346 compress/octeontx: not in enabled drivers build config 00:00:47.346 compress/zlib: not in enabled drivers build config 00:00:47.346 regex/mlx5: not in enabled drivers build config 00:00:47.346 regex/cn9k: not in enabled drivers build config 00:00:47.346 ml/cnxk: not in enabled drivers build config 00:00:47.346 vdpa/ifc: not in enabled drivers build config 00:00:47.346 vdpa/mlx5: not in enabled drivers build config 00:00:47.346 vdpa/nfp: not in enabled drivers build config 00:00:47.346 vdpa/sfc: not in enabled drivers build config 00:00:47.346 event/cnxk: not in enabled drivers build config 00:00:47.346 event/dlb2: not in enabled drivers build config 00:00:47.346 event/dpaa: not in enabled drivers build config 00:00:47.346 event/dpaa2: not in enabled drivers build config 00:00:47.346 event/dsw: not in enabled drivers build config 00:00:47.346 event/opdl: not in enabled drivers build config 00:00:47.346 event/skeleton: not in enabled drivers build config 00:00:47.346 event/sw: not in enabled drivers build config 00:00:47.346 event/octeontx: not in enabled drivers build config 00:00:47.346 baseband/acc: not in enabled drivers build config 00:00:47.346 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:00:47.346 baseband/fpga_lte_fec: not in enabled drivers build config 00:00:47.346 baseband/la12xx: not in enabled drivers build config 00:00:47.346 baseband/null: not in enabled drivers build config 00:00:47.346 baseband/turbo_sw: not in enabled drivers build config 00:00:47.346 gpu/cuda: not in enabled drivers build config 00:00:47.346 00:00:47.346 00:00:47.346 Build targets in project: 217 00:00:47.346 00:00:47.346 DPDK 23.11.0 00:00:47.346 00:00:47.346 User defined options 00:00:47.346 libdir : lib 00:00:47.346 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
00:00:47.346 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:00:47.346 c_link_args : 00:00:47.346 enable_docs : false 00:00:47.346 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:00:47.346 enable_kmods : false 00:00:47.346 machine : native 00:00:47.346 tests : false 00:00:47.346 00:00:47.346 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:00:47.346 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 00:00:47.346 11:24:16 -- common/autobuild_common.sh@186 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:00:47.346 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:00:47.610 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:00:47.610 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:00:47.610 [3/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:00:47.610 [4/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:00:47.610 [5/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:00:47.610 [6/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:00:47.610 [7/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:00:47.610 [8/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:00:47.610 [9/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:00:47.610 [10/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:00:47.872 [11/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:00:47.872 [12/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:00:47.872 [13/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:00:47.872 [14/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:00:47.872 [15/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:00:47.872 [16/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:00:47.872 [17/707] Linking static target lib/librte_kvargs.a 00:00:47.872 [18/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:00:47.872 [19/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:00:47.872 [20/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:00:47.872 [21/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:00:47.872 [22/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:00:47.872 [23/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:00:47.872 [24/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:00:47.872 [25/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:00:47.872 [26/707] Linking static target lib/librte_pci.a 00:00:47.872 [27/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:00:47.872 [28/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:00:47.872 [29/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:00:47.872 [30/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:00:47.872 [31/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:00:47.872 [32/707] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:00:47.872 [33/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:00:47.872 [34/707] Linking static target lib/librte_log.a 00:00:47.872 [35/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:00:48.136 [36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:00:48.136 [37/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:00:48.136 [38/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:00:48.136 [39/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:00:48.136 [40/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:00:48.136 [41/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:00:48.136 [42/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:00:48.136 [43/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:00:48.136 [44/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:00:48.136 [45/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:00:48.396 [46/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:00:48.396 [47/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:00:48.396 [48/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:00:48.396 [49/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:00:48.396 [50/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:00:48.396 [51/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:00:48.396 [52/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:00:48.396 [53/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:00:48.396 [54/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:00:48.396 [55/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:00:48.396 [56/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:00:48.396 [57/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:00:48.396 [58/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:00:48.396 [59/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:00:48.396 [60/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:00:48.396 [61/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:00:48.396 [62/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:00:48.396 [63/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:00:48.396 [64/707] Linking static target lib/librte_meter.a 00:00:48.396 [65/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:00:48.396 [66/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:00:48.396 [67/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:00:48.396 [68/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:00:48.396 [69/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:00:48.396 [70/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:00:48.396 [71/707] Linking static target lib/librte_ring.a 00:00:48.396 [72/707] Compiling C object 
lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:00:48.396 [73/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:00:48.396 [74/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:00:48.396 [75/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:00:48.396 [76/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:00:48.396 [77/707] Linking static target lib/librte_cmdline.a 00:00:48.396 [78/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:00:48.396 [79/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:00:48.396 [80/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:00:48.396 [81/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:00:48.396 [82/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:00:48.396 [83/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:00:48.396 [84/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:00:48.396 [85/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:00:48.396 [86/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:00:48.396 [87/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:00:48.396 [88/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:00:48.396 [89/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:00:48.396 [90/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:00:48.396 [91/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:00:48.396 [92/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:00:48.396 [93/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:00:48.396 [94/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:00:48.396 [95/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:00:48.396 [96/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:00:48.396 [97/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:00:48.396 [98/707] Linking static target lib/librte_metrics.a 00:00:48.396 [99/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:00:48.396 [100/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:00:48.396 [101/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:00:48.396 [102/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:00:48.656 [103/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:00:48.656 [104/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:00:48.656 [105/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:00:48.656 [106/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:00:48.656 [107/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:00:48.656 [108/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:00:48.656 [109/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:00:48.656 [110/707] Linking static target lib/librte_bitratestats.a 00:00:48.656 [111/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:00:48.656 [112/707] Linking static target lib/librte_cfgfile.a 00:00:48.656 [113/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 
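The meson deprecation warning printed with the configuration summary above indicates the configure step was invoked as plain `meson [options]` rather than `meson setup [options]`. A minimal sketch of the non-deprecated spelling, assuming the build-tmp directory named in the ninja invocation and passing each summary key as a -D project option (an assumption; the enable_drivers value is abbreviated here because the summary prints it with a trailing comma, suggesting truncation):

    # Sketch only, not the pipeline's actual command: 'meson setup' is the
    # non-deprecated form of the configure step flagged by the WARNING above.
    # Option values are copied from the configuration summary; treating each
    # summary key as a -D option is an assumption.
    meson setup build-tmp \
      -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
      -Denable_docs=false \
      -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base \
      -Denable_kmods=false \
      -Dmachine=native \
      -Dtests=false
    ninja -C build-tmp -j112   # the build step that then produces the output in this log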
00:00:48.656 [114/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:00:48.656 [115/707] Linking static target lib/librte_net.a 00:00:48.656 [116/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:00:48.656 [117/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:00:48.656 [118/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:00:48.656 [119/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:00:48.656 [120/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:00:48.657 [121/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:00:48.657 [122/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:00:48.657 [123/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:00:48.657 [124/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:00:48.657 [125/707] Linking target lib/librte_log.so.24.0 00:00:48.657 [126/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:00:48.657 [127/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:00:48.657 [128/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:00:48.657 [129/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:00:48.657 [130/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:00:48.657 [131/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:00:48.657 [132/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:00:48.920 [133/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:00:48.920 [134/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:00:48.920 [135/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:00:48.920 [136/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:00:48.920 [137/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:00:48.920 [138/707] Linking static target lib/librte_timer.a 00:00:48.920 [139/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:00:48.920 [140/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:00:48.920 [141/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:00:48.920 [142/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:00:48.920 [143/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:00:48.920 [144/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:00:48.920 [145/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:00:48.920 [146/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:00:48.920 [147/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:00:48.920 [148/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:00:48.920 [149/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:00:48.920 [150/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:00:48.920 [151/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:00:48.920 [152/707] Linking static target lib/librte_bbdev.a 00:00:48.920 [153/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:00:48.920 
[154/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:00:48.920 [155/707] Linking target lib/librte_kvargs.so.24.0 00:00:48.920 [156/707] Linking static target lib/librte_mempool.a 00:00:48.920 [157/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:00:48.920 [158/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:00:48.920 [159/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:00:48.920 [160/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.182 [161/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:00:49.182 [162/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:00:49.182 [163/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:00:49.182 [164/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:00:49.182 [165/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:00:49.182 [166/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:00:49.182 [167/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:00:49.182 [168/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:00:49.182 [169/707] Linking static target lib/librte_jobstats.a 00:00:49.183 [170/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:00:49.183 [171/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:00:49.183 [172/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.183 [173/707] Linking static target lib/librte_compressdev.a 00:00:49.183 [174/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:00:49.183 [175/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:00:49.183 [176/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.183 [177/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:00:49.183 [178/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:00:49.183 [179/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:00:49.183 [180/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:00:49.183 [181/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:00:49.183 [182/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:00:49.183 [183/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:00:49.183 [184/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:00:49.183 [185/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:00:49.183 [186/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:00:49.183 [187/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:00:49.183 [188/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:00:49.183 [189/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:00:49.183 [190/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:00:49.183 [191/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:00:49.183 [192/707] Linking static target lib/librte_dispatcher.a 00:00:49.183 [193/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:00:49.183 [194/707] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:00:49.444 [195/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:00:49.444 [196/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:00:49.444 [197/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:00:49.444 [198/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:00:49.444 [199/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:00:49.444 [200/707] Linking static target lib/librte_latencystats.a 00:00:49.444 [201/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:00:49.444 [202/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:00:49.444 [203/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:00:49.444 [204/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:00:49.444 [205/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:00:49.444 [206/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:00:49.444 [207/707] Linking static target lib/librte_telemetry.a 00:00:49.444 [208/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:00:49.444 [209/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:00:49.444 [210/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.444 [211/707] Linking static target lib/librte_gpudev.a 00:00:49.444 [212/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:00:49.444 [213/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:00:49.444 [214/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:00:49.444 [215/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:00:49.444 [216/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:00:49.444 [217/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:00:49.444 [218/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:00:49.444 [219/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:00:49.444 [220/707] Linking static target lib/librte_rcu.a 00:00:49.444 [221/707] Linking static target lib/librte_gro.a 00:00:49.444 [222/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:00:49.444 [223/707] Linking static target lib/librte_eal.a 00:00:49.444 [224/707] Linking static target lib/librte_stack.a 00:00:49.444 [225/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:00:49.444 [226/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:00:49.444 [227/707] Linking static target lib/librte_dmadev.a 00:00:49.444 [228/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:00:49.444 [229/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:00:49.444 [230/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:00:49.444 [231/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:00:49.444 [232/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:00:49.444 [233/707] Linking static target lib/librte_gso.a 00:00:49.444 [234/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:00:49.444 [235/707] Linking static target lib/librte_distributor.a 00:00:49.444 [236/707] Compiling C object 
lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:00:49.444 [237/707] Linking static target lib/librte_regexdev.a 00:00:49.444 [238/707] Linking static target lib/librte_rawdev.a 00:00:49.444 [239/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:00:49.444 [240/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:00:49.444 [241/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:00:49.444 [242/707] Linking static target lib/librte_power.a 00:00:49.444 [243/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:00:49.444 [244/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:00:49.708 [245/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:00:49.708 [246/707] Linking static target lib/librte_mldev.a 00:00:49.708 [247/707] Linking static target lib/librte_ip_frag.a 00:00:49.708 [248/707] Linking static target lib/librte_mbuf.a 00:00:49.708 [249/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.708 [250/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:00:49.708 [251/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:00:49.708 [252/707] Linking static target lib/librte_pcapng.a 00:00:49.708 [253/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:00:49.708 [254/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:00:49.708 [255/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.708 [256/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:00:49.708 [257/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:00:49.708 [258/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:00:49.708 [259/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.708 [260/707] Linking static target lib/librte_bpf.a 00:00:49.708 [261/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.708 [262/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:00:49.708 [263/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:00:49.708 [264/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:00:49.708 [265/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:00:49.708 [266/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:00:49.708 [267/707] Linking static target lib/librte_reorder.a 00:00:49.708 [268/707] Linking static target lib/librte_security.a 00:00:49.708 [269/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:00:49.708 [270/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.709 [271/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:00:49.709 [272/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:00:49.709 [273/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [274/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:00:49.974 [275/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:00:49.974 [276/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:00:49.974 [277/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 
[278/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [279/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:00:49.974 [280/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:00:49.974 [281/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:00:49.974 [282/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:00:49.974 [283/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:00:49.974 [284/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [285/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [286/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:00:49.974 [287/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:00:49.974 [288/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [289/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [290/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [291/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:00:49.974 [292/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [293/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [294/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:00:49.974 [295/707] Linking static target lib/librte_lpm.a 00:00:49.974 [296/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:00:49.974 [297/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:00:50.239 [298/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:00:50.239 [299/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:00:50.239 [300/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:00:50.239 [301/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:00:50.239 [302/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:00:50.239 [303/707] Linking static target lib/librte_rib.a 00:00:50.239 [304/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:00:50.239 [305/707] Linking target lib/librte_telemetry.so.24.0 00:00:50.239 [306/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:00:50.239 [307/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:00:50.239 [308/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:00:50.239 [309/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:00:50.239 [310/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:00:50.239 [311/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:00:50.239 [312/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:00:50.239 [313/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:00:50.239 [314/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.239 [315/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:00:50.239 [316/707] Compiling C object 
lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:00:50.239 [317/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:00:50.239 [318/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:00:50.239 [319/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.239 [320/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:00:50.239 [321/707] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.239 [322/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:00:50.239 [323/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:00:50.239 [324/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:00:50.239 [325/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:00:50.503 [326/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:00:50.503 [327/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:00:50.503 [328/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:00:50.503 [329/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:00:50.503 [330/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:00:50.503 [331/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:00:50.503 [332/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:00:50.503 [333/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:00:50.503 [334/707] Linking static target lib/librte_efd.a 00:00:50.503 [335/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:00:50.503 [336/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:00:50.503 [337/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:00:50.503 [338/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:00:50.503 [339/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.503 [340/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:00:50.503 [341/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.503 [342/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:00:50.503 [343/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:00:50.503 [344/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:00:50.503 [345/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:00:50.503 [346/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:00:50.503 [347/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:00:50.503 [348/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.761 [349/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:00:50.761 [350/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:00:50.761 [351/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:00:50.761 [352/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:00:50.761 [353/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:00:50.761 [354/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.762 [355/707] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:00:50.762 [356/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:00:50.762 [357/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:00:50.762 [358/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:00:50.762 [359/707] Linking static target lib/librte_fib.a 00:00:50.762 [360/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:00:50.762 [361/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:00:50.762 [362/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:00:50.762 [363/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:00:50.762 [364/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.762 [365/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:00:50.762 [366/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.762 [367/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:00:50.762 [368/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:00:50.762 [369/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:00:50.762 [370/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:00:50.762 [371/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:00:50.762 [372/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:00:50.762 [373/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:00:50.762 [374/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:00:51.021 [375/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:00:51.021 [376/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:00:51.021 [377/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:00:51.021 [378/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:00:51.021 [379/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:00:51.021 [380/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:00:51.021 [381/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:00:51.021 [382/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:00:51.021 [383/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:00:51.021 [384/707] Linking static target lib/librte_pdump.a 00:00:51.021 [385/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:00:51.021 [386/707] Linking static target lib/librte_graph.a 00:00:51.021 [387/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:00:51.021 [388/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:00:51.021 [389/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:00:51.021 [390/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:00:51.021 [391/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:00:51.021 [392/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:00:51.021 [393/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:00:51.021 [394/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:00:51.021 [395/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:00:51.021 [396/707] Compiling 
C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:00:51.021 [397/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:00:51.021 [398/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:00:51.021 [399/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:00:51.286 [400/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:00:51.286 [401/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:00:51.286 [402/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:00:51.286 [403/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:00:51.286 [404/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:00:51.286 [405/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:00:51.286 [406/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:00:51.286 [407/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:00:51.286 [408/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:00:51.286 [409/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:00:51.286 [410/707] Linking static target lib/librte_sched.a 00:00:51.286 [411/707] Linking static target drivers/librte_bus_vdev.a 00:00:51.286 [412/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:00:51.286 [413/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:00:51.286 [414/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:00:51.286 [415/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:00:51.286 [416/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:00:51.286 [417/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:00:51.286 [418/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:00:51.286 [419/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:00:51.286 [420/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:00:51.286 [421/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:00:51.286 [422/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:00:51.286 [423/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:00:51.286 [424/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:00:51.287 [425/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:00:51.287 [426/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:00:51.287 [427/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:00:51.287 [428/707] Linking static target lib/librte_table.a 00:00:51.543 [429/707] Linking static target lib/librte_cryptodev.a 00:00:51.543 [430/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:00:51.544 [431/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:00:51.544 [432/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:00:51.544 [433/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:00:51.544 [434/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:00:51.544 [435/707] Linking static target 
drivers/librte_bus_pci.a 00:00:51.544 [436/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:00:51.544 [437/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:00:51.544 [438/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:00:51.544 [439/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:00:51.544 [440/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:00:51.544 [441/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:00:51.544 [442/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:00:51.544 [443/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:00:51.544 [444/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:00:51.544 [445/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:00:51.544 [446/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:00:51.544 [447/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:51.803 [448/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:00:51.803 [449/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:00:51.803 [450/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:00:51.803 [451/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:00:51.803 [452/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:00:51.803 [453/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:00:51.803 [454/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:00:51.803 [455/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:00:51.803 [456/707] Linking static target lib/librte_ipsec.a 00:00:51.803 [457/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:00:51.803 [458/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:00:51.803 [459/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:00:51.803 [460/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:00:51.803 [461/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:00:51.803 [462/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:00:51.803 [463/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:51.803 [464/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:00:51.803 [465/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:00:51.803 [466/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:00:51.803 [467/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:00:51.803 [468/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:00:51.803 [469/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:00:51.803 [470/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:00:51.803 [471/707] Linking static target lib/librte_member.a 00:00:51.803 [472/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:00:51.803 
[473/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:00:51.804 [474/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:00:52.060 [475/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:00:52.060 [476/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:00:52.060 [477/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:00:52.060 [478/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:00:52.060 [479/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:00:52.060 [480/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:00:52.060 [481/707] Linking static target lib/librte_node.a 00:00:52.060 [482/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:00:52.060 [483/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:00:52.060 [484/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:00:52.060 [485/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:00:52.060 [486/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:00:52.060 [487/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:00:52.060 [488/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:00:52.060 [489/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.060 [490/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:00:52.060 [491/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:00:52.060 [492/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:00:52.060 [493/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:00:52.060 [494/707] Linking static target drivers/librte_mempool_ring.a 00:00:52.060 [495/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:00:52.060 [496/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:00:52.060 [497/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:00:52.060 [498/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:00:52.060 [499/707] Linking static target lib/librte_hash.a 00:00:52.060 [500/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:00:52.060 [501/707] Linking static target lib/librte_pdcp.a 00:00:52.060 [502/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:00:52.060 [503/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:00:52.060 [504/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:00:52.060 [505/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:00:52.060 [506/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:00:52.060 [507/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:00:52.060 [508/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:00:52.060 [509/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:00:52.060 [510/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 
00:00:52.060 [511/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.060 [512/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:00:52.060 [513/707] Linking static target lib/librte_port.a 00:00:52.317 [514/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:00:52.317 [515/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.317 [516/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:00:52.317 [517/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.317 [518/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:00:52.317 [519/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:00:52.317 [520/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:00:52.317 [521/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:00:52.317 [522/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:00:52.317 [523/707] Linking static target lib/acl/libavx2_tmp.a 00:00:52.317 [524/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.317 [525/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:00:52.317 [526/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:00:52.317 [527/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:00:52.317 [528/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:00:52.317 [529/707] Linking static target lib/librte_eventdev.a 00:00:52.317 [530/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:00:52.317 [531/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:00:52.317 [532/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:00:52.317 [533/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.317 [534/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:00:52.317 [535/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:00:52.317 [536/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:00:52.317 [537/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:00:52.317 [538/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:00:52.317 [539/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:00:52.317 [540/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:00:52.317 [541/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:00:52.574 [542/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:00:52.574 [543/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:00:52.574 [544/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.574 [545/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:00:52.574 [546/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:00:52.574 [547/707] Linking static target lib/librte_acl.a 00:00:52.574 [548/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:00:52.574 [549/707] Compiling C object 
app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:00:52.574 [550/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:00:52.574 [551/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:00:52.574 [552/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:00:52.574 [553/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:00:52.574 [554/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:00:52.574 [555/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:00:52.574 [556/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:00:52.574 [557/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:00:52.574 [558/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:00:52.831 [559/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:00:52.831 [560/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:00:52.831 [561/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:00:52.831 [562/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:00:52.831 [563/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:00:52.831 [564/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:00:52.831 [565/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.831 [566/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.831 [567/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:00:52.831 [568/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:00:52.831 [569/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:00:53.088 [570/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:00:53.088 [571/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:00:53.088 [572/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:53.344 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:00:53.344 [574/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:00:53.344 [575/707] Linking static target lib/librte_ethdev.a 00:00:53.602 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:00:53.602 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:00:53.858 [578/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:00:53.858 [579/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:00:54.115 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:00:54.679 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:00:54.679 [582/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:00:54.679 [583/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:00:54.935 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:00:54.935 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:00:54.935 [586/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:00:54.935 [587/707] 
Linking static target drivers/librte_net_i40e.a 00:00:55.192 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:00:55.756 [589/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:00:56.013 [590/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:00:56.270 [591/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:00:56.528 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:01.786 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:01.786 [594/707] Linking target lib/librte_eal.so.24.0 00:01:01.786 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:01:01.786 [596/707] Linking target lib/librte_meter.so.24.0 00:01:01.786 [597/707] Linking target lib/librte_stack.so.24.0 00:01:01.786 [598/707] Linking target lib/librte_cfgfile.so.24.0 00:01:01.786 [599/707] Linking target lib/librte_timer.so.24.0 00:01:01.786 [600/707] Linking target drivers/librte_bus_vdev.so.24.0 00:01:01.786 [601/707] Linking target lib/librte_ring.so.24.0 00:01:01.786 [602/707] Linking target lib/librte_jobstats.so.24.0 00:01:01.786 [603/707] Linking target lib/librte_pci.so.24.0 00:01:01.786 [604/707] Linking target lib/librte_dmadev.so.24.0 00:01:01.786 [605/707] Linking target lib/librte_rawdev.so.24.0 00:01:01.786 [606/707] Linking target lib/librte_acl.so.24.0 00:01:01.786 [607/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:01:01.786 [608/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:01:01.786 [609/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:01:01.786 [610/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:01:01.786 [611/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:01:01.786 [612/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:01:01.786 [613/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:01:01.786 [614/707] Linking target lib/librte_mempool.so.24.0 00:01:01.786 [615/707] Linking target lib/librte_rcu.so.24.0 00:01:01.786 [616/707] Linking target drivers/librte_bus_pci.so.24.0 00:01:01.786 [617/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:01:01.786 [618/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:01:01.786 [619/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:01:01.786 [620/707] Linking target drivers/librte_mempool_ring.so.24.0 00:01:02.043 [621/707] Linking target lib/librte_mbuf.so.24.0 00:01:02.043 [622/707] Linking target lib/librte_rib.so.24.0 00:01:02.043 [623/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:01:02.043 [624/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:01:02.043 [625/707] Linking target lib/librte_sched.so.24.0 00:01:02.043 [626/707] Linking target lib/librte_fib.so.24.0 00:01:02.043 [627/707] Linking target lib/librte_distributor.so.24.0 00:01:02.043 [628/707] Linking target lib/librte_regexdev.so.24.0 00:01:02.043 [629/707] Linking target lib/librte_compressdev.so.24.0 00:01:02.043 [630/707] Linking target lib/librte_net.so.24.0 00:01:02.043 
[631/707] Linking target lib/librte_gpudev.so.24.0 00:01:02.043 [632/707] Linking target lib/librte_mldev.so.24.0 00:01:02.043 [633/707] Linking target lib/librte_bbdev.so.24.0 00:01:02.043 [634/707] Linking target lib/librte_reorder.so.24.0 00:01:02.043 [635/707] Linking target lib/librte_cryptodev.so.24.0 00:01:02.043 [636/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:02.300 [637/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:01:02.300 [638/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:01:02.300 [639/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:01:02.300 [640/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:01:02.300 [641/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:02.300 [642/707] Linking target lib/librte_hash.so.24.0 00:01:02.300 [643/707] Linking target lib/librte_cmdline.so.24.0 00:01:02.300 [644/707] Linking target lib/librte_security.so.24.0 00:01:02.300 [645/707] Linking static target lib/librte_pipeline.a 00:01:02.300 [646/707] Linking target lib/librte_ethdev.so.24.0 00:01:02.300 [647/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:01:02.557 [648/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:01:02.557 [649/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:01:02.557 [650/707] Linking target lib/librte_member.so.24.0 00:01:02.557 [651/707] Linking target lib/librte_efd.so.24.0 00:01:02.557 [652/707] Linking target lib/librte_lpm.so.24.0 00:01:02.557 [653/707] Linking target lib/librte_pdcp.so.24.0 00:01:02.557 [654/707] Linking target lib/librte_ipsec.so.24.0 00:01:02.557 [655/707] Linking target lib/librte_metrics.so.24.0 00:01:02.557 [656/707] Linking target lib/librte_ip_frag.so.24.0 00:01:02.557 [657/707] Linking target lib/librte_gso.so.24.0 00:01:02.557 [658/707] Linking target lib/librte_pcapng.so.24.0 00:01:02.557 [659/707] Linking target lib/librte_power.so.24.0 00:01:02.557 [660/707] Linking target lib/librte_gro.so.24.0 00:01:02.557 [661/707] Linking target lib/librte_bpf.so.24.0 00:01:02.557 [662/707] Linking target lib/librte_eventdev.so.24.0 00:01:02.557 [663/707] Linking target drivers/librte_net_i40e.so.24.0 00:01:02.557 [664/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:01:02.557 [665/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:01:02.557 [666/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:01:02.557 [667/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:01:02.557 [668/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:01:02.557 [669/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:01:02.557 [670/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:01:02.557 [671/707] Linking target lib/librte_latencystats.so.24.0 00:01:02.557 [672/707] Linking target lib/librte_bitratestats.so.24.0 00:01:02.814 [673/707] Linking target lib/librte_pdump.so.24.0 00:01:02.814 [674/707] Linking target lib/librte_dispatcher.so.24.0 00:01:02.814 [675/707] Linking target lib/librte_port.so.24.0 00:01:02.814 [676/707] Linking target 
lib/librte_graph.so.24.0 00:01:02.814 [677/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:01:02.814 [678/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:02.815 [679/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:01:02.815 [680/707] Linking static target lib/librte_vhost.a 00:01:02.815 [681/707] Linking target lib/librte_table.so.24.0 00:01:02.815 [682/707] Linking target lib/librte_node.so.24.0 00:01:03.070 [683/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:01:03.326 [684/707] Linking target app/dpdk-test-cmdline 00:01:03.326 [685/707] Linking target app/dpdk-dumpcap 00:01:03.326 [686/707] Linking target app/dpdk-test-pipeline 00:01:03.326 [687/707] Linking target app/dpdk-test-flow-perf 00:01:03.326 [688/707] Linking target app/dpdk-test-fib 00:01:03.326 [689/707] Linking target app/dpdk-pdump 00:01:03.326 [690/707] Linking target app/dpdk-proc-info 00:01:03.326 [691/707] Linking target app/dpdk-test-regex 00:01:03.326 [692/707] Linking target app/dpdk-test-security-perf 00:01:03.326 [693/707] Linking target app/dpdk-test-acl 00:01:03.326 [694/707] Linking target app/dpdk-test-sad 00:01:03.326 [695/707] Linking target app/dpdk-test-crypto-perf 00:01:03.327 [696/707] Linking target app/dpdk-test-gpudev 00:01:03.327 [697/707] Linking target app/dpdk-test-mldev 00:01:03.327 [698/707] Linking target app/dpdk-test-compress-perf 00:01:03.327 [699/707] Linking target app/dpdk-test-dma-perf 00:01:03.327 [700/707] Linking target app/dpdk-graph 00:01:03.327 [701/707] Linking target app/dpdk-test-bbdev 00:01:03.327 [702/707] Linking target app/dpdk-test-eventdev 00:01:03.327 [703/707] Linking target app/dpdk-testpmd 00:01:05.218 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:05.218 [705/707] Linking target lib/librte_vhost.so.24.0 00:01:08.498 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:08.498 [707/707] Linking target lib/librte_pipeline.so.24.0 00:01:08.498 11:24:37 -- common/autobuild_common.sh@187 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:01:08.498 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:08.498 [0/1] Installing files. 
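The second ninja run above re-enters build-tmp and executes the meson-generated install rules; the long listing that follows copies the DPDK example sources into the build/share/dpdk/examples tree. If the same artifacts needed to be staged somewhere other than the configured prefix, meson's install step honors DESTDIR; a sketch with an assumed staging path:

    # Sketch: DESTDIR staging is standard meson/ninja install behaviour;
    # /tmp/staging is an example path, not taken from this log.
    DESTDIR=/tmp/staging ninja -C build-tmp install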
00:01:08.498 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:08.498 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:08.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 
00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:08.500 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:01:08.500 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.500 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.501 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.502 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:08.503 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:08.503 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_mbuf.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 
Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.503 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_regexdev.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_node.a 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:01:08.504 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:01:08.504 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:01:08.504 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:08.504 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:01:08.504 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.504 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.504 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.505 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.766 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.767 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.768 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:08.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:08.769 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:01:08.769 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:01:08.769 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:01:08.769 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:01:08.769 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:01:08.769 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:01:08.769 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:01:08.769 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:01:08.769 Installing symlink pointing to librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:01:08.769 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:01:08.769 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:01:08.769 Installing symlink pointing to 
librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:01:08.769 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:01:08.769 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:01:08.769 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:01:08.769 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:01:08.769 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:01:08.769 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:01:08.769 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:01:08.769 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:01:08.769 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:01:08.769 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:01:08.769 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:01:08.769 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:01:08.769 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:01:08.769 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:01:08.769 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:01:08.769 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:01:08.769 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:01:08.769 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:01:08.769 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:01:08.769 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:01:08.769 Installing symlink pointing to librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:01:08.769 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:01:08.769 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:01:08.769 Installing symlink pointing to librte_bbdev.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:01:08.770 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:01:08.770 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:01:08.770 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:01:08.770 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:01:08.770 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:01:08.770 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:01:08.770 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:01:08.770 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:01:08.770 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:01:08.770 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:01:08.770 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:01:08.770 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:01:08.770 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:01:08.770 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:01:08.770 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:01:08.770 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:01:08.770 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:01:08.770 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:01:08.770 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:01:08.770 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:01:08.770 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:01:08.770 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:01:08.770 Installing symlink pointing to librte_gro.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:01:08.770 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:01:08.770 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:01:08.770 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:01:08.770 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:01:08.770 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:01:08.770 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:01:08.770 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:01:08.770 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:01:08.770 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:01:08.770 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:01:08.770 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:01:08.770 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:01:08.770 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:01:08.770 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:01:08.770 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:01:08.770 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:01:08.770 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:01:08.770 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:01:08.770 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:01:08.770 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:01:08.770 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:01:08.770 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:01:08.770 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:01:08.770 Installing symlink pointing to 
librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:01:08.770 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:01:08.770 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:01:08.770 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:01:08.770 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:01:08.770 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:01:08.770 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:01:08.770 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:01:08.770 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:01:08.770 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:01:08.770 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:01:08.770 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:01:08.770 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:01:08.770 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:01:08.770 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:01:08.770 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:01:08.770 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:01:08.770 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:01:08.770 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:01:08.770 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:01:08.770 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:01:08.770 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:01:08.770 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:01:08.770 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:01:08.770 Installing symlink pointing to librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:01:08.770 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:01:08.770 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:01:08.770 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:01:08.770 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:01:08.770 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:01:08.770 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:01:08.770 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:01:08.770 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:01:08.770 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:01:08.770 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:01:08.770 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:01:08.770 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:01:08.770 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:01:08.771 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:01:08.771 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:01:08.771 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:01:08.771 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:01:08.771 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:01:08.771 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:01:08.771 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:01:08.771 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:01:08.771 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:01:08.771 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:01:08.771 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:01:08.771 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:01:08.771 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:01:08.771 11:24:38 -- common/autobuild_common.sh@189 -- $ uname -s 00:01:08.771 11:24:38 -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:01:08.771 11:24:38 -- common/autobuild_common.sh@200 -- $ cat 
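Each runtime library above is installed under three names, which is the standard ELF shared-library versioning chain: librte_X.so.24.0 is the real file, librte_X.so.24 is the soname the dynamic linker binds at run time, and librte_X.so is the unversioned name the link editor resolves at build time; the './librte_bus_pci.so' -> 'dpdk/pmds-24.0/...' moves plus the symlink-drivers-solibs.sh pass keep the PMDs resolvable from the plugin directory the same way. A hand-rolled sketch of one such chain (illustrative only, not the installer's actual commands):

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
    # dev link -> soname link -> real file, the same pattern the log shows
    ln -sf librte_eal.so.24.0 librte_eal.so.24
    ln -sf librte_eal.so.24   librte_eal.so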
00:01:08.771 11:24:38 -- common/autobuild_common.sh@205 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:08.771
00:01:08.771 real 0m27.260s
00:01:08.771 user 7m59.585s
00:01:08.771 sys 2m29.896s
00:01:08.771 11:24:38 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:08.771 11:24:38 -- common/autotest_common.sh@10 -- $ set +x
00:01:08.771 ************************************
00:01:08.771 END TEST build_native_dpdk
00:01:08.771 ************************************
00:01:08.771 11:24:38 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:08.771 11:24:38 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:08.771 11:24:38 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:01:08.771 11:24:38 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:01:08.771 11:24:38 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:01:08.771 11:24:38 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:01:08.771 11:24:38 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:08.771 11:24:38 -- common/autotest_common.sh@10 -- $ set +x
00:01:08.771 ************************************
00:01:08.771 START TEST autobuild_llvm_precompile
00:01:08.771 ************************************
00:01:08.771 11:24:38 -- common/autotest_common.sh@1104 -- $ _llvm_precompile
00:01:08.771 11:24:38 -- common/autobuild_common.sh@32 -- $ clang --version
00:01:08.771 11:24:38 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:01:08.771 Target: x86_64-redhat-linux-gnu
00:01:08.771 Thread model: posix
00:01:08.771 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:01:08.771 11:24:38 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:01:08.771 11:24:38 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:01:08.771 11:24:38 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:01:08.771 11:24:38 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:01:08.771 11:24:38 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:01:08.771 11:24:38 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
00:01:08.771 11:24:38 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:08.771 11:24:38 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:01:08.771 11:24:38 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:01:08.771 11:24:38 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:09.029 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
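The clang probe in the trace above is plain bash: run clang --version, match the version triple with =~, and derive the clang-16/clang++-16 tool names from the major number. A minimal sketch of that pattern (the regex sits in a variable only to keep the quoting readable, and the dots are escaped here where the traced script leaves them as wildcards):

    re='version (([0-9]+)\.([0-9]+)\.([0-9]+))'
    if [[ "$(clang --version)" =~ $re ]]; then
        clang_version=${BASH_REMATCH[1]}   # e.g. 16.0.6
        clang_num=${BASH_REMATCH[2]}       # e.g. 16
        export CC=clang-$clang_num CXX=clang++-$clang_num
    fi

With clang_num in hand, the fuzzer_libs=(...) glob that follows in the trace only has to expand /usr/lib*/clang/16/... to locate libclang_rt.fuzzer_no_main-x86_64.a for --with-fuzzer.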
00:01:09.287 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:09.287 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:09.287 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:09.854 Using 'verbs' RDMA provider
00:01:25.402 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:01:40.277 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:01:40.277 Creating mk/config.mk...done.
00:01:40.277 Creating mk/cc.flags.mk...done.
00:01:40.277 Type 'make' to build.
00:01:40.277
00:01:40.277 real 0m29.584s
00:01:40.277 user 0m12.483s
00:01:40.277 sys 0m16.468s
11:25:07 -- common/autotest_common.sh@1105 -- $ xtrace_disable
11:25:07 -- common/autotest_common.sh@10 -- $ set +x
00:01:40.277 ************************************
00:01:40.277 END TEST autobuild_llvm_precompile
00:01:40.277 ************************************
00:01:40.277 11:25:07 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:40.277 11:25:07 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:40.277 11:25:07 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:40.277 11:25:07 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:01:40.277 11:25:07 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:40.277 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:01:40.277 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.277 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.277 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:40.277 Using 'verbs' RDMA provider
00:01:52.471 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:04.664 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:04.664 Creating mk/config.mk...done.
00:02:04.664 Creating mk/cc.flags.mk...done.
00:02:04.664 Type 'make' to build.
00:02:04.664 11:25:32 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:02:04.664 11:25:32 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:02:04.664 11:25:32 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:02:04.664 11:25:32 -- common/autotest_common.sh@10 -- $ set +x
00:02:04.664 ************************************
00:02:04.664 START TEST make
00:02:04.664 ************************************
00:02:04.664 11:25:32 -- common/autotest_common.sh@1104 -- $ make -j112
00:02:04.664 make[1]: Nothing to be done for 'all'.
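The 'DPDK libraries'/'DPDK includes' lines above come from configure consuming the libdpdk.pc staged earlier under dpdk/build/lib/pkgconfig, which is what the 'Using ... pkgconfig for additional libs' entries reflect; and because every header was copied into one flat build/include directory, a single -I path covers the whole SDK. A sketch of querying the same staged tree by hand (the comments describe the expected shape of the output, not values captured from this run):

    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    pkg-config --modversion libdpdk   # release of the staged DPDK
    pkg-config --cflags libdpdk       # -I.../dpdk/build/include ...
    pkg-config --libs libdpdk         # -L.../dpdk/build/lib -lrte_ethdev -lrte_eal ...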
00:02:06.038 The Meson build system
00:02:06.038 Version: 1.3.1
00:02:06.038 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:06.038 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:06.038 Build type: native build
00:02:06.038 Project name: libvfio-user
00:02:06.038 Project version: 0.0.1
00:02:06.038 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:02:06.038 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:02:06.038 Host machine cpu family: x86_64
00:02:06.038 Host machine cpu: x86_64
00:02:06.038 Run-time dependency threads found: YES
00:02:06.038 Library dl found: YES
00:02:06.038 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:06.038 Run-time dependency json-c found: YES 0.17
00:02:06.038 Run-time dependency cmocka found: YES 1.1.7
00:02:06.038 Program pytest-3 found: NO
00:02:06.038 Program flake8 found: NO
00:02:06.038 Program misspell-fixer found: NO
00:02:06.038 Program restructuredtext-lint found: NO
00:02:06.038 Program valgrind found: YES (/usr/bin/valgrind)
00:02:06.038 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:06.038 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:06.038 Compiler for C supports arguments -Wwrite-strings: YES
00:02:06.038 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:06.038 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:06.038 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:06.038 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:06.038 Build targets in project: 8
00:02:06.038 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:02:06.038 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:02:06.038
00:02:06.038 libvfio-user 0.0.1
00:02:06.038
00:02:06.038 User defined options
00:02:06.038 buildtype : debug
00:02:06.038 default_library: static
00:02:06.038 libdir : /usr/local/lib
00:02:06.038
00:02:06.038 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:06.038 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:06.038 [1/36] Compiling C object samples/lspci.p/lspci.c.o
00:02:06.038 [2/36] Compiling C object samples/null.p/null.c.o
00:02:06.038 [3/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:02:06.038 [4/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:02:06.038 [5/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:02:06.038 [6/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:02:06.038 [7/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:02:06.038 [8/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:02:06.038 [9/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:02:06.038 [10/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:02:06.038 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:02:06.038 [12/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:02:06.038 [13/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:02:06.038 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:02:06.038 [15/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:02:06.038 [16/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:02:06.297 [17/36] Compiling C object test/unit_tests.p/mocks.c.o
00:02:06.297 [18/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:02:06.297 [19/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:02:06.297 [20/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:02:06.297 [21/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:02:06.297 [22/36] Compiling C object samples/server.p/server.c.o
00:02:06.297 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:02:06.297 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:02:06.297 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:02:06.297 [26/36] Compiling C object samples/client.p/client.c.o
00:02:06.297 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:02:06.297 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:02:06.297 [29/36] Linking static target lib/libvfio-user.a
00:02:06.297 [30/36] Linking target samples/client
00:02:06.297 [31/36] Linking target test/unit_tests
00:02:06.297 [32/36] Linking target samples/server
00:02:06.297 [33/36] Linking target samples/null
00:02:06.297 [34/36] Linking target samples/lspci
00:02:06.297 [35/36] Linking target samples/gpio-pci-idio-16
00:02:06.297 [36/36] Linking target samples/shadow_ioeventfd_server
00:02:06.297 INFO: autodetecting backend as ninja
00:02:06.297 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:06.297 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:06.554 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:06.554 ninja: no work to do. 00:02:09.845 CC lib/ut/ut.o 00:02:09.845 CC lib/log/log.o 00:02:09.845 CC lib/log/log_flags.o 00:02:09.845 CC lib/log/log_deprecated.o 00:02:09.845 CC lib/ut_mock/mock.o 00:02:09.845 LIB libspdk_ut.a 00:02:09.845 LIB libspdk_ut_mock.a 00:02:09.845 LIB libspdk_log.a 00:02:10.103 CXX lib/trace_parser/trace.o 00:02:10.103 CC lib/util/base64.o 00:02:10.103 CC lib/util/bit_array.o 00:02:10.103 CC lib/util/cpuset.o 00:02:10.103 CC lib/util/crc16.o 00:02:10.103 CC lib/util/crc32.o 00:02:10.103 CC lib/util/crc32c.o 00:02:10.103 CC lib/dma/dma.o 00:02:10.103 CC lib/util/crc32_ieee.o 00:02:10.103 CC lib/util/crc64.o 00:02:10.103 CC lib/ioat/ioat.o 00:02:10.103 CC lib/util/dif.o 00:02:10.103 CC lib/util/fd.o 00:02:10.103 CC lib/util/file.o 00:02:10.103 CC lib/util/hexlify.o 00:02:10.103 CC lib/util/iov.o 00:02:10.103 CC lib/util/math.o 00:02:10.103 CC lib/util/pipe.o 00:02:10.103 CC lib/util/strerror_tls.o 00:02:10.103 CC lib/util/string.o 00:02:10.103 CC lib/util/fd_group.o 00:02:10.103 CC lib/util/uuid.o 00:02:10.103 CC lib/util/xor.o 00:02:10.103 CC lib/util/zipf.o 00:02:10.103 CC lib/vfio_user/host/vfio_user_pci.o 00:02:10.103 CC lib/vfio_user/host/vfio_user.o 00:02:10.103 LIB libspdk_dma.a 00:02:10.361 LIB libspdk_ioat.a 00:02:10.361 LIB libspdk_vfio_user.a 00:02:10.361 LIB libspdk_util.a 00:02:10.361 LIB libspdk_trace_parser.a 00:02:10.620 CC lib/conf/conf.o 00:02:10.620 CC lib/env_dpdk/env.o 00:02:10.620 CC lib/env_dpdk/memory.o 00:02:10.620 CC lib/env_dpdk/pci.o 00:02:10.620 CC lib/env_dpdk/init.o 00:02:10.620 CC lib/env_dpdk/threads.o 00:02:10.620 CC lib/env_dpdk/pci_ioat.o 00:02:10.620 CC lib/env_dpdk/pci_virtio.o 00:02:10.620 CC lib/idxd/idxd.o 00:02:10.620 CC lib/env_dpdk/pci_vmd.o 00:02:10.620 CC lib/vmd/vmd.o 00:02:10.620 CC lib/idxd/idxd_user.o 00:02:10.620 CC lib/env_dpdk/pci_idxd.o 00:02:10.620 CC lib/vmd/led.o 00:02:10.620 CC lib/idxd/idxd_kernel.o 00:02:10.620 CC lib/env_dpdk/pci_event.o 00:02:10.620 CC lib/json/json_util.o 00:02:10.620 CC lib/json/json_parse.o 00:02:10.620 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:10.620 CC lib/env_dpdk/sigbus_handler.o 00:02:10.620 CC lib/env_dpdk/pci_dpdk.o 00:02:10.620 CC lib/rdma/common.o 00:02:10.620 CC lib/json/json_write.o 00:02:10.620 CC lib/rdma/rdma_verbs.o 00:02:10.620 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:10.878 LIB libspdk_conf.a 00:02:10.878 LIB libspdk_json.a 00:02:10.878 LIB libspdk_rdma.a 00:02:11.136 LIB libspdk_idxd.a 00:02:11.136 LIB libspdk_vmd.a 00:02:11.136 CC lib/jsonrpc/jsonrpc_server.o 00:02:11.136 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:11.136 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:11.136 CC lib/jsonrpc/jsonrpc_client.o 00:02:11.394 LIB libspdk_jsonrpc.a 00:02:11.653 LIB libspdk_env_dpdk.a 00:02:11.653 CC lib/rpc/rpc.o 00:02:11.912 LIB libspdk_rpc.a 00:02:12.171 CC lib/sock/sock_rpc.o 00:02:12.171 CC lib/sock/sock.o 00:02:12.171 CC lib/trace/trace.o 00:02:12.171 CC lib/trace/trace_flags.o 00:02:12.171 CC lib/trace/trace_rpc.o 00:02:12.171 CC lib/notify/notify.o 00:02:12.171 CC lib/notify/notify_rpc.o 00:02:12.171 LIB libspdk_notify.a 00:02:12.430 LIB libspdk_trace.a 00:02:12.430 LIB libspdk_sock.a 00:02:12.689 CC lib/thread/thread.o 00:02:12.689 CC lib/thread/iobuf.o 00:02:12.689 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:12.689 CC lib/nvme/nvme_ctrlr.o 00:02:12.689 CC lib/nvme/nvme_ns.o 
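The install command that completes just above is worth calling out: the libvfio-user artifacts are staged under SPDK's build tree via DESTDIR rather than installed into the live /usr/local. A minimal sketch of that staged install, with both paths copied verbatim from the log:

# Build in the configured directory, then let DESTDIR redirect the
# install tree under SPDK's build output instead of the real prefix.
set -euo pipefail

BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
STAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user

ninja -C "$BUILD"
DESTDIR="$STAGE" meson install --quiet -C "$BUILD"
# Headers and the static library now sit under $STAGE/usr/local/...

The "ninja: no work to do." line above is the expected result of this pattern: the install step re-enters an already-built tree, so only the file copy remains.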
00:02:12.689 CC lib/nvme/nvme_fabric.o 00:02:12.689 CC lib/nvme/nvme_ns_cmd.o 00:02:12.689 CC lib/nvme/nvme_pcie_common.o 00:02:12.689 CC lib/nvme/nvme.o 00:02:12.689 CC lib/nvme/nvme_pcie.o 00:02:12.689 CC lib/nvme/nvme_quirks.o 00:02:12.689 CC lib/nvme/nvme_qpair.o 00:02:12.689 CC lib/nvme/nvme_transport.o 00:02:12.689 CC lib/nvme/nvme_discovery.o 00:02:12.689 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:12.689 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:12.689 CC lib/nvme/nvme_tcp.o 00:02:12.689 CC lib/nvme/nvme_opal.o 00:02:12.689 CC lib/nvme/nvme_io_msg.o 00:02:12.690 CC lib/nvme/nvme_poll_group.o 00:02:12.690 CC lib/nvme/nvme_zns.o 00:02:12.690 CC lib/nvme/nvme_cuse.o 00:02:12.690 CC lib/nvme/nvme_vfio_user.o 00:02:12.690 CC lib/nvme/nvme_rdma.o 00:02:13.627 LIB libspdk_thread.a 00:02:13.627 CC lib/virtio/virtio.o 00:02:13.627 CC lib/virtio/virtio_vhost_user.o 00:02:13.627 CC lib/virtio/virtio_vfio_user.o 00:02:13.627 CC lib/virtio/virtio_pci.o 00:02:13.627 CC lib/blob/blobstore.o 00:02:13.627 CC lib/blob/request.o 00:02:13.627 CC lib/blob/zeroes.o 00:02:13.627 CC lib/blob/blob_bs_dev.o 00:02:13.627 CC lib/accel/accel.o 00:02:13.627 CC lib/accel/accel_rpc.o 00:02:13.627 CC lib/accel/accel_sw.o 00:02:13.627 CC lib/init/subsystem.o 00:02:13.627 CC lib/init/json_config.o 00:02:13.627 CC lib/init/rpc.o 00:02:13.627 CC lib/init/subsystem_rpc.o 00:02:13.627 CC lib/vfu_tgt/tgt_endpoint.o 00:02:13.627 CC lib/vfu_tgt/tgt_rpc.o 00:02:13.885 LIB libspdk_virtio.a 00:02:13.885 LIB libspdk_init.a 00:02:13.885 LIB libspdk_nvme.a 00:02:13.885 LIB libspdk_vfu_tgt.a 00:02:14.143 CC lib/event/reactor.o 00:02:14.143 CC lib/event/app.o 00:02:14.143 CC lib/event/log_rpc.o 00:02:14.143 CC lib/event/app_rpc.o 00:02:14.143 CC lib/event/scheduler_static.o 00:02:14.403 LIB libspdk_accel.a 00:02:14.403 LIB libspdk_event.a 00:02:14.661 CC lib/bdev/bdev.o 00:02:14.661 CC lib/bdev/bdev_rpc.o 00:02:14.661 CC lib/bdev/bdev_zone.o 00:02:14.661 CC lib/bdev/part.o 00:02:14.661 CC lib/bdev/scsi_nvme.o 00:02:15.246 LIB libspdk_blob.a 00:02:15.616 CC lib/lvol/lvol.o 00:02:15.616 CC lib/blobfs/blobfs.o 00:02:15.616 CC lib/blobfs/tree.o 00:02:15.946 LIB libspdk_lvol.a 00:02:16.204 LIB libspdk_blobfs.a 00:02:16.204 LIB libspdk_bdev.a 00:02:16.770 CC lib/ublk/ublk.o 00:02:16.770 CC lib/ublk/ublk_rpc.o 00:02:16.770 CC lib/nbd/nbd.o 00:02:16.770 CC lib/nbd/nbd_rpc.o 00:02:16.770 CC lib/ftl/ftl_init.o 00:02:16.770 CC lib/ftl/ftl_core.o 00:02:16.770 CC lib/ftl/ftl_layout.o 00:02:16.770 CC lib/ftl/ftl_io.o 00:02:16.770 CC lib/ftl/ftl_debug.o 00:02:16.770 CC lib/ftl/ftl_l2p.o 00:02:16.770 CC lib/nvmf/ctrlr_discovery.o 00:02:16.770 CC lib/ftl/ftl_sb.o 00:02:16.770 CC lib/scsi/lun.o 00:02:16.770 CC lib/nvmf/ctrlr.o 00:02:16.770 CC lib/scsi/dev.o 00:02:16.770 CC lib/scsi/port.o 00:02:16.770 CC lib/ftl/ftl_l2p_flat.o 00:02:16.770 CC lib/nvmf/ctrlr_bdev.o 00:02:16.770 CC lib/ftl/ftl_nv_cache.o 00:02:16.770 CC lib/nvmf/subsystem.o 00:02:16.770 CC lib/ftl/ftl_band.o 00:02:16.770 CC lib/nvmf/nvmf.o 00:02:16.770 CC lib/ftl/ftl_writer.o 00:02:16.770 CC lib/scsi/scsi.o 00:02:16.770 CC lib/ftl/ftl_band_ops.o 00:02:16.770 CC lib/nvmf/nvmf_rpc.o 00:02:16.770 CC lib/scsi/scsi_bdev.o 00:02:16.770 CC lib/nvmf/transport.o 00:02:16.770 CC lib/scsi/scsi_pr.o 00:02:16.770 CC lib/scsi/scsi_rpc.o 00:02:16.770 CC lib/nvmf/tcp.o 00:02:16.770 CC lib/scsi/task.o 00:02:16.770 CC lib/ftl/ftl_rq.o 00:02:16.770 CC lib/nvmf/vfio_user.o 00:02:16.770 CC lib/ftl/ftl_reloc.o 00:02:16.770 CC lib/nvmf/rdma.o 00:02:16.770 CC lib/ftl/ftl_l2p_cache.o 00:02:16.770 CC 
lib/ftl/ftl_p2l.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:16.770 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:16.770 CC lib/ftl/utils/ftl_conf.o 00:02:16.770 CC lib/ftl/utils/ftl_bitmap.o 00:02:16.770 CC lib/ftl/utils/ftl_md.o 00:02:16.770 CC lib/ftl/utils/ftl_mempool.o 00:02:16.770 CC lib/ftl/utils/ftl_property.o 00:02:16.770 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:16.770 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:16.770 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:16.770 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:16.770 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:16.770 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:16.770 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:16.770 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:16.770 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:16.770 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:16.770 CC lib/ftl/base/ftl_base_bdev.o 00:02:16.770 CC lib/ftl/base/ftl_base_dev.o 00:02:16.770 CC lib/ftl/ftl_trace.o 00:02:17.027 LIB libspdk_scsi.a 00:02:17.027 LIB libspdk_nbd.a 00:02:17.027 LIB libspdk_ublk.a 00:02:17.284 LIB libspdk_ftl.a 00:02:17.284 CC lib/iscsi/iscsi.o 00:02:17.284 CC lib/iscsi/conn.o 00:02:17.284 CC lib/iscsi/init_grp.o 00:02:17.284 CC lib/iscsi/md5.o 00:02:17.284 CC lib/iscsi/param.o 00:02:17.284 CC lib/iscsi/iscsi_rpc.o 00:02:17.284 CC lib/iscsi/portal_grp.o 00:02:17.284 CC lib/iscsi/tgt_node.o 00:02:17.284 CC lib/iscsi/iscsi_subsystem.o 00:02:17.284 CC lib/iscsi/task.o 00:02:17.284 CC lib/vhost/vhost.o 00:02:17.284 CC lib/vhost/vhost_rpc.o 00:02:17.284 CC lib/vhost/vhost_scsi.o 00:02:17.284 CC lib/vhost/vhost_blk.o 00:02:17.284 CC lib/vhost/rte_vhost_user.o 00:02:17.849 LIB libspdk_nvmf.a 00:02:17.849 LIB libspdk_vhost.a 00:02:18.108 LIB libspdk_iscsi.a 00:02:18.365 CC module/env_dpdk/env_dpdk_rpc.o 00:02:18.622 CC module/vfu_device/vfu_virtio_scsi.o 00:02:18.622 CC module/vfu_device/vfu_virtio_blk.o 00:02:18.622 CC module/vfu_device/vfu_virtio.o 00:02:18.622 CC module/vfu_device/vfu_virtio_rpc.o 00:02:18.622 LIB libspdk_env_dpdk_rpc.a 00:02:18.622 CC module/blob/bdev/blob_bdev.o 00:02:18.622 CC module/sock/posix/posix.o 00:02:18.622 CC module/accel/dsa/accel_dsa_rpc.o 00:02:18.622 CC module/accel/dsa/accel_dsa.o 00:02:18.622 CC module/accel/ioat/accel_ioat.o 00:02:18.622 CC module/accel/ioat/accel_ioat_rpc.o 00:02:18.622 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:18.622 CC module/accel/error/accel_error.o 00:02:18.622 CC module/scheduler/gscheduler/gscheduler.o 00:02:18.622 CC module/accel/iaa/accel_iaa.o 00:02:18.622 CC module/accel/error/accel_error_rpc.o 00:02:18.622 CC module/accel/iaa/accel_iaa_rpc.o 00:02:18.622 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:18.878 LIB libspdk_scheduler_gscheduler.a 00:02:18.878 LIB libspdk_scheduler_dpdk_governor.a 00:02:18.878 LIB libspdk_accel_error.a 00:02:18.878 LIB libspdk_accel_ioat.a 00:02:18.878 LIB libspdk_scheduler_dynamic.a 00:02:18.878 LIB libspdk_blob_bdev.a 00:02:18.878 LIB libspdk_accel_iaa.a 00:02:18.878 LIB libspdk_accel_dsa.a 00:02:18.878 LIB libspdk_vfu_device.a 00:02:19.135 LIB 
libspdk_sock_posix.a 00:02:19.135 CC module/bdev/gpt/vbdev_gpt.o 00:02:19.135 CC module/bdev/gpt/gpt.o 00:02:19.135 CC module/bdev/iscsi/bdev_iscsi.o 00:02:19.135 CC module/bdev/error/vbdev_error.o 00:02:19.135 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:19.135 CC module/bdev/raid/bdev_raid.o 00:02:19.135 CC module/bdev/delay/vbdev_delay.o 00:02:19.135 CC module/bdev/error/vbdev_error_rpc.o 00:02:19.135 CC module/bdev/raid/bdev_raid_rpc.o 00:02:19.135 CC module/bdev/passthru/vbdev_passthru.o 00:02:19.135 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:19.135 CC module/bdev/raid/bdev_raid_sb.o 00:02:19.135 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:19.135 CC module/bdev/raid/raid0.o 00:02:19.135 CC module/bdev/raid/raid1.o 00:02:19.135 CC module/bdev/raid/concat.o 00:02:19.135 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:19.135 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:19.135 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:19.135 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:19.135 CC module/bdev/nvme/nvme_rpc.o 00:02:19.135 CC module/bdev/nvme/bdev_nvme.o 00:02:19.135 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:19.135 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:19.135 CC module/bdev/nvme/bdev_mdns_client.o 00:02:19.135 CC module/bdev/nvme/vbdev_opal.o 00:02:19.135 CC module/bdev/null/bdev_null.o 00:02:19.135 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:19.135 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:19.135 CC module/bdev/null/bdev_null_rpc.o 00:02:19.135 CC module/bdev/split/vbdev_split.o 00:02:19.135 CC module/bdev/split/vbdev_split_rpc.o 00:02:19.135 CC module/bdev/lvol/vbdev_lvol.o 00:02:19.135 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:19.135 CC module/bdev/ftl/bdev_ftl.o 00:02:19.135 CC module/bdev/malloc/bdev_malloc.o 00:02:19.135 CC module/bdev/aio/bdev_aio.o 00:02:19.135 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:19.136 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:19.136 CC module/bdev/aio/bdev_aio_rpc.o 00:02:19.136 CC module/blobfs/bdev/blobfs_bdev.o 00:02:19.136 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:19.393 LIB libspdk_bdev_split.a 00:02:19.393 LIB libspdk_bdev_gpt.a 00:02:19.393 LIB libspdk_blobfs_bdev.a 00:02:19.393 LIB libspdk_bdev_error.a 00:02:19.393 LIB libspdk_bdev_null.a 00:02:19.393 LIB libspdk_bdev_passthru.a 00:02:19.393 LIB libspdk_bdev_ftl.a 00:02:19.393 LIB libspdk_bdev_iscsi.a 00:02:19.393 LIB libspdk_bdev_zone_block.a 00:02:19.393 LIB libspdk_bdev_aio.a 00:02:19.393 LIB libspdk_bdev_delay.a 00:02:19.393 LIB libspdk_bdev_malloc.a 00:02:19.650 LIB libspdk_bdev_lvol.a 00:02:19.650 LIB libspdk_bdev_virtio.a 00:02:19.650 LIB libspdk_bdev_raid.a 00:02:20.584 LIB libspdk_bdev_nvme.a 00:02:21.151 CC module/event/subsystems/sock/sock.o 00:02:21.151 CC module/event/subsystems/scheduler/scheduler.o 00:02:21.151 CC module/event/subsystems/vmd/vmd.o 00:02:21.151 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:21.151 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:21.151 CC module/event/subsystems/iobuf/iobuf.o 00:02:21.151 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:21.151 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:21.151 LIB libspdk_event_sock.a 00:02:21.151 LIB libspdk_event_scheduler.a 00:02:21.151 LIB libspdk_event_vmd.a 00:02:21.151 LIB libspdk_event_vfu_tgt.a 00:02:21.151 LIB libspdk_event_vhost_blk.a 00:02:21.151 LIB libspdk_event_iobuf.a 00:02:21.410 CC module/event/subsystems/accel/accel.o 00:02:21.668 LIB libspdk_event_accel.a 00:02:21.927 CC module/event/subsystems/bdev/bdev.o 00:02:21.927 LIB 
libspdk_event_bdev.a 00:02:22.493 CC module/event/subsystems/nbd/nbd.o 00:02:22.493 CC module/event/subsystems/ublk/ublk.o 00:02:22.493 CC module/event/subsystems/scsi/scsi.o 00:02:22.493 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:22.493 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:22.493 LIB libspdk_event_nbd.a 00:02:22.493 LIB libspdk_event_ublk.a 00:02:22.493 LIB libspdk_event_scsi.a 00:02:22.493 LIB libspdk_event_nvmf.a 00:02:22.751 CC module/event/subsystems/iscsi/iscsi.o 00:02:22.751 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:23.008 LIB libspdk_event_vhost_scsi.a 00:02:23.008 LIB libspdk_event_iscsi.a 00:02:23.268 TEST_HEADER include/spdk/accel.h 00:02:23.268 TEST_HEADER include/spdk/accel_module.h 00:02:23.268 TEST_HEADER include/spdk/barrier.h 00:02:23.268 TEST_HEADER include/spdk/base64.h 00:02:23.268 TEST_HEADER include/spdk/bdev.h 00:02:23.268 TEST_HEADER include/spdk/assert.h 00:02:23.268 TEST_HEADER include/spdk/bdev_module.h 00:02:23.268 TEST_HEADER include/spdk/bit_array.h 00:02:23.268 TEST_HEADER include/spdk/bit_pool.h 00:02:23.268 TEST_HEADER include/spdk/blob_bdev.h 00:02:23.268 TEST_HEADER include/spdk/bdev_zone.h 00:02:23.268 TEST_HEADER include/spdk/blobfs.h 00:02:23.268 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:23.268 TEST_HEADER include/spdk/blob.h 00:02:23.268 TEST_HEADER include/spdk/conf.h 00:02:23.268 TEST_HEADER include/spdk/config.h 00:02:23.268 TEST_HEADER include/spdk/cpuset.h 00:02:23.268 TEST_HEADER include/spdk/crc16.h 00:02:23.268 TEST_HEADER include/spdk/crc32.h 00:02:23.268 CC test/rpc_client/rpc_client_test.o 00:02:23.268 TEST_HEADER include/spdk/crc64.h 00:02:23.268 TEST_HEADER include/spdk/dif.h 00:02:23.268 TEST_HEADER include/spdk/dma.h 00:02:23.268 TEST_HEADER include/spdk/env_dpdk.h 00:02:23.268 TEST_HEADER include/spdk/endian.h 00:02:23.268 TEST_HEADER include/spdk/event.h 00:02:23.268 TEST_HEADER include/spdk/env.h 00:02:23.268 TEST_HEADER include/spdk/fd_group.h 00:02:23.268 CXX app/trace/trace.o 00:02:23.268 TEST_HEADER include/spdk/fd.h 00:02:23.268 TEST_HEADER include/spdk/ftl.h 00:02:23.268 TEST_HEADER include/spdk/file.h 00:02:23.268 CC app/spdk_top/spdk_top.o 00:02:23.268 TEST_HEADER include/spdk/hexlify.h 00:02:23.268 TEST_HEADER include/spdk/gpt_spec.h 00:02:23.268 TEST_HEADER include/spdk/histogram_data.h 00:02:23.268 TEST_HEADER include/spdk/idxd.h 00:02:23.268 TEST_HEADER include/spdk/idxd_spec.h 00:02:23.268 TEST_HEADER include/spdk/init.h 00:02:23.268 TEST_HEADER include/spdk/ioat.h 00:02:23.268 TEST_HEADER include/spdk/ioat_spec.h 00:02:23.268 CC app/spdk_nvme_perf/perf.o 00:02:23.268 TEST_HEADER include/spdk/iscsi_spec.h 00:02:23.268 TEST_HEADER include/spdk/json.h 00:02:23.268 CC app/spdk_lspci/spdk_lspci.o 00:02:23.268 TEST_HEADER include/spdk/jsonrpc.h 00:02:23.268 TEST_HEADER include/spdk/likely.h 00:02:23.268 TEST_HEADER include/spdk/log.h 00:02:23.268 CC app/trace_record/trace_record.o 00:02:23.268 TEST_HEADER include/spdk/lvol.h 00:02:23.268 TEST_HEADER include/spdk/memory.h 00:02:23.268 TEST_HEADER include/spdk/mmio.h 00:02:23.268 TEST_HEADER include/spdk/nbd.h 00:02:23.268 TEST_HEADER include/spdk/notify.h 00:02:23.268 TEST_HEADER include/spdk/nvme.h 00:02:23.268 CC app/spdk_nvme_discover/discovery_aer.o 00:02:23.268 TEST_HEADER include/spdk/nvme_intel.h 00:02:23.268 CC app/spdk_nvme_identify/identify.o 00:02:23.268 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:23.268 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:23.268 TEST_HEADER include/spdk/nvme_spec.h 00:02:23.268 TEST_HEADER 
include/spdk/nvme_zns.h 00:02:23.268 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:23.268 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:23.268 TEST_HEADER include/spdk/nvmf.h 00:02:23.268 TEST_HEADER include/spdk/nvmf_spec.h 00:02:23.268 TEST_HEADER include/spdk/nvmf_transport.h 00:02:23.268 TEST_HEADER include/spdk/opal.h 00:02:23.268 TEST_HEADER include/spdk/opal_spec.h 00:02:23.268 TEST_HEADER include/spdk/pci_ids.h 00:02:23.268 TEST_HEADER include/spdk/pipe.h 00:02:23.268 TEST_HEADER include/spdk/queue.h 00:02:23.268 TEST_HEADER include/spdk/reduce.h 00:02:23.268 TEST_HEADER include/spdk/rpc.h 00:02:23.268 TEST_HEADER include/spdk/scheduler.h 00:02:23.268 TEST_HEADER include/spdk/scsi.h 00:02:23.268 TEST_HEADER include/spdk/scsi_spec.h 00:02:23.268 TEST_HEADER include/spdk/sock.h 00:02:23.268 TEST_HEADER include/spdk/stdinc.h 00:02:23.268 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:23.268 TEST_HEADER include/spdk/thread.h 00:02:23.268 TEST_HEADER include/spdk/string.h 00:02:23.268 TEST_HEADER include/spdk/trace.h 00:02:23.268 TEST_HEADER include/spdk/trace_parser.h 00:02:23.268 CC app/vhost/vhost.o 00:02:23.268 TEST_HEADER include/spdk/tree.h 00:02:23.268 TEST_HEADER include/spdk/ublk.h 00:02:23.268 TEST_HEADER include/spdk/util.h 00:02:23.268 TEST_HEADER include/spdk/uuid.h 00:02:23.268 TEST_HEADER include/spdk/version.h 00:02:23.268 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:23.268 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:23.268 TEST_HEADER include/spdk/vhost.h 00:02:23.268 TEST_HEADER include/spdk/vmd.h 00:02:23.268 TEST_HEADER include/spdk/xor.h 00:02:23.268 TEST_HEADER include/spdk/zipf.h 00:02:23.268 CXX test/cpp_headers/accel.o 00:02:23.268 CXX test/cpp_headers/accel_module.o 00:02:23.268 CXX test/cpp_headers/assert.o 00:02:23.268 CC app/iscsi_tgt/iscsi_tgt.o 00:02:23.268 CXX test/cpp_headers/barrier.o 00:02:23.268 CXX test/cpp_headers/bdev.o 00:02:23.268 CXX test/cpp_headers/base64.o 00:02:23.268 CXX test/cpp_headers/bdev_module.o 00:02:23.268 CXX test/cpp_headers/bdev_zone.o 00:02:23.268 CC app/nvmf_tgt/nvmf_main.o 00:02:23.268 CXX test/cpp_headers/bit_array.o 00:02:23.268 CXX test/cpp_headers/bit_pool.o 00:02:23.268 CXX test/cpp_headers/blob_bdev.o 00:02:23.268 CXX test/cpp_headers/blobfs_bdev.o 00:02:23.268 CXX test/cpp_headers/blobfs.o 00:02:23.268 CXX test/cpp_headers/conf.o 00:02:23.268 CXX test/cpp_headers/blob.o 00:02:23.268 CXX test/cpp_headers/cpuset.o 00:02:23.268 CXX test/cpp_headers/config.o 00:02:23.268 CXX test/cpp_headers/crc16.o 00:02:23.268 CXX test/cpp_headers/crc32.o 00:02:23.268 CXX test/cpp_headers/crc64.o 00:02:23.268 CXX test/cpp_headers/dif.o 00:02:23.268 CC app/spdk_dd/spdk_dd.o 00:02:23.268 CXX test/cpp_headers/dma.o 00:02:23.268 CXX test/cpp_headers/endian.o 00:02:23.268 CXX test/cpp_headers/env_dpdk.o 00:02:23.268 CXX test/cpp_headers/env.o 00:02:23.268 CXX test/cpp_headers/event.o 00:02:23.268 CXX test/cpp_headers/fd_group.o 00:02:23.268 CXX test/cpp_headers/fd.o 00:02:23.268 CXX test/cpp_headers/file.o 00:02:23.268 CXX test/cpp_headers/ftl.o 00:02:23.268 CXX test/cpp_headers/gpt_spec.o 00:02:23.268 CXX test/cpp_headers/hexlify.o 00:02:23.268 CXX test/cpp_headers/histogram_data.o 00:02:23.268 CXX test/cpp_headers/idxd.o 00:02:23.268 CXX test/cpp_headers/idxd_spec.o 00:02:23.268 CXX test/cpp_headers/init.o 00:02:23.533 CC test/app/jsoncat/jsoncat.o 00:02:23.533 CC test/app/histogram_perf/histogram_perf.o 00:02:23.533 CC app/spdk_tgt/spdk_tgt.o 00:02:23.533 CC test/env/memory/memory_ut.o 00:02:23.533 CC test/app/stub/stub.o 00:02:23.533 CC 
test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:23.533 CC test/nvme/aer/aer.o 00:02:23.533 CC test/env/pci/pci_ut.o 00:02:23.533 CC test/env/vtophys/vtophys.o 00:02:23.533 CC test/nvme/err_injection/err_injection.o 00:02:23.533 CC test/nvme/reset/reset.o 00:02:23.533 CC test/thread/poller_perf/poller_perf.o 00:02:23.533 CC test/nvme/startup/startup.o 00:02:23.533 CC test/nvme/cuse/cuse.o 00:02:23.533 CC test/nvme/simple_copy/simple_copy.o 00:02:23.533 CC test/nvme/e2edp/nvme_dp.o 00:02:23.533 CC test/nvme/sgl/sgl.o 00:02:23.533 CXX test/cpp_headers/ioat.o 00:02:23.533 CC test/nvme/fused_ordering/fused_ordering.o 00:02:23.533 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:23.533 CC test/nvme/overhead/overhead.o 00:02:23.533 CC test/nvme/connect_stress/connect_stress.o 00:02:23.533 CC test/event/reactor/reactor.o 00:02:23.533 CC test/nvme/boot_partition/boot_partition.o 00:02:23.533 CC test/nvme/reserve/reserve.o 00:02:23.533 CC test/nvme/compliance/nvme_compliance.o 00:02:23.533 CC test/thread/lock/spdk_lock.o 00:02:23.533 CC test/event/reactor_perf/reactor_perf.o 00:02:23.533 CC test/accel/dif/dif.o 00:02:23.533 CC test/event/event_perf/event_perf.o 00:02:23.533 CC test/nvme/fdp/fdp.o 00:02:23.533 CC examples/sock/hello_world/hello_sock.o 00:02:23.533 CC examples/idxd/perf/perf.o 00:02:23.533 CC examples/ioat/perf/perf.o 00:02:23.533 CC examples/ioat/verify/verify.o 00:02:23.533 CC test/event/app_repeat/app_repeat.o 00:02:23.533 CC test/app/bdev_svc/bdev_svc.o 00:02:23.533 CC examples/nvme/hello_world/hello_world.o 00:02:23.533 CC examples/nvme/abort/abort.o 00:02:23.533 CC examples/nvme/arbitration/arbitration.o 00:02:23.533 CC app/fio/nvme/fio_plugin.o 00:02:23.533 CC examples/vmd/lsvmd/lsvmd.o 00:02:23.533 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:23.533 CC examples/vmd/led/led.o 00:02:23.533 CC examples/nvme/hotplug/hotplug.o 00:02:23.533 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:23.533 CC examples/accel/perf/accel_perf.o 00:02:23.533 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:23.533 CC examples/nvme/reconnect/reconnect.o 00:02:23.533 CC examples/util/zipf/zipf.o 00:02:23.533 CC test/dma/test_dma/test_dma.o 00:02:23.533 CC examples/nvmf/nvmf/nvmf.o 00:02:23.533 CC test/blobfs/mkfs/mkfs.o 00:02:23.533 LINK spdk_lspci 00:02:23.533 CC examples/blob/hello_world/hello_blob.o 00:02:23.533 CC test/bdev/bdevio/bdevio.o 00:02:23.533 CC test/event/scheduler/scheduler.o 00:02:23.533 CC examples/bdev/hello_world/hello_bdev.o 00:02:23.533 CC examples/blob/cli/blobcli.o 00:02:23.533 CC examples/bdev/bdevperf/bdevperf.o 00:02:23.533 CC examples/thread/thread/thread_ex.o 00:02:23.533 CC test/env/mem_callbacks/mem_callbacks.o 00:02:23.533 CC app/fio/bdev/fio_plugin.o 00:02:23.533 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:23.533 LINK rpc_client_test 00:02:23.533 CC test/lvol/esnap/esnap.o 00:02:23.533 LINK spdk_nvme_discover 00:02:23.533 CXX test/cpp_headers/ioat_spec.o 00:02:23.533 LINK jsoncat 00:02:23.533 CXX test/cpp_headers/iscsi_spec.o 00:02:23.533 CXX test/cpp_headers/json.o 00:02:23.533 LINK histogram_perf 00:02:23.533 CXX test/cpp_headers/jsonrpc.o 00:02:23.533 CXX test/cpp_headers/likely.o 00:02:23.533 LINK interrupt_tgt 00:02:23.533 CXX test/cpp_headers/log.o 00:02:23.533 CXX test/cpp_headers/lvol.o 00:02:23.533 CXX test/cpp_headers/memory.o 00:02:23.533 CXX test/cpp_headers/mmio.o 00:02:23.533 CXX test/cpp_headers/nbd.o 00:02:23.533 CXX test/cpp_headers/notify.o 00:02:23.533 CXX test/cpp_headers/nvme.o 00:02:23.533 CXX test/cpp_headers/nvme_intel.o 
00:02:23.533 LINK env_dpdk_post_init 00:02:23.533 CXX test/cpp_headers/nvme_ocssd.o 00:02:23.533 LINK spdk_trace_record 00:02:23.533 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:23.533 CXX test/cpp_headers/nvme_spec.o 00:02:23.533 CXX test/cpp_headers/nvme_zns.o 00:02:23.533 LINK vhost 00:02:23.533 CXX test/cpp_headers/nvmf_cmd.o 00:02:23.533 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:23.533 LINK nvmf_tgt 00:02:23.533 CXX test/cpp_headers/nvmf.o 00:02:23.533 LINK vtophys 00:02:23.533 CXX test/cpp_headers/nvmf_spec.o 00:02:23.533 CXX test/cpp_headers/nvmf_transport.o 00:02:23.533 CXX test/cpp_headers/opal.o 00:02:23.533 LINK reactor 00:02:23.533 CXX test/cpp_headers/opal_spec.o 00:02:23.533 LINK poller_perf 00:02:23.533 CXX test/cpp_headers/pci_ids.o 00:02:23.533 CXX test/cpp_headers/pipe.o 00:02:23.533 CXX test/cpp_headers/queue.o 00:02:23.533 CXX test/cpp_headers/reduce.o 00:02:23.796 LINK lsvmd 00:02:23.796 LINK reactor_perf 00:02:23.796 LINK event_perf 00:02:23.796 LINK stub 00:02:23.796 CXX test/cpp_headers/rpc.o 00:02:23.796 LINK iscsi_tgt 00:02:23.796 CXX test/cpp_headers/scheduler.o 00:02:23.796 LINK app_repeat 00:02:23.796 CXX test/cpp_headers/scsi.o 00:02:23.796 LINK zipf 00:02:23.797 CXX test/cpp_headers/scsi_spec.o 00:02:23.797 LINK led 00:02:23.797 LINK boot_partition 00:02:23.797 LINK startup 00:02:23.797 LINK connect_stress 00:02:23.797 LINK err_injection 00:02:23.797 LINK doorbell_aers 00:02:23.797 CXX test/cpp_headers/sock.o 00:02:23.797 LINK pmr_persistence 00:02:23.797 LINK reserve 00:02:23.797 LINK spdk_tgt 00:02:23.797 LINK bdev_svc 00:02:23.797 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:02:23.797 struct spdk_nvme_fdp_ruhs ruhs; 00:02:23.797 ^ 00:02:23.797 CXX test/cpp_headers/stdinc.o 00:02:23.797 LINK cmb_copy 00:02:23.797 LINK fused_ordering 00:02:23.797 LINK ioat_perf 00:02:23.797 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:23.797 LINK reset 00:02:23.797 LINK verify 00:02:23.797 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:23.797 LINK hotplug 00:02:23.797 LINK simple_copy 00:02:23.797 LINK hello_sock 00:02:23.797 LINK nvme_dp 00:02:23.797 LINK hello_world 00:02:23.797 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:23.797 LINK mkfs 00:02:23.797 LINK fdp 00:02:23.797 LINK scheduler 00:02:23.797 LINK aer 00:02:23.797 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:23.797 LINK sgl 00:02:23.797 LINK overhead 00:02:23.797 LINK spdk_trace 00:02:23.797 CXX test/cpp_headers/string.o 00:02:23.797 CXX test/cpp_headers/thread.o 00:02:23.797 LINK hello_blob 00:02:23.797 CXX test/cpp_headers/trace.o 00:02:23.797 CXX test/cpp_headers/trace_parser.o 00:02:23.797 LINK thread 00:02:23.797 CXX test/cpp_headers/tree.o 00:02:23.797 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:23.797 LINK hello_bdev 00:02:23.797 CXX test/cpp_headers/ublk.o 00:02:23.797 CXX test/cpp_headers/util.o 00:02:23.797 CXX test/cpp_headers/uuid.o 00:02:23.797 CXX test/cpp_headers/version.o 00:02:23.797 CXX test/cpp_headers/vfio_user_pci.o 00:02:23.797 CXX test/cpp_headers/vfio_user_spec.o 00:02:23.797 LINK idxd_perf 00:02:23.797 CXX test/cpp_headers/vhost.o 00:02:23.797 CXX test/cpp_headers/vmd.o 00:02:23.797 CXX test/cpp_headers/xor.o 00:02:23.797 CXX test/cpp_headers/zipf.o 00:02:23.797 LINK reconnect 00:02:23.797 LINK nvmf 00:02:24.076 LINK dif 00:02:24.076 LINK test_dma 00:02:24.076 LINK abort 00:02:24.076 LINK nvme_compliance 
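The lone compiler diagnostic above ("field 'ruhs' with variable sized type ... not at the end of a struct or class is a GNU extension") fires when a type ending in a flexible array member is embedded before other fields. A minimal reproduction follows; it assumes nothing about SPDK's real definitions beyond what the diagnostic itself states, and the struct and field names are illustrative stand-ins, not spdk_nvme_fdp_ruhs:

# Reproduce -Wgnu-variable-sized-type-not-at-end with a toy struct.
cat > /tmp/ruhs_warn.c <<'EOF'
struct ruhs {
    unsigned short nruhsd;
    unsigned long long desc[];   /* flexible array member */
};

struct wrapper {
    struct ruhs ruhs;            /* variable sized type, not at the end */
    unsigned long long storage[16];
};

int main(void) { struct wrapper w = {0}; (void)w; return 0; }
EOF
clang -c /tmp/ruhs_warn.c -o /tmp/ruhs_warn.o
# expected: warning: field 'ruhs' with variable sized type 'struct ruhs'
# not at the end of a struct or class is a GNU extension
# [-Wgnu-variable-sized-type-not-at-end]

Clang emits this warning by default, which is why it surfaces here without any extra warning flags and is tallied as "1 warning generated." a few entries below.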
00:02:24.076 LINK arbitration 00:02:24.076 LINK spdk_dd 00:02:24.076 LINK bdevio 00:02:24.076 LINK nvme_manage 00:02:24.076 LINK pci_ut 00:02:24.076 LINK nvme_fuzz 00:02:24.076 1 warning generated. 00:02:24.076 LINK mem_callbacks 00:02:24.076 LINK accel_perf 00:02:24.335 LINK spdk_nvme 00:02:24.335 LINK blobcli 00:02:24.335 LINK llvm_vfio_fuzz 00:02:24.335 LINK vhost_fuzz 00:02:24.335 LINK spdk_bdev 00:02:24.335 LINK spdk_nvme_identify 00:02:24.335 LINK memory_ut 00:02:24.335 LINK spdk_nvme_perf 00:02:24.592 LINK spdk_top 00:02:24.592 LINK bdevperf 00:02:24.593 LINK llvm_nvme_fuzz 00:02:24.849 LINK cuse 00:02:24.849 LINK spdk_lock 00:02:25.415 LINK iscsi_fuzz 00:02:27.317 LINK esnap 00:02:27.575 00:02:27.575 real 0m23.908s 00:02:27.575 user 4m36.473s 00:02:27.575 sys 1m53.222s 00:02:27.575 11:25:56 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:27.575 11:25:56 -- common/autotest_common.sh@10 -- $ set +x 00:02:27.575 ************************************ 00:02:27.575 END TEST make 00:02:27.575 ************************************ 00:02:27.834 11:25:57 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:27.834 11:25:57 -- nvmf/common.sh@7 -- # uname -s 00:02:27.834 11:25:57 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:27.834 11:25:57 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:27.834 11:25:57 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:27.834 11:25:57 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:27.834 11:25:57 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:27.834 11:25:57 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:27.834 11:25:57 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:27.834 11:25:57 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:27.834 11:25:57 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:27.834 11:25:57 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:27.834 11:25:57 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:27.834 11:25:57 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:27.834 11:25:57 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:27.834 11:25:57 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:27.834 11:25:57 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:27.834 11:25:57 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:27.834 11:25:57 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:27.834 11:25:57 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:27.834 11:25:57 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:27.834 11:25:57 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:27.834 11:25:57 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:27.834 11:25:57 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:27.834 11:25:57 -- paths/export.sh@5 -- # export PATH 00:02:27.834 11:25:57 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:27.834 11:25:57 -- nvmf/common.sh@46 -- # : 0 00:02:27.835 11:25:57 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:27.835 11:25:57 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:27.835 11:25:57 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:27.835 11:25:57 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:27.835 11:25:57 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:27.835 11:25:57 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:02:27.835 11:25:57 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:27.835 11:25:57 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:27.835 11:25:57 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:27.835 11:25:57 -- spdk/autotest.sh@32 -- # uname -s 00:02:27.835 11:25:57 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:27.835 11:25:57 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:27.835 11:25:57 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:27.835 11:25:57 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:27.835 11:25:57 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:27.835 11:25:57 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:27.835 11:25:57 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:27.835 11:25:57 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:27.835 11:25:57 -- spdk/autotest.sh@48 -- # udevadm_pid=1989952 00:02:27.835 11:25:57 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:27.835 11:25:57 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:27.835 11:25:57 -- spdk/autotest.sh@54 -- # echo 1989954 00:02:27.835 11:25:57 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:27.835 11:25:57 -- spdk/autotest.sh@56 -- # echo 1989955 00:02:27.835 11:25:57 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:02:27.835 11:25:57 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:27.835 11:25:57 -- spdk/autotest.sh@60 -- # echo 1989956 00:02:27.835 11:25:57 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:27.835 11:25:57 -- spdk/autotest.sh@62 -- # echo 1989957 00:02:27.835 11:25:57 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:27.835 11:25:57 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:27.835 11:25:57 -- common/autotest_common.sh@712 -- # xtrace_disable 00:02:27.835 11:25:57 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:27.835 11:25:57 -- common/autotest_common.sh@10 -- # set +x 00:02:27.835 11:25:57 -- spdk/autotest.sh@70 -- # create_test_list 00:02:27.835 11:25:57 -- common/autotest_common.sh@736 -- # xtrace_disable 00:02:27.835 11:25:57 -- common/autotest_common.sh@10 -- # set +x 00:02:27.835 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:27.835 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:27.835 11:25:57 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:27.835 11:25:57 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:27.835 11:25:57 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:27.835 11:25:57 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:27.835 11:25:57 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:27.835 11:25:57 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:27.835 11:25:57 -- common/autotest_common.sh@1440 -- # uname 00:02:27.835 11:25:57 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:02:27.835 11:25:57 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:27.835 11:25:57 -- common/autotest_common.sh@1460 -- # uname 00:02:27.835 11:25:57 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:02:27.835 11:25:57 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:02:27.835 11:25:57 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:02:27.835 11:25:57 -- spdk/autotest.sh@83 -- # hash lcov 00:02:27.835 11:25:57 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:02:27.835 11:25:57 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:02:27.835 11:25:57 -- common/autotest_common.sh@712 -- # xtrace_disable 00:02:27.835 11:25:57 -- common/autotest_common.sh@10 -- # set +x 00:02:27.835 11:25:57 -- spdk/autotest.sh@102 -- # rm -f 00:02:27.835 11:25:57 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:31.125 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:31.385 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:31.385 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:31.385 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:31.385 0000:00:04.3 (8086 
2021): Already using the ioatdma driver 00:02:31.385 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:31.385 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:31.385 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:31.385 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:31.643 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:31.643 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:31.643 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:31.643 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:31.643 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:31.643 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:31.643 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:31.643 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:02:31.643 11:26:01 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:02:31.643 11:26:01 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:02:31.643 11:26:01 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:02:31.643 11:26:01 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:02:31.643 11:26:01 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:02:31.643 11:26:01 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:02:31.643 11:26:01 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:02:31.643 11:26:01 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:31.643 11:26:01 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:02:31.643 11:26:01 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:02:31.643 11:26:01 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:02:31.643 11:26:01 -- spdk/autotest.sh@121 -- # grep -v p 00:02:31.643 11:26:01 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:02:31.643 11:26:01 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:02:31.643 11:26:01 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:02:31.643 11:26:01 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:02:31.643 11:26:01 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:31.902 No valid GPT data, bailing 00:02:31.902 11:26:01 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:31.902 11:26:01 -- scripts/common.sh@393 -- # pt= 00:02:31.902 11:26:01 -- scripts/common.sh@394 -- # return 1 00:02:31.902 11:26:01 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:31.902 1+0 records in 00:02:31.902 1+0 records out 00:02:31.902 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0049286 s, 213 MB/s 00:02:31.902 11:26:01 -- spdk/autotest.sh@129 -- # sync 00:02:31.902 11:26:01 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:31.902 11:26:01 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:31.902 11:26:01 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:40.020 11:26:08 -- spdk/autotest.sh@135 -- # uname -s 00:02:40.020 11:26:08 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:02:40.020 11:26:08 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:40.020 11:26:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:02:40.020 11:26:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:02:40.020 11:26:08 -- 
common/autotest_common.sh@10 -- # set +x 00:02:40.020 ************************************ 00:02:40.020 START TEST setup.sh 00:02:40.020 ************************************ 00:02:40.020 11:26:08 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:40.020 * Looking for test storage... 00:02:40.020 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:40.020 11:26:08 -- setup/test-setup.sh@10 -- # uname -s 00:02:40.020 11:26:08 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:40.020 11:26:08 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:40.020 11:26:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:02:40.020 11:26:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:02:40.020 11:26:08 -- common/autotest_common.sh@10 -- # set +x 00:02:40.020 ************************************ 00:02:40.020 START TEST acl 00:02:40.020 ************************************ 00:02:40.020 11:26:08 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:40.020 * Looking for test storage... 00:02:40.020 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:40.020 11:26:08 -- setup/acl.sh@10 -- # get_zoned_devs 00:02:40.020 11:26:08 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:02:40.020 11:26:08 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:02:40.020 11:26:08 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:02:40.020 11:26:08 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:02:40.020 11:26:08 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:02:40.020 11:26:08 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:02:40.020 11:26:08 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:40.020 11:26:08 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:02:40.020 11:26:08 -- setup/acl.sh@12 -- # devs=() 00:02:40.020 11:26:08 -- setup/acl.sh@12 -- # declare -a devs 00:02:40.020 11:26:08 -- setup/acl.sh@13 -- # drivers=() 00:02:40.020 11:26:08 -- setup/acl.sh@13 -- # declare -A drivers 00:02:40.020 11:26:08 -- setup/acl.sh@51 -- # setup reset 00:02:40.020 11:26:08 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:40.020 11:26:08 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:42.554 11:26:11 -- setup/acl.sh@52 -- # collect_setup_devs 00:02:42.554 11:26:11 -- setup/acl.sh@16 -- # local dev driver 00:02:42.554 11:26:11 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.554 11:26:11 -- setup/acl.sh@15 -- # setup output status 00:02:42.554 11:26:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:42.554 11:26:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:45.846 Hugepages 00:02:45.846 node hugesize free / total 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 
1048576kB == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 00:02:45.846 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:02:45.846 11:26:14 -- 
setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.846 11:26:14 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:45.846 11:26:14 -- setup/acl.sh@20 -- # continue 00:02:45.846 11:26:14 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.847 11:26:15 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:02:45.847 11:26:15 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:45.847 11:26:15 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:02:45.847 11:26:15 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:45.847 11:26:15 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:45.847 11:26:15 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:45.847 11:26:15 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:02:45.847 11:26:15 -- setup/acl.sh@54 -- # run_test denied denied 00:02:45.847 11:26:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:02:45.847 11:26:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:02:45.847 11:26:15 -- common/autotest_common.sh@10 -- # set +x 00:02:45.847 ************************************ 00:02:45.847 START TEST denied 00:02:45.847 ************************************ 00:02:45.847 11:26:15 -- common/autotest_common.sh@1104 -- # denied 00:02:45.847 11:26:15 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:02:45.847 11:26:15 -- setup/acl.sh@38 -- # setup output config 00:02:45.847 11:26:15 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:02:45.847 11:26:15 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:45.847 11:26:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:50.038 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:02:50.038 11:26:18 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:02:50.038 11:26:18 -- setup/acl.sh@28 -- # local dev driver 00:02:50.038 11:26:18 -- setup/acl.sh@30 -- # for dev in "$@" 00:02:50.038 11:26:18 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:02:50.038 11:26:18 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:02:50.038 11:26:18 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:02:50.038 11:26:18 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:02:50.038 11:26:18 -- setup/acl.sh@41 -- # setup reset 00:02:50.038 11:26:18 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:50.038 11:26:18 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:54.226 
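The denied test traced above drives scripts/setup.sh with PCI_BLOCKED set to the NVMe controller's BDF and greps for the skip message. A condensed stand-alone sketch of the same check; the BDF is copied from the log, while invoking setup.sh directly (rather than through setup/common.sh's "setup output" wrapper seen in the trace) is a simplification:

set -euo pipefail

rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
bdf=0000:d8:00.0

# Block the controller, regenerate the config, and require the skip line.
PCI_BLOCKED=" $bdf" "$rootdir/scripts/setup.sh" config \
    | grep "Skipping denied controller at $bdf"

# Return devices to their kernel drivers, as the trace does afterwards.
"$rootdir/scripts/setup.sh" reset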
00:02:54.226 real 0m7.922s 00:02:54.226 user 0m2.339s 00:02:54.226 sys 0m4.873s 00:02:54.226 11:26:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:02:54.226 11:26:22 -- common/autotest_common.sh@10 -- # set +x 00:02:54.226 ************************************ 00:02:54.226 END TEST denied 00:02:54.226 ************************************ 00:02:54.226 11:26:23 -- setup/acl.sh@55 -- # run_test allowed allowed 00:02:54.226 11:26:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:02:54.226 11:26:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:02:54.226 11:26:23 -- common/autotest_common.sh@10 -- # set +x 00:02:54.226 ************************************ 00:02:54.226 START TEST allowed 00:02:54.226 ************************************ 00:02:54.226 11:26:23 -- common/autotest_common.sh@1104 -- # allowed 00:02:54.226 11:26:23 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:02:54.226 11:26:23 -- setup/acl.sh@45 -- # setup output config 00:02:54.226 11:26:23 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:02:54.226 11:26:23 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:54.226 11:26:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:59.505 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:02:59.505 11:26:27 -- setup/acl.sh@47 -- # verify 00:02:59.505 11:26:27 -- setup/acl.sh@28 -- # local dev driver 00:02:59.505 11:26:27 -- setup/acl.sh@48 -- # setup reset 00:02:59.505 11:26:27 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:59.505 11:26:27 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:02.797 00:03:02.797 real 0m8.643s 00:03:02.797 user 0m2.408s 00:03:02.797 sys 0m4.834s 00:03:02.797 11:26:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:02.797 11:26:31 -- common/autotest_common.sh@10 -- # set +x 00:03:02.797 ************************************ 00:03:02.797 END TEST allowed 00:03:02.797 ************************************ 00:03:02.797 00:03:02.797 real 0m23.527s 00:03:02.797 user 0m7.123s 00:03:02.797 sys 0m14.556s 00:03:02.797 11:26:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:02.797 11:26:31 -- common/autotest_common.sh@10 -- # set +x 00:03:02.797 ************************************ 00:03:02.797 END TEST acl 00:03:02.797 ************************************ 00:03:02.797 11:26:31 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:02.797 11:26:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:02.797 11:26:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:02.797 11:26:31 -- common/autotest_common.sh@10 -- # set +x 00:03:02.797 ************************************ 00:03:02.797 START TEST hugepages 00:03:02.797 ************************************ 00:03:02.797 11:26:31 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:02.797 * Looking for test storage... 
00:03:02.797 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:03:02.797 11:26:31 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:03:02.797 11:26:31 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:03:02.797 11:26:31 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:03:02.797 11:26:31 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:03:02.797 11:26:31 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:03:02.797 11:26:31 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:03:02.797 11:26:31 -- setup/common.sh@17 -- # local get=Hugepagesize
00:03:02.797 11:26:31 -- setup/common.sh@18 -- # local node=
00:03:02.797 11:26:31 -- setup/common.sh@19 -- # local var val
00:03:02.797 11:26:31 -- setup/common.sh@20 -- # local mem_f mem
00:03:02.797 11:26:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:02.797 11:26:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:02.797 11:26:31 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:02.797 11:26:31 -- setup/common.sh@28 -- # mapfile -t mem
00:03:02.798 11:26:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:02.798 11:26:31 -- setup/common.sh@31 -- # IFS=': '
00:03:02.798 11:26:31 -- setup/common.sh@31 -- # read -r var val _
00:03:02.798 11:26:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 35498412 kB' 'MemAvailable: 37430904 kB' 'Buffers: 2704 kB' 'Cached: 15987544 kB' 'SwapCached: 28 kB' 'Active: 15103092 kB' 'Inactive: 1501480 kB' 'Active(anon): 14572276 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 617640 kB' 'Mapped: 214716 kB' 'Shmem: 13990164 kB' 'KReclaimable: 580104 kB' 'Slab: 1228260 kB' 'SReclaimable: 580104 kB' 'SUnreclaim: 648156 kB' 'KernelStack: 21984 kB' 'PageTables: 9080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439068 kB' 'Committed_AS: 16021820 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216260 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:02.798 11:26:31 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] … [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] (the scan continues past every non-matching /proc/meminfo key)
00:03:02.799 11:26:31 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:02.799 11:26:31 -- setup/common.sh@33 -- # echo 2048
00:03:02.799 11:26:31 -- setup/common.sh@33 -- # return 0
00:03:02.799 11:26:31 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:03:02.799 11:26:31 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:03:02.799 11:26:31 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:03:02.799 11:26:31 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:03:02.799 11:26:31 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:03:02.799 11:26:31 --
setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:02.799 11:26:31 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:02.799 11:26:31 -- setup/hugepages.sh@207 -- # get_nodes 00:03:02.799 11:26:31 -- setup/hugepages.sh@27 -- # local node 00:03:02.799 11:26:31 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:02.799 11:26:31 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:02.799 11:26:31 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:02.799 11:26:31 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:02.799 11:26:31 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:02.799 11:26:31 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:02.799 11:26:31 -- setup/hugepages.sh@208 -- # clear_hp 00:03:02.799 11:26:31 -- setup/hugepages.sh@37 -- # local node hp 00:03:02.799 11:26:31 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:02.799 11:26:31 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:02.799 11:26:31 -- setup/hugepages.sh@41 -- # echo 0 00:03:02.799 11:26:31 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:02.799 11:26:31 -- setup/hugepages.sh@41 -- # echo 0 00:03:02.799 11:26:31 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:02.799 11:26:31 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:02.799 11:26:31 -- setup/hugepages.sh@41 -- # echo 0 00:03:02.799 11:26:31 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:02.799 11:26:31 -- setup/hugepages.sh@41 -- # echo 0 00:03:02.799 11:26:31 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:02.799 11:26:31 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:02.799 11:26:31 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:02.799 11:26:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:02.799 11:26:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:02.799 11:26:31 -- common/autotest_common.sh@10 -- # set +x 00:03:02.799 ************************************ 00:03:02.799 START TEST default_setup 00:03:02.799 ************************************ 00:03:02.799 11:26:31 -- common/autotest_common.sh@1104 -- # default_setup 00:03:02.799 11:26:31 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:02.799 11:26:31 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:02.799 11:26:31 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:02.799 11:26:31 -- setup/hugepages.sh@51 -- # shift 00:03:02.799 11:26:31 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:02.799 11:26:31 -- setup/hugepages.sh@52 -- # local node_ids 00:03:02.799 11:26:31 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:02.799 11:26:31 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:02.799 11:26:31 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:02.799 11:26:31 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:02.799 11:26:31 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:02.799 11:26:31 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:02.799 11:26:31 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:02.799 11:26:31 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:02.799 11:26:31 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:02.799 11:26:31 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 
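The default_setup trace above first zeroes every per-node hugepage knob (clear_hp echoes 0 into each sysfs nr_hugepages file), then converts the requested size into a page count: 2097152 kB divided by the 2048 kB Hugepagesize read earlier gives nr_hugepages=1024, all of which lands on the one user-supplied node, as the nodes_test[_no_nodes]=1024 assignment in the next trace lines shows. A standalone sketch of that arithmetic, with the privileged sysfs write replaced by an echo:

```bash
#!/usr/bin/env bash
# Sketch of the hugepage budget math seen in the trace:
# size (kB) / Hugepagesize (kB) -> page count, pinned to requested nodes.

size_kb=2097152
hugepagesize_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)
nr_hugepages=$(( size_kb / hugepagesize_kb ))   # 2097152 / 2048 = 1024

user_nodes=(0)
nodes_test=()
for node in "${user_nodes[@]}"; do
    # Matching the trace: the whole budget goes to the single node.
    nodes_test[node]=$nr_hugepages
done

for node in "${!nodes_test[@]}"; do
    # The real script would write this value to
    # /sys/devices/system/node/node$node/hugepages/hugepages-${hugepagesize_kb}kB/nr_hugepages
    echo "node$node -> ${nodes_test[node]} pages of ${hugepagesize_kb} kB"
done
```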
00:03:02.799 11:26:31 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:02.799 11:26:31 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:02.799 11:26:31 -- setup/hugepages.sh@73 -- # return 0 00:03:02.799 11:26:31 -- setup/hugepages.sh@137 -- # setup output 00:03:02.799 11:26:31 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:02.799 11:26:31 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:06.086 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:06.086 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:07.466 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:07.466 11:26:36 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:07.466 11:26:36 -- setup/hugepages.sh@89 -- # local node 00:03:07.466 11:26:36 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:07.466 11:26:36 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:07.466 11:26:36 -- setup/hugepages.sh@92 -- # local surp 00:03:07.466 11:26:36 -- setup/hugepages.sh@93 -- # local resv 00:03:07.466 11:26:36 -- setup/hugepages.sh@94 -- # local anon 00:03:07.466 11:26:36 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:07.466 11:26:36 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:07.466 11:26:36 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:07.466 11:26:36 -- setup/common.sh@18 -- # local node= 00:03:07.466 11:26:36 -- setup/common.sh@19 -- # local var val 00:03:07.466 11:26:36 -- setup/common.sh@20 -- # local mem_f mem 00:03:07.466 11:26:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:07.466 11:26:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:07.466 11:26:36 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:07.466 11:26:36 -- setup/common.sh@28 -- # mapfile -t mem 00:03:07.466 11:26:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:07.466 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.466 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.466 11:26:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37734764 kB' 'MemAvailable: 39667232 kB' 'Buffers: 2704 kB' 'Cached: 15987664 kB' 'SwapCached: 28 kB' 'Active: 15118004 kB' 'Inactive: 1501480 kB' 'Active(anon): 14587188 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632044 kB' 'Mapped: 215016 kB' 'Shmem: 13990284 kB' 'KReclaimable: 580080 kB' 'Slab: 1226444 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646364 kB' 
'KernelStack: 21936 kB' 'PageTables: 8756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16036916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216436 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:07.466 11:26:36 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] … [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] (the scan continues past every non-matching key)
00:03:07.467 11:26:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:07.467 11:26:36 -- setup/common.sh@33 -- # echo 0
00:03:07.467 11:26:36 -- setup/common.sh@33 -- # return 0
00:03:07.467 11:26:36 -- setup/hugepages.sh@97 -- # anon=0
00:03:07.467 11:26:36 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:07.467 11:26:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:07.467 11:26:36 -- setup/common.sh@18 -- # local node=
00:03:07.467 11:26:36 -- setup/common.sh@19 -- # local var val
00:03:07.467 11:26:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:07.467 11:26:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:07.467 11:26:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:07.467 11:26:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:07.467 11:26:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:07.467 11:26:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:07.467 11:26:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37753728 kB' 'MemAvailable: 39686196 kB' 'Buffers: 2704 kB' 'Cached: 15987668 kB' 'SwapCached: 28 kB' 'Active: 15118996 kB' 'Inactive: 1501480 kB' 'Active(anon): 14588180 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633172 kB' 'Mapped: 215016 kB' 'Shmem: 13990288 kB' 'KReclaimable: 580080 kB' 'Slab: 1226472 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646392 kB' 'KernelStack: 22016 kB' 'PageTables: 9284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16036928 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216564 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
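Every get_meminfo call expands into the long key scans filling this log: /proc/meminfo is slurped with mapfile, an optional "Node <n> " prefix is stripped so the same code can parse per-node meminfo files, and each "key: value" pair is compared against the requested field until one matches. A minimal re-creation of that loop, reading only the global /proc/meminfo:

```bash
#!/usr/bin/env bash
# Minimal re-creation of the get_meminfo pattern traced above.
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 line var val _
    local -a mem

    mapfile -t mem < /proc/meminfo
    mem=("${mem[@]#Node +([0-9]) }")   # no-op on the global file

    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # the "continue" lines in the log
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo Hugepagesize     # 2048 on this runner
get_meminfo HugePages_Total
```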
00:03:07.467 11:26:36 -- setup/common.sh@31 -- # IFS=': '
00:03:07.467 11:26:36 -- setup/common.sh@31 -- # read -r var val _
00:03:07.467 11:26:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] … [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] (the scan continues past every non-matching key)
00:03:07.468 11:26:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:07.468 11:26:36 -- setup/common.sh@33 -- # echo 0
00:03:07.468 11:26:36 -- setup/common.sh@33 -- # return 0
00:03:07.468 11:26:36 -- setup/hugepages.sh@99 -- # surp=0
00:03:07.468 11:26:36 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:07.468 11:26:36 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:07.468 11:26:36 -- setup/common.sh@18 -- # local node=
00:03:07.468 11:26:36 -- setup/common.sh@19 -- # local var val
00:03:07.468 11:26:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:07.468 11:26:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:07.468 11:26:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:07.468 11:26:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:07.468 11:26:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:07.469 11:26:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:07.469 11:26:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37753540 kB' 'MemAvailable: 39686008 kB' 'Buffers: 2704 kB' 'Cached: 15987680 kB' 'SwapCached: 28 kB' 'Active: 15118208 kB' 'Inactive: 1501480 kB' 'Active(anon): 14587392 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632752 kB' 'Mapped: 214940 kB' 'Shmem: 13990300 kB' 'KReclaimable: 580080 kB' 'Slab: 1226480 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646400 kB' 'KernelStack: 22272 kB' 'PageTables: 9644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16036944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216628 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': '
00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _
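Besides /proc/meminfo, the kernel exposes the same hugepage counters per NUMA node under sysfs; that is what the nodes_sys bookkeeping at the start of the suite reads and what clear_hp resets to 0. A short sketch that collects the per-node 2048 kB page counts from the standard kernel paths:

```bash
#!/usr/bin/env bash
# Collect per-node 2 MiB hugepage counts from sysfs, mirroring the
# nodes_sys[] bookkeeping in the trace (standard kernel sysfs layout).

declare -a nodes_sys
for node in /sys/devices/system/node/node[0-9]*; do
    hp=$node/hugepages/hugepages-2048kB/nr_hugepages
    [[ -r $hp ]] || continue
    nodes_sys[${node##*node}]=$(<"$hp")
done

for i in "${!nodes_sys[@]}"; do
    echo "node$i: ${nodes_sys[$i]} x 2048 kB hugepages"
done
```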
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 
00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- 
setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.469 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 
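The odd-looking right-hand sides (\H\u\g\e\P\a\g\e\s\_\R\s\v\d) are an artifact of xtrace, not of the script source: when the pattern side of == inside [[ ]] is quoted, bash matches it literally, and set -x renders that literal match by backslash-escaping every character. A two-line reproduction in any bash shell (variable values chosen to mirror this scan):

  set -x
  var=Buffers; get=HugePages_Rsvd
  [[ $var == "$get" ]]   # traced as: [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
  set +x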
00:03:07.469 11:26:36 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.469 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.470 11:26:36 -- setup/common.sh@33 -- # echo 0 00:03:07.470 11:26:36 -- setup/common.sh@33 -- # return 0 00:03:07.470 11:26:36 -- setup/hugepages.sh@100 -- # resv=0 00:03:07.470 11:26:36 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:07.470 nr_hugepages=1024 00:03:07.470 11:26:36 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:07.470 resv_hugepages=0 00:03:07.470 11:26:36 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:07.470 surplus_hugepages=0 00:03:07.470 11:26:36 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:07.470 anon_hugepages=0 00:03:07.470 11:26:36 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:07.470 11:26:36 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:07.470 11:26:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:07.470 11:26:36 -- setup/common.sh@17 -- # 
local get=HugePages_Total 00:03:07.470 11:26:36 -- setup/common.sh@18 -- # local node= 00:03:07.470 11:26:36 -- setup/common.sh@19 -- # local var val 00:03:07.470 11:26:36 -- setup/common.sh@20 -- # local mem_f mem 00:03:07.470 11:26:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:07.470 11:26:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:07.470 11:26:36 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:07.470 11:26:36 -- setup/common.sh@28 -- # mapfile -t mem 00:03:07.470 11:26:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37754960 kB' 'MemAvailable: 39687428 kB' 'Buffers: 2704 kB' 'Cached: 15987692 kB' 'SwapCached: 28 kB' 'Active: 15117260 kB' 'Inactive: 1501480 kB' 'Active(anon): 14586444 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631572 kB' 'Mapped: 214940 kB' 'Shmem: 13990312 kB' 'KReclaimable: 580080 kB' 'Slab: 1226968 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646888 kB' 'KernelStack: 22224 kB' 'PageTables: 9428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16036780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216644 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.470 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.470 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 
11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 
11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.471 11:26:36 -- setup/common.sh@33 -- # echo 1024 00:03:07.471 11:26:36 -- setup/common.sh@33 -- # return 0 00:03:07.471 11:26:36 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:07.471 11:26:36 -- setup/hugepages.sh@112 -- # get_nodes 00:03:07.471 11:26:36 -- setup/hugepages.sh@27 -- # local node 00:03:07.471 11:26:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:07.471 11:26:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:07.471 11:26:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:07.471 11:26:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:07.471 11:26:36 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:07.471 11:26:36 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:07.471 11:26:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:07.471 11:26:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:07.471 11:26:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:07.471 11:26:36 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:07.471 11:26:36 -- setup/common.sh@18 -- # local node=0 00:03:07.471 11:26:36 -- setup/common.sh@19 -- # local var val 00:03:07.471 11:26:36 -- setup/common.sh@20 -- # local mem_f mem 00:03:07.471 11:26:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:07.471 11:26:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:07.471 11:26:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:07.471 11:26:36 -- setup/common.sh@28 -- # mapfile -t mem 00:03:07.471 11:26:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 22852700 
kB' 'MemUsed: 9739384 kB' 'SwapCached: 16 kB' 'Active: 5870172 kB' 'Inactive: 349704 kB' 'Active(anon): 5615084 kB' 'Inactive(anon): 16 kB' 'Active(file): 255088 kB' 'Inactive(file): 349688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5833368 kB' 'Mapped: 102324 kB' 'AnonPages: 389596 kB' 'Shmem: 5228576 kB' 'KernelStack: 12280 kB' 'PageTables: 5444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347144 kB' 'Slab: 634532 kB' 'SReclaimable: 347144 kB' 'SUnreclaim: 287388 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.471 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.471 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 
-- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 
11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # 
[[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # continue 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:07.472 11:26:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:07.472 11:26:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.472 11:26:36 -- setup/common.sh@33 -- # echo 0 00:03:07.472 11:26:36 -- setup/common.sh@33 -- # return 0 00:03:07.472 11:26:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:07.472 11:26:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:07.472 11:26:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:07.472 11:26:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:07.472 11:26:36 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:07.472 node0=1024 expecting 1024 00:03:07.472 11:26:36 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:07.472 00:03:07.472 real 0m4.849s 00:03:07.472 user 0m1.176s 00:03:07.472 sys 0m2.216s 00:03:07.472 11:26:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:07.472 11:26:36 -- common/autotest_common.sh@10 -- # set +x 00:03:07.472 ************************************ 00:03:07.472 END TEST default_setup 00:03:07.472 ************************************ 00:03:07.472 11:26:36 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:07.472 11:26:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:07.472 11:26:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:07.472 11:26:36 -- common/autotest_common.sh@10 -- # set +x 00:03:07.472 ************************************ 00:03:07.472 START TEST per_node_1G_alloc 00:03:07.472 ************************************ 00:03:07.472 11:26:36 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc 00:03:07.472 11:26:36 -- setup/hugepages.sh@143 -- # local IFS=, 00:03:07.472 11:26:36 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:07.472 11:26:36 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:07.472 11:26:36 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:07.472 11:26:36 -- setup/hugepages.sh@51 -- # shift 00:03:07.472 11:26:36 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:07.472 11:26:36 -- setup/hugepages.sh@52 -- # local node_ids 00:03:07.473 11:26:36 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:07.473 11:26:36 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:07.473 11:26:36 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:07.473 11:26:36 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:07.473 11:26:36 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:07.473 11:26:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:07.473 11:26:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:07.473 11:26:36 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:07.473 11:26:36 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:07.473 11:26:36 -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:07.473 11:26:36 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:07.473 11:26:36 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:07.473 11:26:36 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:07.473 11:26:36 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:07.473 11:26:36 -- setup/hugepages.sh@73 -- # return 0 00:03:07.473 11:26:36 -- setup/hugepages.sh@146 -- # 
NRHUGE=512 00:03:07.473 11:26:36 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:07.473 11:26:36 -- setup/hugepages.sh@146 -- # setup output 00:03:07.473 11:26:36 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:07.473 11:26:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:10.763 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:10.763 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:11.047 11:26:40 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:11.047 11:26:40 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:11.047 11:26:40 -- setup/hugepages.sh@89 -- # local node 00:03:11.047 11:26:40 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:11.047 11:26:40 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:11.047 11:26:40 -- setup/hugepages.sh@92 -- # local surp 00:03:11.047 11:26:40 -- setup/hugepages.sh@93 -- # local resv 00:03:11.047 11:26:40 -- setup/hugepages.sh@94 -- # local anon 00:03:11.047 11:26:40 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:11.047 11:26:40 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:11.047 11:26:40 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:11.047 11:26:40 -- setup/common.sh@18 -- # local node= 00:03:11.047 11:26:40 -- setup/common.sh@19 -- # local var val 00:03:11.047 11:26:40 -- setup/common.sh@20 -- # local mem_f mem 00:03:11.047 11:26:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:11.047 11:26:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:11.047 11:26:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:11.047 11:26:40 -- setup/common.sh@28 -- # mapfile -t mem 00:03:11.047 11:26:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37807860 kB' 'MemAvailable: 39740328 kB' 'Buffers: 2704 kB' 'Cached: 15987784 kB' 'SwapCached: 28 kB' 'Active: 15116896 kB' 'Inactive: 1501480 kB' 'Active(anon): 14586080 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 
'AnonPages: 631148 kB' 'Mapped: 213748 kB' 'Shmem: 13990404 kB' 'KReclaimable: 580080 kB' 'Slab: 1226752 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646672 kB' 'KernelStack: 21920 kB' 'PageTables: 8780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16026044 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216452 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.047 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.047 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 
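The scan in progress here serves verify_nr_hugepages: at hugepages.sh@96 the trace shows transparent hugepages reported as "always [madvise] never", so the anon-accounting branch runs and AnonHugePages is read (it comes back 0 kB in the printf above). A sketch of that gate, assuming the standard sysfs path for the THP setting:

  # Only count THP-backed anonymous memory when THP is not pinned to [never].
  thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # assumed path; trace shows "always [madvise] never"
  if [[ $thp != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
  else
    anon=0
  fi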
00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 
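The target this verification pass checks against was computed when per_node_1G_alloc started (hugepages.sh@49-73 above): a 1 GiB request spread over user nodes 0 and 1 with the default 2048 kB hugepage size works out to 512 pages per node, matching NRHUGE=512 and HUGENODE=0,1 in the trace. The arithmetic, spelled out:

  size_kb=1048576                         # get_test_nr_hugepages 1048576 0 1 (1 GiB)
  hugepagesize_kb=2048                    # Hugepagesize from meminfo
  echo $(( size_kb / hugepagesize_kb ))   # 512 -> nodes_test[0]=512, nodes_test[1]=512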
00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.048 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.048 11:26:40 -- setup/common.sh@32 -- # [[ VmallocTotal == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]]
[xtrace condensed: setup/common.sh@31-32 keeps reading snapshot lines with IFS=': ' read -r var val _, comparing each remaining key (VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted) against AnonHugePages and issuing continue, until the AnonHugePages line matches]
00:03:11.049 11:26:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:11.049 11:26:40 -- setup/common.sh@33 -- # echo 0
00:03:11.049 11:26:40 -- setup/common.sh@33 -- # return 0
00:03:11.049 11:26:40 -- setup/hugepages.sh@97 -- # anon=0
00:03:11.049 11:26:40 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:11.049 11:26:40 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:11.049 11:26:40 -- setup/common.sh@18 -- # local node=
00:03:11.049 11:26:40 -- setup/common.sh@19 -- # local var val
00:03:11.049 11:26:40 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.049 11:26:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.049 11:26:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:11.049 11:26:40 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:11.049 11:26:40 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.049 11:26:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.049 11:26:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37807568 kB' 'MemAvailable: 39740036 kB' 'Buffers: 2704 kB' 'Cached: 15987784 kB' 'SwapCached: 28 kB' 'Active: 15116596 kB' 'Inactive: 1501480 kB' 'Active(anon): 14585780 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630912 kB' 'Mapped: 213744 kB' 'Shmem: 13990404 kB' 'KReclaimable: 580080 kB' 'Slab: 1226736 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646656 kB' 'KernelStack: 21920 kB' 'PageTables: 8792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16026056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
[xtrace condensed: setup/common.sh@31-32 compares every snapshot key from MemTotal through Hugepagesize against HugePages_Surp, continue on each mismatch, until HugePages_Surp matches]
00:03:11.050 11:26:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:11.050 11:26:40 -- setup/common.sh@33 -- # echo 0
00:03:11.050 11:26:40 -- setup/common.sh@33 -- # return 0
00:03:11.050 11:26:40 -- setup/hugepages.sh@99 -- # surp=0
00:03:11.050 11:26:40 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:11.050 11:26:40 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:11.050 11:26:40 -- setup/common.sh@18 -- # local node=
00:03:11.050 11:26:40 -- setup/common.sh@19 -- # local var val
00:03:11.050 11:26:40 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.050 11:26:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.050 11:26:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:11.050 11:26:40 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:11.050 11:26:40 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.050 11:26:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.050 11:26:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37808072 kB' 'MemAvailable: 39740540 kB' 'Buffers: 2704 kB' 'Cached: 15987784 kB' 'SwapCached: 28 kB' 'Active: 15116636 kB' 'Inactive: 1501480 kB' 'Active(anon): 14585820 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630944 kB' 'Mapped: 213744 kB' 'Shmem: 13990404 kB' 'KReclaimable: 580080 kB' 'Slab: 1226736 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646656 kB' 'KernelStack: 21936 kB' 'PageTables: 8840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16026068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
[xtrace condensed: same per-key scan, this time against HugePages_Rsvd, continue on each mismatch until HugePages_Rsvd matches]
00:03:11.051 11:26:40 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:11.051 11:26:40 -- setup/common.sh@33 -- # echo 0
00:03:11.051 11:26:40 -- setup/common.sh@33 -- # return 0
00:03:11.051 11:26:40 -- setup/hugepages.sh@100 -- # resv=0
00:03:11.051 11:26:40 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:11.051 nr_hugepages=1024
00:03:11.051 11:26:40 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:11.051 resv_hugepages=0
00:03:11.051 11:26:40 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:11.051 surplus_hugepages=0
00:03:11.051 11:26:40 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:11.051 anon_hugepages=0
00:03:11.051 11:26:40 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:11.051 11:26:40 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
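For readers following the trace: the get_meminfo calls above are all the same pattern, a field scanner over a meminfo file. A minimal sketch of that pattern, reconstructed only from what the xtrace shows (this is not the actual setup/common.sh, and the node fallback is simplified):

#!/usr/bin/env bash
shopt -s extglob
# Sketch of the get_meminfo pattern in the trace: pick a per-NUMA-node
# meminfo file when a node id is given, strip the "Node <n> " prefix those
# files add, then scan key/value pairs with IFS=': ' until the requested
# key matches and print its numeric value.
get_meminfo() {
	local get=$1 node=$2
	local var val _
	local mem_f=/proc/meminfo
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	local -a mem
	mapfile -t mem < "$mem_f"
	mem=("${mem[@]#Node +([0-9]) }")   # per-node lines carry a "Node N " prefix
	local line
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<< "$line"
		[[ $var == "$get" ]] || continue   # each mismatch is one "continue" in the trace
		echo "$val"
		return 0
	done
	return 1
}

get_meminfo HugePages_Surp      # system-wide, 0 in the log above
get_meminfo HugePages_Total 0   # per-node value, 512 on this box

Because the test harness runs under set -x, every one of those per-key comparisons is echoed, which is why the raw log repeats the [[ ... ]] / continue pair once per /proc/meminfo field.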
00:03:11.051 11:26:40 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:11.051 11:26:40 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:11.051 11:26:40 -- setup/common.sh@18 -- # local node=
00:03:11.051 11:26:40 -- setup/common.sh@19 -- # local var val
00:03:11.051 11:26:40 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.051 11:26:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.051 11:26:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:11.051 11:26:40 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:11.051 11:26:40 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.051 11:26:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.051 11:26:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37808752 kB' 'MemAvailable: 39741220 kB' 'Buffers: 2704 kB' 'Cached: 15987820 kB' 'SwapCached: 28 kB' 'Active: 15116604 kB' 'Inactive: 1501480 kB' 'Active(anon): 14585788 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630848 kB' 'Mapped: 213744 kB' 'Shmem: 13990440 kB' 'KReclaimable: 580080 kB' 'Slab: 1226736 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646656 kB' 'KernelStack: 21904 kB' 'PageTables: 8740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16026084 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
[xtrace condensed: per-key scan of the snapshot against HugePages_Total, continue on each mismatch until HugePages_Total matches]
00:03:11.052 11:26:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:11.052 11:26:40 -- setup/common.sh@33 -- # echo 1024
00:03:11.052 11:26:40 -- setup/common.sh@33 -- # return 0
00:03:11.052 11:26:40 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:11.052 11:26:40 -- setup/hugepages.sh@112 -- # get_nodes
00:03:11.052 11:26:40 -- setup/hugepages.sh@27 -- # local node
00:03:11.052 11:26:40 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:11.052 11:26:40 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:11.052 11:26:40 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:11.052 11:26:40 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:11.052 11:26:40 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:11.052 11:26:40 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
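The get_nodes step just above enumerates NUMA node directories with an extglob pattern and records a per-node hugepage count (512 per node here, for the 1024-page pool split across two nodes). Where that count is read from is not visible in this trace; a plausible stand-in is the per-node sysfs counter, as in this hypothetical sketch (not the SPDK function, and the nr_hugepages path is an assumption):

#!/usr/bin/env bash
shopt -s extglob nullglob
declare -A nodes_sys
# Enumerate node directories the same way the trace shows, keyed by node id.
for node in /sys/devices/system/node/node+([0-9]); do
	id=${node##*node}   # "/sys/devices/system/node/node1" -> "1"
	# Assumed source of the count: the 2 MB hugepage pool size for this node.
	nodes_sys[$id]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
no_nodes=${#nodes_sys[@]}
(( no_nodes > 0 )) || { echo 'no NUMA nodes found' >&2; exit 1; }
for id in "${!nodes_sys[@]}"; do
	echo "node$id: ${nodes_sys[$id]} hugepages"
done

With the counts in hand, the harness then loops over the nodes and queries each one's HugePages_Surp through the same get_meminfo scanner, which is what the next two condensed blocks show.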
00:03:11.052 11:26:40 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:11.052 11:26:40 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:11.052 11:26:40 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:11.052 11:26:40 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:11.052 11:26:40 -- setup/common.sh@18 -- # local node=0
00:03:11.052 11:26:40 -- setup/common.sh@19 -- # local var val
00:03:11.052 11:26:40 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.052 11:26:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.052 11:26:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:11.052 11:26:40 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:11.052 11:26:40 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.052 11:26:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.052 11:26:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23936488 kB' 'MemUsed: 8655596 kB' 'SwapCached: 16 kB' 'Active: 5868960 kB' 'Inactive: 349704 kB' 'Active(anon): 5613872 kB' 'Inactive(anon): 16 kB' 'Active(file): 255088 kB' 'Inactive(file): 349688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5833396 kB' 'Mapped: 102036 kB' 'AnonPages: 388412 kB' 'Shmem: 5228604 kB' 'KernelStack: 11896 kB' 'PageTables: 4708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347144 kB' 'Slab: 634308 kB' 'SReclaimable: 347144 kB' 'SUnreclaim: 287164 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace condensed: per-key scan of the node0 snapshot against HugePages_Surp, continue on each mismatch until HugePages_Surp matches]
00:03:11.052 11:26:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:11.052 11:26:40 -- setup/common.sh@33 -- # echo 0
00:03:11.052 11:26:40 -- setup/common.sh@33 -- # return 0
00:03:11.053 11:26:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:11.053 11:26:40 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:11.053 11:26:40 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:11.053 11:26:40 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:11.053 11:26:40 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:11.053 11:26:40 -- setup/common.sh@18 -- # local node=1
00:03:11.053 11:26:40 -- setup/common.sh@19 -- # local var val
00:03:11.053 11:26:40 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.053 11:26:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.053 11:26:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:11.053 11:26:40 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:11.053 11:26:40 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.053 11:26:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.053 11:26:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 13872904 kB' 'MemUsed: 13830244 kB' 'SwapCached: 12 kB' 'Active: 9247992 kB' 'Inactive: 1151776 kB' 'Active(anon): 8972264 kB' 'Inactive(anon): 32196 kB' 'Active(file): 275728 kB' 'Inactive(file): 1119580 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10157172 kB' 'Mapped: 111708 kB' 'AnonPages: 242832 kB' 'Shmem: 8761852 kB' 'KernelStack: 10024 kB' 'PageTables: 4084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 232936 kB' 'Slab: 592428 kB' 'SReclaimable: 232936 kB' 'SUnreclaim: 359492 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace condensed: per-key scan of the node1 snapshot against HugePages_Surp begins; this excerpt of the log ends mid-scan]
-- setup/common.sh@31 -- # read -r var val _ 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # continue 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.053 11:26:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.053 11:26:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.053 11:26:40 -- setup/common.sh@33 -- # echo 0 00:03:11.053 11:26:40 -- setup/common.sh@33 -- # return 0 00:03:11.053 11:26:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:11.053 11:26:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:11.053 11:26:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:11.053 11:26:40 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:11.053 11:26:40 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:11.053 node0=512 expecting 512 00:03:11.053 11:26:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:11.053 11:26:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:11.053 11:26:40 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:11.053 11:26:40 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:11.053 node1=512 expecting 512 00:03:11.053 11:26:40 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:11.053 00:03:11.053 real 0m3.609s 00:03:11.053 user 0m1.368s 00:03:11.053 sys 0m2.311s 00:03:11.053 11:26:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:11.053 11:26:40 -- common/autotest_common.sh@10 -- # set +x 00:03:11.053 ************************************ 00:03:11.053 END TEST per_node_1G_alloc 00:03:11.053 ************************************ 00:03:11.312 11:26:40 -- 
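For reference: the get_meminfo helper traced above resolves which meminfo file to read (/proc/meminfo, or /sys/devices/system/node/node<N>/meminfo when a node argument is given), strips the "Node <N> " prefix, and walks the fields until the requested key matches. A minimal bash sketch of the same lookup, assuming only the standard meminfo layout (an illustration, not the shipped setup/common.sh):

get_meminfo_sketch() {
    # usage: get_meminfo_sketch <Field> [<numa-node>]  -> prints the field's value
    local get=$1 node=$2 mem_f=/proc/meminfo var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # per-node files prefix each line with "Node <N> "; drop it, then split
    # "Field:   <value> kB" on ':' plus whitespace, as the xtrace above does
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
}
# e.g. get_meminfo_sketch HugePages_Surp 1 printed 0 in the run above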
00:03:11.312 11:26:40 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:11.312 11:26:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:11.312 11:26:40 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:11.312 11:26:40 -- common/autotest_common.sh@10 -- # set +x
00:03:11.312 ************************************
00:03:11.312 START TEST even_2G_alloc
00:03:11.312 ************************************
00:03:11.312 11:26:40 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:03:11.312 11:26:40 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:11.312 11:26:40 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:11.312 11:26:40 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:11.312 11:26:40 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:11.312 11:26:40 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:11.312 11:26:40 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:11.312 11:26:40 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:11.312 11:26:40 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:11.312 11:26:40 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:11.312 11:26:40 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:11.312 11:26:40 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:11.312 11:26:40 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:11.312 11:26:40 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:11.312 11:26:40 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:11.312 11:26:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:11.312 11:26:40 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:11.312 11:26:40 -- setup/hugepages.sh@83 -- # : 512
00:03:11.312 11:26:40 -- setup/hugepages.sh@84 -- # : 1
00:03:11.312 11:26:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:11.312 11:26:40 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:11.312 11:26:40 -- setup/hugepages.sh@83 -- # : 0
00:03:11.312 11:26:40 -- setup/hugepages.sh@84 -- # : 0
00:03:11.312 11:26:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:11.312 11:26:40 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:03:11.312 11:26:40 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:11.312 11:26:40 -- setup/hugepages.sh@153 -- # setup output
00:03:11.313 11:26:40 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:11.313 11:26:40 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:14.678 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:14.678 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
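With HUGE_EVEN_ALLOC=yes and NRHUGE=1024, setup.sh is asked to spread the 2048 kB pages evenly across both NUMA nodes (512 each, matching the nodes_test values above); the PCI lines are setup.sh confirming that the devices are already bound to vfio-pci. A sketch of that even allocation, assuming the kernel's standard per-node sysfs hugepage interface (not the scripts' exact code):

NRHUGE=1024
nodes=(/sys/devices/system/node/node[0-9]*)
per_node=$((NRHUGE / ${#nodes[@]}))   # 1024 pages / 2 nodes = 512
for n in "${nodes[@]}"; do
    # set the per-node 2 MB hugepage pool size
    echo "$per_node" | sudo tee "$n/hugepages/hugepages-2048kB/nr_hugepages" >/dev/null
done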
00:03:14.678 11:26:43 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:14.678 11:26:43 -- setup/hugepages.sh@89 -- # local node
00:03:14.678 11:26:43 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:14.678 11:26:43 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:14.678 11:26:43 -- setup/hugepages.sh@92 -- # local surp
00:03:14.678 11:26:43 -- setup/hugepages.sh@93 -- # local resv
00:03:14.678 11:26:43 -- setup/hugepages.sh@94 -- # local anon
00:03:14.678 11:26:43 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:14.678 11:26:43 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:14.678 11:26:43 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:14.678 11:26:43 -- setup/common.sh@18 -- # local node=
00:03:14.678 11:26:43 -- setup/common.sh@19 -- # local var val
00:03:14.678 11:26:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:14.678 11:26:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:14.678 11:26:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:14.678 11:26:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:14.678 11:26:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:14.678 11:26:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:14.678 11:26:43 -- setup/common.sh@31 -- # IFS=': '
00:03:14.678 11:26:43 -- setup/common.sh@31 -- # read -r var val _
00:03:14.678 11:26:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37823376 kB' 'MemAvailable: 39755844 kB' 'Buffers: 2704 kB' 'Cached: 15987912 kB' 'SwapCached: 28 kB' 'Active: 15116260 kB' 'Inactive: 1501480 kB' 'Active(anon): 14585444 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630236 kB' 'Mapped: 213804 kB' 'Shmem: 13990532 kB' 'KReclaimable: 580080 kB' 'Slab: 1226976 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646896 kB' 'KernelStack: 21920 kB' 'PageTables: 8808 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16026696 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:14.678 11:26:43 [xtrace elided: /proc/meminfo scanned field by field; every field before AnonHugePages skipped via continue]
00:03:14.679 11:26:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:14.679 11:26:43 -- setup/common.sh@33 -- # echo 0
00:03:14.679 11:26:43 -- setup/common.sh@33 -- # return 0
00:03:14.679 11:26:43 -- setup/hugepages.sh@97 -- # anon=0
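Each of these field-by-field scans is functionally a single key lookup; the value just stored in anon could equally be read with one awk expression (shown only as an equivalent, not what the harness runs):

awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo   # prints 0 on this run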
00:03:14.679 11:26:43 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:14.679 11:26:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:14.679 11:26:43 -- setup/common.sh@18 -- # local node=
00:03:14.679 11:26:43 -- setup/common.sh@19 -- # local var val
00:03:14.679 11:26:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:14.679 11:26:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:14.679 11:26:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:14.679 11:26:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:14.679 11:26:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:14.679 11:26:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:14.679 11:26:43 -- setup/common.sh@31 -- # IFS=': '
00:03:14.679 11:26:43 -- setup/common.sh@31 -- # read -r var val _
00:03:14.679 11:26:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37828176 kB' 'MemAvailable: 39760644 kB' 'Buffers: 2704 kB' 'Cached: 15987916 kB' 'SwapCached: 28 kB' 'Active: 15115952 kB' 'Inactive: 1501480 kB' 'Active(anon): 14585136 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 629996 kB' 'Mapped: 213748 kB' 'Shmem: 13990536 kB' 'KReclaimable: 580080 kB' 'Slab: 1226984 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646904 kB' 'KernelStack: 21920 kB' 'PageTables: 8812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16026708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216388 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:14.679 11:26:43 [xtrace elided: /proc/meminfo scanned field by field; every field before HugePages_Surp skipped via continue]
00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:14.680 11:26:43 -- setup/common.sh@33 -- # echo 0
00:03:14.680 11:26:43 -- setup/common.sh@33 -- # return 0
00:03:14.680 11:26:43 -- setup/hugepages.sh@99 -- # surp=0
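verify_nr_hugepages collects three correction terms, anon (AnonHugePages), surp (HugePages_Surp), and resv (HugePages_Rsvd, read next), so it can reconcile the pool it configured with what the kernel reports. A compact restatement of the check that runs at hugepages.sh@107 below, using this run's values (a sketch, not the script itself):

nr_hugepages=1024                 # requested by get_test_nr_hugepages
surp=0; resv=0                    # from the HugePages_Surp and HugePages_Rsvd lookups
total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
# the pool is consistent only if requested pages plus surplus and reserved
# match the kernel's reported total
(( total == nr_hugepages + surp + resv )) && echo "hugepage pool consistent: $total"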
'MemFree: 37828176 kB' 'MemAvailable: 39760644 kB' 'Buffers: 2704 kB' 'Cached: 15987916 kB' 'SwapCached: 28 kB' 'Active: 15115988 kB' 'Inactive: 1501480 kB' 'Active(anon): 14585172 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 630024 kB' 'Mapped: 213748 kB' 'Shmem: 13990536 kB' 'KReclaimable: 580080 kB' 'Slab: 1226984 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646904 kB' 'KernelStack: 21936 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16026724 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216388 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 
00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.680 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.680 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- 
setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:14.681 11:26:43 -- 
setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:14.681 11:26:43 -- setup/common.sh@32 -- # continue
[trace condensed: the setup/common.sh@31 IFS=': ' / read -r var val _ and @32 continue pair repeats identically for each remaining /proc/meminfo field -- Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free -- until the requested field is reached]
00:03:14.681 11:26:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:14.681 11:26:43 -- setup/common.sh@33 -- # echo 0
00:03:14.681 11:26:43 -- setup/common.sh@33 -- # return 0
00:03:14.681 11:26:43 -- setup/hugepages.sh@100 -- # resv=0
00:03:14.681 11:26:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:14.681 nr_hugepages=1024
00:03:14.681 11:26:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:14.681 resv_hugepages=0
00:03:14.681 11:26:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:14.681 surplus_hugepages=0
00:03:14.681 11:26:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:14.681 anon_hugepages=0
00:03:14.681 11:26:43 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:14.681 11:26:43 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
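Read back from the trace, the @31-@33 events above are the whole of get_meminfo's matching logic in setup/common.sh: split each meminfo line on ': ', take the @32 continue branch until the field name equals the requested key, then echo the value and return. A minimal standalone sketch of that pattern, assuming a streaming read (the traced script buffers the file with mapfile first; any name below that does not appear in the trace is invented here):

  # Print the value of one /proc/meminfo field, mirroring the traced
  # IFS=': ' / read -r var val _ / continue-or-echo loop.
  get_meminfo_sketch() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue   # the @32 branch above
          echo "$val"                        # the @33 branch; a trailing "kB" lands in $_
          return 0
      done < /proc/meminfo
      return 1
  }

Called as get_meminfo_sketch HugePages_Rsvd on this box it would print 0, the resv value the @107 accounting check (( 1024 == nr_hugepages + surp + resv )) consumes just above.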
00:03:14.681 11:26:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:14.681 11:26:43 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:14.681 11:26:43 -- setup/common.sh@18 -- # local node=
00:03:14.681 11:26:43 -- setup/common.sh@19 -- # local var val
00:03:14.681 11:26:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:14.681 11:26:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:14.681 11:26:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:14.681 11:26:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:14.681 11:26:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:14.681 11:26:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:14.681 11:26:43 -- setup/common.sh@31 -- # IFS=': '
00:03:14.681 11:26:43 -- setup/common.sh@31 -- # read -r var val _
00:03:14.681 11:26:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37828428 kB' 'MemAvailable: 39760896 kB' 'Buffers: 2704 kB' 'Cached: 15987936 kB' 'SwapCached: 28 kB' 'Active: 15115496 kB' 'Inactive: 1501480 kB' 'Active(anon): 14584680 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 629460 kB' 'Mapped: 213748 kB' 'Shmem: 13990556 kB' 'KReclaimable: 580080 kB' 'Slab: 1226984 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646904 kB' 'KernelStack: 21904 kB' 'PageTables: 8712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16026368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216388 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
[trace condensed: the @31/@32 skip loop runs again over every field from MemTotal through Unaccepted before reaching the requested key]
00:03:14.682 11:26:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:14.682 11:26:43 -- setup/common.sh@33 -- # echo 1024
00:03:14.682 11:26:43 -- setup/common.sh@33 -- # return 0
00:03:14.682 11:26:43 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:14.682 11:26:43 -- setup/hugepages.sh@112 -- # get_nodes
00:03:14.682 11:26:43 -- setup/hugepages.sh@27 -- # local node
00:03:14.682 11:26:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:14.682 11:26:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:14.682 11:26:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:14.682 11:26:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:14.682 11:26:43 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:14.682 11:26:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
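get_nodes, traced at @27-@33 just above, derives the NUMA layout by globbing /sys/devices/system/node: one nodes_sys slot per nodeN directory, two nodes here, each already holding 512 of the 1024 pages. A sketch of that enumeration under one stated assumption: the 512 recorded at @30 plausibly comes from the node-local 2048kB nr_hugepages counter, which the trace itself does not show.

  # Enumerate nodeN directories the way the traced @29 extglob glob does,
  # recording each node's current 2 MB hugepage count by node index.
  shopt -s extglob nullglob
  declare -a nodes_sys
  for node in /sys/devices/system/node/node+([0-9]); do
      # ${node##*node} strips through the last "node", leaving the index
      nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
  done
  no_nodes=${#nodes_sys[@]}
  echo "no_nodes=$no_nodes nodes_sys=(${nodes_sys[*]})"   # no_nodes=2 nodes_sys=(512 512) here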
00:03:14.682 11:26:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:14.682 11:26:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:14.682 11:26:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:14.682 11:26:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:14.682 11:26:43 -- setup/common.sh@18 -- # local node=0
00:03:14.682 11:26:43 -- setup/common.sh@19 -- # local var val
00:03:14.682 11:26:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:14.682 11:26:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:14.682 11:26:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:14.682 11:26:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:14.682 11:26:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:14.682 11:26:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:14.682 11:26:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23951844 kB' 'MemUsed: 8640240 kB' 'SwapCached: 16 kB' 'Active: 5868960 kB' 'Inactive: 349704 kB' 'Active(anon): 5613872 kB' 'Inactive(anon): 16 kB' 'Active(file): 255088 kB' 'Inactive(file): 349688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5833456 kB' 'Mapped: 102540 kB' 'AnonPages: 388284 kB' 'Shmem: 5228664 kB' 'KernelStack: 11912 kB' 'PageTables: 4676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347144 kB' 'Slab: 634496 kB' 'SReclaimable: 347144 kB' 'SUnreclaim: 287352 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:14.682 11:26:43 -- setup/common.sh@31 -- # IFS=': '
00:03:14.682 11:26:43 -- setup/common.sh@31 -- # read -r var val _
[trace condensed: the @31/@32 skip loop runs over node0's fields from MemTotal through HugePages_Free before reaching the requested key]
00:03:14.683 11:26:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:14.683 11:26:43 -- setup/common.sh@33 -- # echo 0
00:03:14.683 11:26:43 -- setup/common.sh@33 -- # return 0
00:03:14.683 11:26:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
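The per-node pass just completed differs from the system-wide one only in its input file: @23/@24 switch mem_f to /sys/devices/system/node/node0/meminfo once it exists, and because every line there carries a "Node 0 " prefix, the @29 expansion strips it so the same matching loop applies unchanged. A sketch of just that selection step, lifted out of context (in the trace it runs inside get_meminfo):

  # Choose the meminfo source for one node and normalize its lines,
  # as the traced @22-@29 sequence does for node 0.
  shopt -s extglob
  node=0
  mem_f=/proc/meminfo
  [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
      mem_f=/sys/devices/system/node/node$node/meminfo
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 MemTotal: ..." -> "MemTotal: ..."
  printf '%s\n' "${mem[@]:0:3}"      # first few normalized lines, for inspection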
00:03:14.683 11:26:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:14.683 11:26:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:14.683 11:26:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:14.683 11:26:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:14.683 11:26:43 -- setup/common.sh@18 -- # local node=1
00:03:14.683 11:26:43 -- setup/common.sh@19 -- # local var val
00:03:14.683 11:26:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:14.683 11:26:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:14.683 11:26:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:14.683 11:26:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:14.683 11:26:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:14.683 11:26:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:14.683 11:26:43 -- setup/common.sh@31 -- # IFS=': '
00:03:14.683 11:26:43 -- setup/common.sh@31 -- # read -r var val _
00:03:14.683 11:26:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 13874052 kB' 'MemUsed: 13829096 kB' 'SwapCached: 12 kB' 'Active: 9246756 kB' 'Inactive: 1151776 kB' 'Active(anon): 8971028 kB' 'Inactive(anon): 32196 kB' 'Active(file): 275728 kB' 'Inactive(file): 1119580 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10157244 kB' 'Mapped: 111872 kB' 'AnonPages: 241440 kB' 'Shmem: 8761924 kB' 'KernelStack: 10008 kB' 'PageTables: 4156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 232936 kB' 'Slab: 592488 kB' 'SReclaimable: 232936 kB' 'SUnreclaim: 359552 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[trace condensed: the @31/@32 skip loop runs over node1's fields from MemTotal through HugePages_Free before reaching the requested key]
00:03:14.684 11:26:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:14.684 11:26:43 -- setup/common.sh@33 -- # echo 0
00:03:14.684 11:26:43 -- setup/common.sh@33 -- # return 0
00:03:14.684 11:26:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:14.684 11:26:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:14.684 11:26:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:14.684 11:26:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:14.684 11:26:43 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:14.684 node0=512 expecting 512
00:03:14.684 11:26:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:14.684 11:26:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:14.684 11:26:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:14.684 11:26:43 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:14.684 node1=512 expecting 512
00:03:14.684 11:26:43 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:14.684 
00:03:14.684 real 0m3.418s
00:03:14.684 user 0m1.193s
00:03:14.684 sys 0m2.261s
00:03:14.684 11:26:43 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:14.684 11:26:43 -- common/autotest_common.sh@10 -- # set +x
00:03:14.684 ************************************
00:03:14.684 END TEST even_2G_alloc
00:03:14.684 ************************************
00:03:14.684 11:26:43 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:14.684 11:26:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:14.684 11:26:43 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:14.684 11:26:43 -- common/autotest_common.sh@10 -- # set +x
00:03:14.684 ************************************
00:03:14.684 START TEST odd_alloc
00:03:14.684 ************************************
00:03:14.684 11:26:43 -- common/autotest_common.sh@1104 -- # odd_alloc
00:03:14.684 11:26:43 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:14.684 11:26:43 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:14.684 11:26:43 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:14.684 11:26:43 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:14.684 11:26:43 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:14.684 11:26:43 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:14.684 11:26:43 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:14.684 11:26:43 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:14.684 11:26:43 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:14.684 11:26:43 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:14.684 11:26:43 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:14.684 11:26:43 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:14.684 11:26:43 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:14.684 11:26:43 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:14.684 11:26:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:14.684 11:26:43 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:14.684 11:26:43 -- setup/hugepages.sh@83 -- # : 513
00:03:14.684 11:26:43 -- setup/hugepages.sh@84 -- # : 1
00:03:14.684 11:26:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:14.684 11:26:43 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:14.684 11:26:43 -- setup/hugepages.sh@83 -- # : 0
00:03:14.684 11:26:43 -- setup/hugepages.sh@84 -- # : 0
00:03:14.684 11:26:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
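The @81-@84 iterations above distribute the odd page count: 1025 pages over 2 nodes come out as 512 on node1 (first iteration, 1025/2) and 513 on node0 (second iteration, the 513 that remain), and the ': 513' / ': 1' no-ops trace the remaining-pages and remaining-nodes counters. A sketch of that division, with names assumed since the trace exposes only the resulting values:

  # Split N hugepages across k nodes the way the traced countdown does:
  # floor division of what remains, so earlier nodes absorb the remainder.
  _nr_hugepages=1025 _no_nodes=2
  declare -a nodes_test
  remaining=$_nr_hugepages
  for (( i = _no_nodes - 1; i >= 0; i-- )); do
      nodes_test[i]=$(( remaining / (i + 1) ))   # 1025/2 = 512, then 513/1 = 513
      (( remaining -= nodes_test[i] ))
  done
  echo "${nodes_test[@]}"   # 513 512 -- matching the 512/513 assignments above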
00:03:14.684 11:26:43 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:14.684 11:26:43 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:14.684 11:26:43 -- setup/hugepages.sh@160 -- # setup output
00:03:14.684 11:26:43 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:14.684 11:26:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:17.970 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:17.970 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:17.970 11:26:47 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:17.970 11:26:47 -- setup/hugepages.sh@89 -- # local node
00:03:17.970 11:26:47 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:17.970 11:26:47 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:17.970 11:26:47 -- setup/hugepages.sh@92 -- # local surp
00:03:17.970 11:26:47 -- setup/hugepages.sh@93 -- # local resv
00:03:17.970 11:26:47 -- setup/hugepages.sh@94 -- # local anon
00:03:17.970 11:26:47 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
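The string matched at @96 -- "always [madvise] never" -- is the kernel's transparent-hugepage mode line, with the active mode bracketed; verify_nr_hugepages only measures AnonHugePages when THP is not pinned to never. A sketch of that gate, assuming the string is read from the usual sysfs file (the trace shows only the comparison, not its source):

  # Gate on the THP setting: proceed only when the bracketed (active)
  # mode in the sysfs string is something other than "never".
  thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)
  if [[ $thp != *"[never]"* ]]; then
      echo "THP active mode allows anonymous hugepages: $thp"
  fi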
11:26:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37852240 kB' 'MemAvailable: 39784708 kB' 'Buffers: 2704 kB' 'Cached: 15988040 kB' 'SwapCached: 28 kB' 'Active: 15123580 kB' 'Inactive: 1501480 kB' 'Active(anon): 14592764 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 637648 kB' 'Mapped: 214752 kB' 'Shmem: 13990660 kB' 'KReclaimable: 580080 kB' 'Slab: 1227820 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 647740 kB' 'KernelStack: 21984 kB' 'PageTables: 9076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 16036908 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216468 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 
00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.970 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.970 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.971 11:26:47 -- setup/common.sh@32 -- # continue 00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 
00:03:17.971 11:26:47 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:17.971 11:26:47 -- setup/common.sh@33 -- # echo 0
00:03:17.971 11:26:47 -- setup/common.sh@33 -- # return 0
00:03:17.971 11:26:47 -- setup/hugepages.sh@97 -- # anon=0
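Unpacking the xtrace above: get_meminfo (setup/common.sh) snapshots the meminfo source with mapfile, then walks it entry by entry, splitting each line on ': ' and echoing the value of the first field whose name matches the requested key; here the key is AnonHugePages and setup/hugepages.sh stores the result as anon=0. Below is a minimal re-creation of just that match loop, reading /proc/meminfo directly instead of the script's mapfile snapshot; it mirrors only what the trace shows, not the full helper.

get_meminfo_sketch() {
    # Same split the trace shows at setup/common.sh@31: with IFS=': ' the
    # read yields var=<key> val=<number> _=<unit>, so $val is already unit-free.
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
}
get_meminfo_sketch AnonHugePages   # prints 0 on the box traced here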
00:03:17.971 11:26:47 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:17.971 11:26:47 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:17.971 11:26:47 -- setup/common.sh@18 -- # local node=
00:03:17.971 11:26:47 -- setup/common.sh@19 -- # local var val
00:03:17.971 11:26:47 -- setup/common.sh@20 -- # local mem_f mem
00:03:17.971 11:26:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.971 11:26:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:17.971 11:26:47 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:17.971 11:26:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.971 11:26:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.971 11:26:47 -- setup/common.sh@31 -- # IFS=': '
00:03:17.971 11:26:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37854212 kB' 'MemAvailable: 39786680 kB' 'Buffers: 2704 kB' 'Cached: 15988044 kB' 'SwapCached: 28 kB' 'Active: 15118992 kB' 'Inactive: 1501480 kB' 'Active(anon): 14588176 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633152 kB' 'Mapped: 214280 kB' 'Shmem: 13990664 kB' 'KReclaimable: 580080 kB' 'Slab: 1227816 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 647736 kB' 'KernelStack: 22048 kB' 'PageTables: 8916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 16044940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:17.971 11:26:47 -- setup/common.sh@31 -- # read -r var val _
00:03:17.972 11:26:47 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:17.972 11:26:47 -- setup/common.sh@33 -- # echo 0
00:03:17.972 11:26:47 -- setup/common.sh@33 -- # return 0
00:03:17.972 11:26:47 -- setup/hugepages.sh@99 -- # surp=0
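The surplus count comes back the same way: the scan above ends at the HugePages_Surp line of the snapshot and yields surp=0. For a quick manual cross-check outside the harness, an awk one-liner over the same file produces the same figure; this is an equivalent spot-check, not something the test itself runs.

awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo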
00:03:17.972 11:26:47 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:17.972 11:26:47 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:17.972 11:26:47 -- setup/common.sh@18 -- # local node=
00:03:17.972 11:26:47 -- setup/common.sh@19 -- # local var val
00:03:17.972 11:26:47 -- setup/common.sh@20 -- # local mem_f mem
00:03:17.972 11:26:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.972 11:26:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:17.972 11:26:47 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:17.972 11:26:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.972 11:26:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.972 11:26:47 -- setup/common.sh@31 -- # IFS=': '
00:03:17.972 11:26:47 -- setup/common.sh@31 -- # read -r var val _
00:03:17.972 11:26:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37847676 kB' 'MemAvailable: 39780144 kB' 'Buffers: 2704 kB' 'Cached: 15988060 kB' 'SwapCached: 28 kB' 'Active: 15122444 kB' 'Inactive: 1501480 kB' 'Active(anon): 14591628 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 636520 kB' 'Mapped: 214672 kB' 'Shmem: 13990680 kB' 'KReclaimable: 580080 kB' 'Slab: 1227744 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 647664 kB' 'KernelStack: 22000 kB' 'PageTables: 8836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 16037840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216456 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:18.234 11:26:47 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:18.234 11:26:47 -- setup/common.sh@33 -- # echo 0
00:03:18.234 11:26:47 -- setup/common.sh@33 -- # return 0
00:03:18.234 11:26:47 -- setup/hugepages.sh@100 -- # resv=0
00:03:18.234 11:26:47 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
nr_hugepages=1025
00:03:18.234 11:26:47 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:18.234 11:26:47 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:18.234 11:26:47 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:18.234 11:26:47 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:18.234 11:26:47 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
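This is the accounting check the test is really after: with anon, surp, and resv all zero, setup/hugepages.sh@107-110 requires the kernel's HugePages_Total (1025 on this run) to equal nr_hugepages + surp + resv exactly. A standalone sketch of that assertion, with the values hard-coded the way this run resolved them; the real script derives each one via get_meminfo.

nr_hugepages=1025 surp=0 resv=0 anon=0
total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
(( total == nr_hugepages + surp + resv )) \
    && echo "hugepage accounting consistent" \
    || echo "accounting mismatch: total=$total" >&2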
00:03:18.234 11:26:47 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:18.234 11:26:47 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:18.234 11:26:47 -- setup/common.sh@18 -- # local node=
00:03:18.234 11:26:47 -- setup/common.sh@19 -- # local var val
00:03:18.234 11:26:47 -- setup/common.sh@20 -- # local mem_f mem
00:03:18.234 11:26:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:18.234 11:26:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:18.234 11:26:47 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:18.234 11:26:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:18.234 11:26:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:18.234 11:26:47 -- setup/common.sh@31 -- # IFS=': '
00:03:18.234 11:26:47 -- setup/common.sh@31 -- # read -r var val _
00:03:18.234 11:26:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37845800 kB' 'MemAvailable: 39778268 kB' 'Buffers: 2704 kB' 'Cached: 15988064 kB' 'SwapCached: 28 kB' 'Active: 15117200 kB' 'Inactive: 1501480 kB' 'Active(anon): 14586384 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631292 kB' 'Mapped: 214168 kB' 'Shmem: 13990684 kB' 'KReclaimable: 580080 kB' 'Slab: 1227744 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 647664 kB' 'KernelStack: 22032 kB' 'PageTables: 9072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 16031736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216500 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:18.235 11:26:47 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:18.235 11:26:47 -- setup/common.sh@33 -- # echo 1025
00:03:18.235 11:26:47 -- setup/common.sh@33 -- # return 0
00:03:18.235 11:26:47 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:18.235 11:26:47 -- setup/hugepages.sh@112 -- # get_nodes
00:03:18.235 11:26:47 -- setup/hugepages.sh@27 -- # local node
00:03:18.235 11:26:47 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:18.235 11:26:47 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:18.235 11:26:47 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:18.235 11:26:47 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:03:18.235 11:26:47 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:18.235 11:26:47 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:18.235 11:26:47 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:18.235 11:26:47 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:18.235 11:26:47 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:18.235 11:26:47 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:18.235 11:26:47 -- setup/common.sh@18 -- # local node=0
00:03:18.235 11:26:47 -- setup/common.sh@19 -- # local var val
00:03:18.235 11:26:47 -- setup/common.sh@20 -- # local mem_f mem
00:03:18.235 11:26:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:18.235 11:26:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:18.235 11:26:47 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:18.235 11:26:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:18.235 11:26:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:18.235 11:26:47 -- setup/common.sh@31 -- # IFS=': '
00:03:18.235 11:26:47 -- setup/common.sh@31 -- # read -r var val _
00:03:18.235 11:26:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23972376 kB' 'MemUsed: 8619708 kB' 'SwapCached: 16 kB' 'Active: 5869716 kB' 'Inactive: 349704 kB' 'Active(anon): 5614628 kB' 'Inactive(anon): 16 kB' 'Active(file): 255088 kB' 'Inactive(file): 349688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5833528 kB' 'Mapped: 102036 kB' 'AnonPages: 389092 kB' 'Shmem: 5228736 kB' 'KernelStack: 11880 kB' 'PageTables: 4644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347144 kB' 'Slab: 634992 kB' 'SReclaimable: 347144 kB' 'SUnreclaim: 287848 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
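Note how this node-scoped call differs from the system-wide ones above: because node=0 is set, common.sh@23-24 swaps mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo, and common.sh@29 strips the leading "Node 0 " from every line so the same match loop works unchanged. A condensed sketch of that source selection, under the same sysfs layout assumption:

node=0
mem_f=/proc/meminfo
[[ -e /sys/devices/system/node/node$node/meminfo ]] &&
    mem_f=/sys/devices/system/node/node$node/meminfo
mapfile -t mem < "$mem_f"
shopt -s extglob                   # needed for the +([0-9]) pattern below
mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node "Node 0 " prefix
printf '%s\n' "${mem[@]:0:3}"      # e.g. MemTotal/MemFree/MemUsed for node0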
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.235 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.235 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.235 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.235 11:26:47 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.235 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.235 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.235 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.235 11:26:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.235 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.235 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.235 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.235 11:26:47 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.235 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 
00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # continue 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:18.236 11:26:47 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
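An aside on the helper being traced here: get_meminfo resolves one field either from /proc/meminfo or, when a node number is given, from the node-local view under /sys/devices/system/node. A minimal standalone sketch of the same pattern, assuming bash with extglob; the function name and shape mirror the trace, but the details are illustrative rather than copied from setup/common.sh:

#!/usr/bin/env bash
shopt -s extglob   # required for the +([0-9]) prefix strip below

# get_meminfo <field> [node]: print a meminfo value, machine-wide or per NUMA node.
get_meminfo() {
  local get=$1 node=$2 var val _ line
  local mem_f=/proc/meminfo
  local -a mem
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  mapfile -t mem <"$mem_f"
  # per-node meminfo prefixes every line with "Node <N> "; strip that prefix
  mem=("${mem[@]#Node +([0-9]) }")
  for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<<"$line"
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done
  return 1
}

get_meminfo MemTotal          # whole machine
get_meminfo HugePages_Surp 0  # node 0 only, as in the trace above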
00:03:18.236 11:26:47 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:18.236 11:26:47 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:18.236 11:26:47 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:18.236 11:26:47 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:18.236 11:26:47 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:18.236 11:26:47 -- setup/common.sh@18 -- # local node=1
00:03:18.236 11:26:47 -- setup/common.sh@19 -- # local var val
00:03:18.236 11:26:47 -- setup/common.sh@20 -- # local mem_f mem
00:03:18.236 11:26:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:18.236 11:26:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:18.236 11:26:47 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:18.236 11:26:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:18.236 11:26:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:18.236 11:26:47 -- setup/common.sh@31 -- # IFS=': '
00:03:18.236 11:26:47 -- setup/common.sh@31 -- # read -r var val _
00:03:18.236 11:26:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 13870004 kB' 'MemUsed: 13833144 kB' 'SwapCached: 12 kB' 'Active: 9247472 kB' 'Inactive: 1151776 kB' 'Active(anon): 8971744 kB' 'Inactive(anon): 32196 kB' 'Active(file): 275728 kB' 'Inactive(file): 1119580 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10157312 kB' 'Mapped: 111720 kB' 'AnonPages: 242080 kB' 'Shmem: 8761992 kB' 'KernelStack: 10216 kB' 'PageTables: 4328 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 232936 kB' 'Slab: 592752 kB' 'SReclaimable: 232936 kB' 'SUnreclaim: 359816 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[xtrace condensed: setup/common.sh@32 checks each field above against HugePages_Surp, one "continue" per non-matching line]
00:03:18.237 11:26:47 -- setup/common.sh@33 -- # echo 0
00:03:18.237 11:26:47 -- setup/common.sh@33 -- # return 0
00:03:18.237 11:26:47 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:18.237 11:26:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:18.237 11:26:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:18.237 11:26:47 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
node0=512 expecting 513
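The two per-node dumps above are internally consistent: MemUsed is simply MemTotal minus MemFree. A quick check with bash arithmetic, using the numbers from the printf lines:

echo $((32592084 - 23972376))   # 8619708  -> matches node0 'MemUsed: 8619708 kB'
echo $((27703148 - 13870004))   # 13833144 -> matches node1 'MemUsed: 13833144 kB'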
00:03:18.237 11:26:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:18.237 11:26:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:18.237 11:26:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:18.237 11:26:47 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
node1=513 expecting 512
00:03:18.237 11:26:47 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:03:18.237 
00:03:18.237 real	0m3.549s
00:03:18.237 user	0m1.302s
00:03:18.237 sys	0m2.287s
00:03:18.237 11:26:47 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:18.237 11:26:47 -- common/autotest_common.sh@10 -- # set +x
00:03:18.237 ************************************
00:03:18.237 END TEST odd_alloc
00:03:18.237 ************************************
00:03:18.237 11:26:47 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:18.237 11:26:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:18.237 11:26:47 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:18.237 11:26:47 -- common/autotest_common.sh@10 -- # set +x
00:03:18.237 ************************************
00:03:18.237 START TEST custom_alloc
00:03:18.237 ************************************
00:03:18.237 11:26:47 -- common/autotest_common.sh@1104 -- # custom_alloc
00:03:18.237 11:26:47 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:18.237 11:26:47 -- setup/hugepages.sh@169 -- # local node
00:03:18.237 11:26:47 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:18.237 11:26:47 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:18.237 11:26:47 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:18.237 11:26:47 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:18.237 11:26:47 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:18.237 11:26:47 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:18.237 11:26:47 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:18.237 11:26:47 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:18.237 11:26:47 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:18.237 11:26:47 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:18.237 11:26:47 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:18.237 11:26:47 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:18.237 11:26:47 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:18.237 11:26:47 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:18.237 11:26:47 -- setup/hugepages.sh@83 -- # : 256
00:03:18.237 11:26:47 -- setup/hugepages.sh@84 -- # : 1
00:03:18.237 11:26:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:18.237 11:26:47 -- setup/hugepages.sh@83 -- # : 0
00:03:18.237 11:26:47 -- setup/hugepages.sh@84 -- # : 0
00:03:18.237 11:26:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:18.237 11:26:47 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
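The get_test_nr_hugepages_per_node trace above distributes 512 pages over 2 nodes by walking _no_nodes down and assigning 256 to each. A hedged sketch of that distribution policy, with an assumed remainder rule (leftover pages go to the lowest-numbered nodes) since the trace only shows the even case; the function name is illustrative, not from setup/hugepages.sh:

split_hugepages_per_node() {
  local total=$1 nodes=$2 i
  local -a per_node
  local base=$((total / nodes)) rem=$((total % nodes))
  for ((i = 0; i < nodes; i++)); do
    per_node[i]=$base
    ((i < rem)) && ((per_node[i]++))   # hand out the remainder one page at a time
  done
  echo "${per_node[@]}"
}

split_hugepages_per_node 512 2   # -> 256 256, as in the trace
split_hugepages_per_node 513 2   # -> 257 256 (assumed remainder policy)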
00:03:18.237 11:26:47 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:18.237 11:26:47 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:18.237 11:26:47 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:18.237 11:26:47 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:18.237 11:26:47 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:18.237 11:26:47 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:18.237 11:26:47 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:18.237 11:26:47 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:18.237 11:26:47 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:18.237 11:26:47 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:18.237 11:26:47 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:18.237 11:26:47 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:18.237 11:26:47 -- setup/hugepages.sh@78 -- # return 0
00:03:18.237 11:26:47 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:03:18.237 11:26:47 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:18.237 11:26:47 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:18.237 11:26:47 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:18.237 11:26:47 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:18.237 11:26:47 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:03:18.237 11:26:47 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:18.237 11:26:47 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:18.237 11:26:47 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:18.237 11:26:47 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:18.237 11:26:47 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:18.237 11:26:47 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:18.237 11:26:47 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:03:18.237 11:26:47 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:18.237 11:26:47 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:18.237 11:26:47 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:18.237 11:26:47 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:03:18.237 11:26:47 -- setup/hugepages.sh@78 -- # return 0
00:03:18.238 11:26:47 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:03:18.238 11:26:47 -- setup/hugepages.sh@187 -- # setup output
00:03:18.238 11:26:47 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:18.238 11:26:47 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:21.540 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:21.540 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:21.540 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:21.540 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:21.540 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:21.540 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:21.540 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:21.540 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:21.801 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
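Before invoking setup output, the test folds nodes_hp into the HUGENODE string seen at @187: one nodes_hp[<node>]=<pages> item per node, joined with commas via the IFS=, set at @167. A minimal reproduction of that join, using the values the trace computed:

declare -a nodes_hp=([0]=512 [1]=1024)
declare -a HUGENODE=()
for node in "${!nodes_hp[@]}"; do
  HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
done
# "${HUGENODE[*]}" joins elements on the first character of IFS
(IFS=,; echo "HUGENODE='${HUGENODE[*]}'")   # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'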
00:03:21.801 11:26:51 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:03:21.801 11:26:51 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:03:21.801 11:26:51 -- setup/hugepages.sh@89 -- # local node
00:03:21.801 11:26:51 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:21.801 11:26:51 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:21.801 11:26:51 -- setup/hugepages.sh@92 -- # local surp
00:03:21.801 11:26:51 -- setup/hugepages.sh@93 -- # local resv
00:03:21.801 11:26:51 -- setup/hugepages.sh@94 -- # local anon
00:03:21.801 11:26:51 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:21.801 11:26:51 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:21.801 11:26:51 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:21.801 11:26:51 -- setup/common.sh@18 -- # local node=
00:03:21.801 11:26:51 -- setup/common.sh@19 -- # local var val
00:03:21.801 11:26:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.801 11:26:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.801 11:26:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.801 11:26:51 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.801 11:26:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.801 11:26:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.801 11:26:51 -- setup/common.sh@31 -- # IFS=': '
00:03:21.801 11:26:51 -- setup/common.sh@31 -- # read -r var val _
00:03:21.801 11:26:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 36808640 kB' 'MemAvailable: 38741612 kB' 'Buffers: 2704 kB' 'Cached: 15988196 kB' 'SwapCached: 28 kB' 'Active: 15117688 kB' 'Inactive: 1501480 kB' 'Active(anon): 14586872 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631672 kB' 'Mapped: 213776 kB' 'Shmem: 13990816 kB' 'KReclaimable: 580080 kB' 'Slab: 1228304 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 648224 kB' 'KernelStack: 21936 kB' 'PageTables: 8856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 16028576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216516 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
[xtrace condensed: setup/common.sh@32 checks each field above against AnonHugePages, one "continue" per non-matching line]
00:03:21.802 11:26:51 -- setup/common.sh@33 -- # echo 0
00:03:21.802 11:26:51 -- setup/common.sh@33 -- # return 0
00:03:21.802 11:26:51 -- setup/hugepages.sh@97 -- # anon=0
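The @96 test above reads /sys/kernel/mm/transparent_hugepage/enabled, which reports something like "always [madvise] never" with the active mode bracketed; AnonHugePages is only consulted when the mode is not [never]. A small sketch of the same gate on a generic Linux box (the sysfs path and the AnonHugePages field are standard kernel interfaces; the surrounding variable names are illustrative):

thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)
if [[ $thp != *'[never]'* ]]; then
  # only then does THP-backed anonymous memory factor into the check
  anon_kb=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
  echo "THP active ($thp), AnonHugePages=${anon_kb} kB"
fi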
00:03:21.802 11:26:51 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:21.802 11:26:51 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:21.802 11:26:51 -- setup/common.sh@18 -- # local node=
00:03:21.802 11:26:51 -- setup/common.sh@19 -- # local var val
00:03:21.802 11:26:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.802 11:26:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.802 11:26:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.802 11:26:51 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.802 11:26:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.802 11:26:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.802 11:26:51 -- setup/common.sh@31 -- # IFS=': '
00:03:21.802 11:26:51 -- setup/common.sh@31 -- # read -r var val _
00:03:21.802 11:26:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 36809776 kB' 'MemAvailable: 38742244 kB' 'Buffers: 2704 kB' 'Cached: 15988200 kB' 'SwapCached: 28 kB' 'Active: 15117784 kB' 'Inactive: 1501480 kB' 'Active(anon): 14586968 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631748 kB' 'Mapped: 213760 kB' 'Shmem: 13990820 kB' 'KReclaimable: 580080 kB' 'Slab: 1228348 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 648268 kB' 'KernelStack: 21920 kB' 'PageTables: 8800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 16028588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216516 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
[xtrace condensed: setup/common.sh@32 checks each field above against HugePages_Surp, one "continue" per non-matching line]
00:03:21.803 11:26:51 -- setup/common.sh@33 -- # echo 0
00:03:21.803 11:26:51 -- setup/common.sh@33 -- # return 0
00:03:21.803 11:26:51 -- setup/hugepages.sh@99 -- # surp=0
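The pool in the dumps above also adds up: the two per-node reservations the test configured account for the machine-wide totals, and Hugetlb is the page count times Hugepagesize. Checking with bash arithmetic:

echo $((512 + 1024))    # 1536    -> HugePages_Total / HugePages_Free
echo $((1536 * 2048))   # 3145728 -> Hugetlb in kB (pages x 2048 kB Hugepagesize)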
00:03:21.803 11:26:51 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:21.803 11:26:51 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:21.803 11:26:51 -- setup/common.sh@18 -- # local node=
00:03:21.803 11:26:51 -- setup/common.sh@19 -- # local var val
00:03:21.803 11:26:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.803 11:26:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.803 11:26:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.803 11:26:51 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.803 11:26:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.803 11:26:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.803 11:26:51 -- setup/common.sh@31 -- # IFS=': '
00:03:21.803 11:26:51 -- setup/common.sh@31 -- # read -r var val _
00:03:21.803 11:26:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 36809272 kB' 'MemAvailable: 38741740 kB' 'Buffers: 2704 kB' 'Cached: 15988200 kB' 'SwapCached: 28 kB' 'Active: 15117824 kB' 'Inactive: 1501480 kB' 'Active(anon): 14587008 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631780 kB' 'Mapped: 213760 kB' 'Shmem: 13990820 kB' 'KReclaimable: 580080 kB' 'Slab: 1228348 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 648268 kB' 'KernelStack: 21936 kB' 'PageTables: 8848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 16028604 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216532 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:21.803 11:26:51 -- setup/common.sh@32 -- # [[
Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.803 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.803 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.803 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.803 11:26:51 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.803 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.803 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.803 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.803 11:26:51 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.803 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.803 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.803 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.803 11:26:51 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.803 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': 
' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:21.804 11:26:51 -- setup/common.sh@33 -- # echo 0 00:03:21.804 11:26:51 -- setup/common.sh@33 -- # return 0 00:03:21.804 11:26:51 -- setup/hugepages.sh@100 -- # resv=0 00:03:21.804 11:26:51 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:21.804 nr_hugepages=1536 00:03:21.804 11:26:51 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:21.804 resv_hugepages=0 00:03:21.804 11:26:51 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:21.804 surplus_hugepages=0 00:03:21.804 11:26:51 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:21.804 anon_hugepages=0 00:03:21.804 11:26:51 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:21.804 11:26:51 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:21.804 11:26:51 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:21.804 11:26:51 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:21.804 11:26:51 -- setup/common.sh@18 -- # local node= 00:03:21.804 11:26:51 -- setup/common.sh@19 -- # local var val 00:03:21.804 11:26:51 -- setup/common.sh@20 -- # local mem_f mem 00:03:21.804 11:26:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:21.804 11:26:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:21.804 11:26:51 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:21.804 11:26:51 -- setup/common.sh@28 -- # mapfile -t mem 00:03:21.804 11:26:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:21.804 11:26:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 36809352 kB' 'MemAvailable: 38741820 kB' 'Buffers: 2704 kB' 'Cached: 15988224 kB' 'SwapCached: 28 
kB' 'Active: 15117932 kB' 'Inactive: 1501480 kB' 'Active(anon): 14587116 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 631884 kB' 'Mapped: 213760 kB' 'Shmem: 13990844 kB' 'KReclaimable: 580080 kB' 'Slab: 1228348 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 648268 kB' 'KernelStack: 21920 kB' 'PageTables: 8804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 16028616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216516 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.804 11:26:51 -- 
setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.804 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.804 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ CommitLimit 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:21.805 11:26:51 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:21.805 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:21.805 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:22.066 11:26:51 -- setup/common.sh@33 -- # echo 1536 00:03:22.066 11:26:51 -- setup/common.sh@33 -- # return 0 00:03:22.066 11:26:51 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:22.066 11:26:51 -- setup/hugepages.sh@112 -- # get_nodes 00:03:22.066 11:26:51 -- setup/hugepages.sh@27 -- # local node 00:03:22.066 11:26:51 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:22.066 11:26:51 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:22.066 11:26:51 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:22.066 11:26:51 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:22.066 11:26:51 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:22.066 11:26:51 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:22.066 11:26:51 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:22.066 11:26:51 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:22.066 11:26:51 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:22.066 11:26:51 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:22.066 11:26:51 -- setup/common.sh@18 -- # local node=0 00:03:22.066 11:26:51 -- setup/common.sh@19 -- # local var val 00:03:22.066 11:26:51 -- setup/common.sh@20 -- # local mem_f mem 00:03:22.066 11:26:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:22.066 11:26:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:22.066 11:26:51 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:22.066 11:26:51 -- setup/common.sh@28 -- # mapfile -t mem 00:03:22.066 11:26:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23972196 kB' 'MemUsed: 8619888 kB' 'SwapCached: 16 kB' 'Active: 5870672 kB' 'Inactive: 349704 kB' 'Active(anon): 5615584 kB' 'Inactive(anon): 16 kB' 'Active(file): 255088 kB' 'Inactive(file): 349688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5833564 kB' 'Mapped: 102036 kB' 'AnonPages: 390056 kB' 'Shmem: 5228772 kB' 'KernelStack: 11928 kB' 'PageTables: 4716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347144 kB' 'Slab: 635512 kB' 'SReclaimable: 347144 kB' 'SUnreclaim: 288368 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 
00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- 
setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@33 -- # echo 0 00:03:22.066 11:26:51 -- setup/common.sh@33 -- # return 0 00:03:22.066 11:26:51 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:22.066 11:26:51 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:22.066 11:26:51 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:22.066 11:26:51 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 
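The node-by-node bookkeeping traced here (hugepages.sh@115-117) adds the reserved and surplus counts onto each node's expected total before the node0=512 / node1=1024 comparison further down. A sketch of that loop with get_nodes filled in; the loop statements match the trace, but the sysfs read that produces the 512 and 1024 values is my assumption:

    shopt -s extglob
    declare -a nodes_sys nodes_test
    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
            # assumed source of the 512/1024 values seen in the trace:
            nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}             # 2 on this machine
        (( no_nodes > 0 ))                    # fail on a nodeless system
    }

    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))        # resv=0 in this run
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))   # also 0 here
    done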
00:03:22.066 11:26:51 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:22.066 11:26:51 -- setup/common.sh@18 -- # local node=1 00:03:22.066 11:26:51 -- setup/common.sh@19 -- # local var val 00:03:22.066 11:26:51 -- setup/common.sh@20 -- # local mem_f mem 00:03:22.066 11:26:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:22.066 11:26:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:22.066 11:26:51 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:22.066 11:26:51 -- setup/common.sh@28 -- # mapfile -t mem 00:03:22.066 11:26:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 12837704 kB' 'MemUsed: 14865444 kB' 'SwapCached: 12 kB' 'Active: 9247164 kB' 'Inactive: 1151776 kB' 'Active(anon): 8971436 kB' 'Inactive(anon): 32196 kB' 'Active(file): 275728 kB' 'Inactive(file): 1119580 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10157420 kB' 'Mapped: 111724 kB' 'AnonPages: 241692 kB' 'Shmem: 8762100 kB' 'KernelStack: 9976 kB' 'PageTables: 4036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 232936 kB' 'Slab: 592836 kB' 'SReclaimable: 232936 kB' 'SUnreclaim: 359900 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.066 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.066 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 
11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val 
_ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # continue 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.067 11:26:51 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.067 11:26:51 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.067 11:26:51 -- setup/common.sh@33 -- # echo 0 00:03:22.067 11:26:51 -- setup/common.sh@33 -- # return 0 00:03:22.067 11:26:51 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:22.067 11:26:51 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:22.067 11:26:51 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:22.067 11:26:51 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:22.067 11:26:51 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:22.067 node0=512 expecting 512 00:03:22.067 11:26:51 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:22.067 11:26:51 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:22.067 11:26:51 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:22.067 11:26:51 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:22.067 node1=1024 expecting 1024 00:03:22.067 11:26:51 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:22.067 00:03:22.067 real 0m3.729s 00:03:22.067 user 0m1.348s 00:03:22.067 sys 0m2.450s 00:03:22.067 11:26:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:22.067 11:26:51 -- common/autotest_common.sh@10 -- # set +x 00:03:22.067 ************************************ 00:03:22.067 END TEST custom_alloc 00:03:22.067 ************************************ 00:03:22.067 11:26:51 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:22.067 11:26:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:22.067 11:26:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:22.067 11:26:51 -- common/autotest_common.sh@10 -- # set +x 00:03:22.067 ************************************ 00:03:22.067 START TEST no_shrink_alloc 00:03:22.067 ************************************ 00:03:22.067 11:26:51 -- common/autotest_common.sh@1104 -- # no_shrink_alloc 00:03:22.067 11:26:51 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:22.067 11:26:51 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:22.067 11:26:51 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:22.067 11:26:51 -- setup/hugepages.sh@51 -- # shift 00:03:22.067 11:26:51 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:22.067 11:26:51 -- 
setup/hugepages.sh@52 -- # local node_ids 00:03:22.067 11:26:51 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:22.067 11:26:51 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:22.067 11:26:51 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:22.067 11:26:51 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:22.067 11:26:51 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:22.067 11:26:51 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:22.067 11:26:51 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:22.067 11:26:51 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:22.067 11:26:51 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:22.067 11:26:51 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:22.067 11:26:51 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:22.067 11:26:51 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:22.067 11:26:51 -- setup/hugepages.sh@73 -- # return 0 00:03:22.067 11:26:51 -- setup/hugepages.sh@198 -- # setup output 00:03:22.067 11:26:51 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:22.067 11:26:51 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:25.373 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:25.373 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:25.374 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:25.374 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:25.374 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:25.374 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:25.374 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:25.374 11:26:54 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:25.374 11:26:54 -- setup/hugepages.sh@89 -- # local node 00:03:25.374 11:26:54 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:25.374 11:26:54 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:25.374 11:26:54 -- setup/hugepages.sh@92 -- # local surp 00:03:25.374 11:26:54 -- setup/hugepages.sh@93 -- # local resv 00:03:25.374 11:26:54 -- setup/hugepages.sh@94 -- # local anon 00:03:25.374 11:26:54 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:25.374 11:26:54 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:25.374 11:26:54 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:25.374 11:26:54 -- setup/common.sh@18 -- # local node= 00:03:25.374 11:26:54 -- setup/common.sh@19 -- # local var val 00:03:25.374 11:26:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:25.374 11:26:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:25.374 11:26:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo 
]] 00:03:25.374 11:26:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:25.374 11:26:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:25.374 11:26:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:25.374 11:26:54 -- setup/common.sh@31 -- # IFS=': '
00:03:25.374 11:26:54 -- setup/common.sh@31 -- # read -r var val _
00:03:25.374 11:26:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37887432 kB' 'MemAvailable: 39819900 kB' 'Buffers: 2704 kB' 'Cached: 15988320 kB' 'SwapCached: 28 kB' 'Active: 15119096 kB' 'Inactive: 1501480 kB' 'Active(anon): 14588280 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632928 kB' 'Mapped: 213824 kB' 'Shmem: 13990940 kB' 'KReclaimable: 580080 kB' 'Slab: 1227056 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646976 kB' 'KernelStack: 21952 kB' 'PageTables: 8904 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16029228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216436 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:25.375 11:26:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:25.375 11:26:54 -- setup/common.sh@33 -- # echo 0
00:03:25.375 11:26:54 -- setup/common.sh@33 -- # return 0
00:03:25.375 11:26:54 -- setup/hugepages.sh@97 -- # anon=0
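The helper being traced here, get_meminfo, is worth unpacking once, since the log replays it four more times below: it reads /proc/meminfo (or a per-node meminfo file under sysfs when a node id is passed), strips any "Node N" prefix, then walks the fields one by one until the requested key matches and echoes its value. The long runs of [[ field == ... ]] / continue records in the raw trace are just the loop's non-matching iterations. A minimal re-sketch of that behavior, assuming the semantics visible in the trace rather than the verbatim SPDK setup/common.sh source:

```bash
#!/usr/bin/env bash
# Minimal sketch of the get_meminfo helper as the trace shows it behaving.
# Illustrative only -- not the verbatim SPDK setup/common.sh implementation.
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=${2:-}
    local var val _ line
    local mem_f=/proc/meminfo
    # Per-NUMA-node counters live in sysfs and carry a "Node N " prefix.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix, if any
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        # Scan field by field; the quoted RHS forces a literal match.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo AnonHugePages     # prints 0 on the box traced here
get_meminfo HugePages_Surp 0  # surplus hugepages on NUMA node 0
```

The backslash-escaped patterns in the xtrace output (\A\n\o\n\H\u\g\e\P\a\g\e\s and the like) are simply bash showing that the right-hand side of == was quoted, so the comparison is literal rather than a glob match.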
00:03:25.375 11:26:54 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:25.375 11:26:54 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:25.375 11:26:54 -- setup/common.sh@18 -- # local node=
00:03:25.375 11:26:54 -- setup/common.sh@19 -- # local var val
00:03:25.375 11:26:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:25.375 11:26:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:25.375 11:26:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:25.375 11:26:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:25.375 11:26:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:25.375 11:26:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:25.375 11:26:54 -- setup/common.sh@31 -- # IFS=': '
00:03:25.375 11:26:54 -- setup/common.sh@31 -- # read -r var val _
00:03:25.376 11:26:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37887164 kB' 'MemAvailable: 39819632 kB' 'Buffers: 2704 kB' 'Cached: 15988324 kB' 'SwapCached: 28 kB' 'Active: 15118672 kB' 'Inactive: 1501480 kB' 'Active(anon): 14587856 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632468 kB' 'Mapped: 213764 kB' 'Shmem: 13990944 kB' 'KReclaimable: 580080 kB' 'Slab: 1227076 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646996 kB' 'KernelStack: 21920 kB' 'PageTables: 8812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16029240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:25.377 11:26:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:25.377 11:26:54 -- setup/common.sh@33 -- # echo 0
00:03:25.377 11:26:54 -- setup/common.sh@33 -- # return 0
00:03:25.377 11:26:54 -- setup/hugepages.sh@99 -- # surp=0
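At this point verify_nr_hugepages has established anon=0 (no transparent hugepages charged against the pool) and surp=0 (no surplus pages allocated beyond nr_hugepages); next it reads HugePages_Rsvd, the pages a process has reserved but not yet faulted in. A sketch of the bookkeeping it is building toward, written on top of the get_meminfo sketch above (the real hugepages.sh body differs in detail):

```bash
# Sketch of the verify_nr_hugepages bookkeeping around hugepages.sh@97-@110.
verify_nr_hugepages_sketch() {
    local nr_hugepages=1024   # pool size requested by this test
    local anon surp resv total
    anon=$(get_meminfo AnonHugePages)    # 0 kB here: THP is not interfering
    surp=$(get_meminfo HugePages_Surp)   # 0: nothing beyond the static pool
    resv=$(get_meminfo HugePages_Rsvd)   # 0: no reserved-but-unfaulted pages
    total=$(get_meminfo HugePages_Total)
    # The kernel's pool must equal the requested size exactly once surplus
    # and reserved pages are accounted for (both zero in this run).
    (( total == nr_hugepages + surp + resv )) || return 1
    (( total == nr_hugepages )) || return 1
}
```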
00:03:25.377 11:26:54 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:25.377 11:26:54 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:25.377 11:26:54 -- setup/common.sh@18 -- # local node=
00:03:25.377 11:26:54 -- setup/common.sh@19 -- # local var val
00:03:25.377 11:26:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:25.377 11:26:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:25.377 11:26:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:25.377 11:26:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:25.378 11:26:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:25.378 11:26:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:25.378 11:26:54 -- setup/common.sh@31 -- # IFS=': '
00:03:25.378 11:26:54 -- setup/common.sh@31 -- # read -r var val _
00:03:25.378 11:26:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37886104 kB' 'MemAvailable: 39818572 kB' 'Buffers: 2704 kB' 'Cached: 15988336 kB' 'SwapCached: 28 kB' 'Active: 15118720 kB' 'Inactive: 1501480 kB' 'Active(anon): 14587904 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632480 kB' 'Mapped: 213764 kB' 'Shmem: 13990956 kB' 'KReclaimable: 580080 kB' 'Slab: 1227076 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646996 kB' 'KernelStack: 21920 kB' 'PageTables: 8812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16029256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:25.379 11:26:54 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:25.379 11:26:54 -- setup/common.sh@33 -- # echo 0
00:03:25.379 11:26:54 -- setup/common.sh@33 -- # return 0
00:03:25.379 11:26:54 -- setup/hugepages.sh@100 -- # resv=0
00:03:25.379 11:26:54 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:25.379 nr_hugepages=1024
00:03:25.379 11:26:54 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:25.379 resv_hugepages=0
00:03:25.379 11:26:54 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:25.379 surplus_hugepages=0
00:03:25.379 11:26:54 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:25.379 anon_hugepages=0
00:03:25.379 11:26:54 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:25.379 11:26:54 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
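Every snapshot in this run reports an identical, healthy pool: HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0, with Hugepagesize: 2048 kB. The Hugetlb line agrees with that arithmetic: 1024 pages × 2048 kB = 2097152 kB (2 GiB). A standalone consistency check along the same lines (plain /proc/meminfo reads, not part of the test script):

```bash
# Cross-check the hugepage accounting visible in the snapshots above:
# Hugetlb should equal HugePages_Total * Hugepagesize.
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)   # 1024 here
size_kb=$(awk '/^Hugepagesize:/  {print $2}' /proc/meminfo)   # 2048 (kB)
hugetlb=$(awk '/^Hugetlb:/       {print $2}' /proc/meminfo)   # 2097152 (kB)
(( total * size_kb == hugetlb )) && echo "hugepage accounting is consistent"
```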
00:03:25.379 11:26:54 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:25.379 11:26:54 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:25.379 11:26:54 -- setup/common.sh@18 -- # local node=
00:03:25.379 11:26:54 -- setup/common.sh@19 -- # local var val
00:03:25.379 11:26:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:25.379 11:26:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:25.379 11:26:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:25.379 11:26:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:25.379 11:26:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:25.379 11:26:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:25.379 11:26:54 -- setup/common.sh@31 -- # IFS=': '
00:03:25.379 11:26:54 -- setup/common.sh@31 -- # read -r var val _
00:03:25.379 11:26:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37885852 kB' 'MemAvailable: 39818320 kB' 'Buffers: 2704 kB' 'Cached: 15988348 kB' 'SwapCached: 28 kB' 'Active: 15119032 kB' 'Inactive: 1501480 kB' 'Active(anon): 14588216 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 632788 kB' 'Mapped: 213764 kB' 'Shmem: 13990968 kB' 'KReclaimable: 580080 kB' 'Slab: 1227076 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646996 kB' 'KernelStack: 21920 kB' 'PageTables: 8812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16029272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB'
00:03:25.380 11:26:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:25.380 11:26:54 -- setup/common.sh@33 -- # echo 1024
00:03:25.380 11:26:54 -- setup/common.sh@33 -- # return 0
00:03:25.380 11:26:54 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:25.380 11:26:54 -- setup/hugepages.sh@112 -- # get_nodes
00:03:25.380 11:26:54 -- setup/hugepages.sh@27 -- # local node
00:03:25.380 11:26:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:25.380 11:26:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:25.380 11:26:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:25.380 11:26:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:25.380 11:26:54 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:25.380 11:26:54 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:25.380 11:26:54 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:25.380 11:26:54 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:25.381 11:26:54 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:25.381 11:26:54 -- setup/common.sh@18 -- # local node=0 00:03:25.381 11:26:54 -- setup/common.sh@19 -- # local var val 00:03:25.381 11:26:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:25.381 11:26:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:25.381 11:26:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:25.381 11:26:54 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:25.381 11:26:54 -- setup/common.sh@28 -- # mapfile -t mem 00:03:25.381 11:26:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 22952576 kB' 'MemUsed: 9639508 kB' 'SwapCached: 16 kB' 'Active: 5870292 kB' 'Inactive: 349704 kB' 'Active(anon): 5615204 kB' 'Inactive(anon): 16 kB' 'Active(file): 255088 kB' 'Inactive(file): 349688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5833568 kB' 'Mapped: 102036 kB' 'AnonPages: 389532 kB' 'Shmem: 5228776 kB' 'KernelStack: 11896 kB' 'PageTables: 4628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347144 kB' 'Slab: 634536 kB' 'SReclaimable: 347144 kB' 'SUnreclaim: 287392 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # 
continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ 
PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 
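(Annotation: this second pass is get_meminfo called with a node argument, HugePages_Surp 0. As the trace above shows, mem_f then switches from /proc/meminfo to /sys/devices/system/node/node0/meminfo and the "Node 0 " prefix is stripped so the same key/value scan works on both files, while get_nodes enumerates /sys/devices/system/node/node+([0-9]) to find the machine's two NUMA nodes; a few lines on, the roll-up prints "node0=1024 expecting 1024". A hedged sketch of that source selection, with extglob required for the +([0-9]) pattern exactly as in the traced script; the function name is mine:

shopt -s extglob
node_meminfo_sketch() {
    # Prefer the per-node meminfo when a node is named, else system-wide.
    local node=$1 mem_f=/proc/meminfo mem
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node lines read "Node 0 HugePages_Surp: 0"; drop the prefix so
    # the scan from the first sketch applies unchanged.
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]}"
})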
00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.381 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.381 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.382 11:26:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.382 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.382 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.382 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.382 11:26:54 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.382 11:26:54 -- setup/common.sh@32 -- # continue 00:03:25.382 11:26:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:25.382 11:26:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:25.382 11:26:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:25.382 11:26:54 -- setup/common.sh@33 -- # echo 0 00:03:25.382 11:26:54 -- setup/common.sh@33 -- # return 0 00:03:25.382 11:26:54 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:25.382 11:26:54 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:25.382 11:26:54 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:25.382 11:26:54 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:25.382 11:26:54 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:25.382 node0=1024 expecting 1024 00:03:25.382 11:26:54 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:25.382 11:26:54 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:25.382 11:26:54 -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:25.382 11:26:54 -- setup/hugepages.sh@202 -- # setup output 00:03:25.382 11:26:54 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:25.382 11:26:54 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:28.670 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:28.670 0000:d8:00.0 (8086 0a54): Already 
using the vfio-pci driver 00:03:28.670 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:28.670 11:26:57 -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:28.670 11:26:57 -- setup/hugepages.sh@89 -- # local node 00:03:28.670 11:26:57 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:28.670 11:26:57 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:28.670 11:26:57 -- setup/hugepages.sh@92 -- # local surp 00:03:28.670 11:26:57 -- setup/hugepages.sh@93 -- # local resv 00:03:28.670 11:26:57 -- setup/hugepages.sh@94 -- # local anon 00:03:28.670 11:26:57 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:28.670 11:26:57 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:28.670 11:26:57 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:28.670 11:26:57 -- setup/common.sh@18 -- # local node= 00:03:28.670 11:26:57 -- setup/common.sh@19 -- # local var val 00:03:28.670 11:26:57 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.670 11:26:57 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.670 11:26:57 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.670 11:26:57 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.670 11:26:57 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.670 11:26:57 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37917684 kB' 'MemAvailable: 39850152 kB' 'Buffers: 2704 kB' 'Cached: 15988436 kB' 'SwapCached: 28 kB' 'Active: 15119516 kB' 'Inactive: 1501480 kB' 'Active(anon): 14588700 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633116 kB' 'Mapped: 213772 kB' 'Shmem: 13991056 kB' 'KReclaimable: 580080 kB' 'Slab: 1226792 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646712 kB' 'KernelStack: 21936 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16029860 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216436 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 
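(Annotation: two things happen at the top of verify_nr_hugepages above. First, setup.sh reports every device already bound to vfio-pci and warns that 512 hugepages were requested while 1024 are already allocated on node0. Second, the test [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] checks whether transparent hugepages are disabled: the bracketed word in that sysfs file is the active mode, and anything other than [never] means THP is live, so AnonHugePages has to be sampled before judging the pool. A standalone sketch of that test, using the standard sysfs path; the helper name is mine:

thp_active_sketch() {
    # The selected mode is the bracketed word, e.g. "always [madvise] never".
    local modes
    modes=$(</sys/kernel/mm/transparent_hugepage/enabled)
    [[ $modes != *"[never]"* ]]   # succeeds when THP is not fully off
})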
00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ SwapFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 
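(Annotation: the scan running through here is the AnonHugePages read triggered by that THP check; it bottoms out in "echo 0" / "return 0" just below, so anonymous hugepages contribute nothing on this machine. How the caller consumes it, using get_meminfo_sketch from the first annotation; anon is the script's own variable name, per the hugepages.sh@97 line that follows:

anon=$(get_meminfo_sketch AnonHugePages)   # kB of THP-backed anon memory
(( anon == 0 )) && echo "THP is not inflating the hugepage numbers")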
11:26:57 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.670 11:26:57 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.670 11:26:57 -- setup/common.sh@33 -- # echo 0 00:03:28.670 11:26:57 -- setup/common.sh@33 -- # 
return 0 00:03:28.670 11:26:57 -- setup/hugepages.sh@97 -- # anon=0 00:03:28.670 11:26:57 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:28.670 11:26:57 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:28.670 11:26:57 -- setup/common.sh@18 -- # local node= 00:03:28.670 11:26:57 -- setup/common.sh@19 -- # local var val 00:03:28.670 11:26:57 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.670 11:26:57 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.670 11:26:57 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.670 11:26:57 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.670 11:26:57 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.670 11:26:57 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.670 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37918032 kB' 'MemAvailable: 39850500 kB' 'Buffers: 2704 kB' 'Cached: 15988440 kB' 'SwapCached: 28 kB' 'Active: 15119424 kB' 'Inactive: 1501480 kB' 'Active(anon): 14588608 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633004 kB' 'Mapped: 213772 kB' 'Shmem: 13991060 kB' 'KReclaimable: 580080 kB' 'Slab: 1226908 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646828 kB' 'KernelStack: 21920 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16029872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 
-- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:57 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:57 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.671 11:26:57 -- setup/common.sh@33 -- # echo 0 00:03:28.671 11:26:57 -- setup/common.sh@33 -- # return 0 00:03:28.671 11:26:58 -- setup/hugepages.sh@99 -- # surp=0 00:03:28.671 11:26:58 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:28.671 11:26:58 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:28.671 11:26:58 -- setup/common.sh@18 -- # local node= 00:03:28.671 11:26:58 -- setup/common.sh@19 -- # local var val 00:03:28.671 11:26:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.671 11:26:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.671 11:26:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.671 11:26:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.671 11:26:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.671 11:26:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:58 -- 
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37919296 kB' 'MemAvailable: 39851764 kB' 'Buffers: 2704 kB' 'Cached: 15988452 kB' 'SwapCached: 28 kB' 'Active: 15119428 kB' 'Inactive: 1501480 kB' 'Active(anon): 14588612 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633004 kB' 'Mapped: 213772 kB' 'Shmem: 13991072 kB' 'KReclaimable: 580080 kB' 'Slab: 1226908 kB' 'SReclaimable: 580080 kB' 'SUnreclaim: 646828 kB' 'KernelStack: 21920 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16029888 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.671 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.671 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 
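(Annotation: this pass re-reads the full meminfo dump for HugePages_Rsvd, having just done the same for HugePages_Surp. Surp counts surplus pages overcommitted beyond nr_hugepages; Rsvd counts pages promised to a mapping but not yet faulted in; both must be folded into the total before it can be compared with the request. The same counters can be eyeballed directly, shown only for orientation on a stock procfs:

grep -E 'HugePages_(Total|Free|Rsvd|Surp)' /proc/meminfo
# On this run: Total 1024, Free 1024, Rsvd 0, Surp 0)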
00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ [xtrace condensed: the same continue branch repeats for each remaining /proc/meminfo key, Mapped through CmaTotal, none matching HugePages_Rsvd] 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 
00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.672 11:26:58 -- setup/common.sh@33 -- # echo 0 00:03:28.672 11:26:58 -- setup/common.sh@33 -- # return 0 00:03:28.672 11:26:58 -- setup/hugepages.sh@100 -- # resv=0 00:03:28.672 11:26:58 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:28.672 nr_hugepages=1024 00:03:28.672 11:26:58 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:28.672 resv_hugepages=0 00:03:28.672 11:26:58 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:28.672 surplus_hugepages=0 00:03:28.672 11:26:58 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:28.672 anon_hugepages=0 00:03:28.672 11:26:58 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:28.672 11:26:58 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:28.672 11:26:58 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:28.672 11:26:58 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:28.672 11:26:58 -- setup/common.sh@18 -- # local node= 00:03:28.672 11:26:58 -- setup/common.sh@19 -- # local var val 00:03:28.672 11:26:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.672 11:26:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.672 11:26:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.672 11:26:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.672 11:26:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.672 11:26:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 37920300 kB' 'MemAvailable: 39852768 kB' 'Buffers: 2704 kB' 'Cached: 15988464 kB' 'SwapCached: 28 kB' 'Active: 15119464 kB' 'Inactive: 1501480 kB' 'Active(anon): 14588648 kB' 'Inactive(anon): 32212 kB' 'Active(file): 530816 kB' 'Inactive(file): 1469268 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386812 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 633012 kB' 'Mapped: 213772 kB' 'Shmem: 13991084 kB' 'KReclaimable: 580080 kB' 'Slab: 1226908 kB' 'SReclaimable: 580080 kB' 
'SUnreclaim: 646828 kB' 'KernelStack: 21920 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 16029904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216420 kB' 'VmallocChunk: 0 kB' 'Percpu: 107520 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3208564 kB' 'DirectMap2M: 52051968 kB' 'DirectMap1G: 13631488 kB' 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.672 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.672 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- 
setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue [xtrace condensed: the scan takes the continue branch for every remaining key, Inactive(file) through Unaccepted, until it reaches HugePages_Total] 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.673 11:26:58 -- setup/common.sh@33 -- # echo 1024 00:03:28.673 11:26:58 -- setup/common.sh@33 -- # return 0 00:03:28.673 11:26:58 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:28.673 11:26:58 -- setup/hugepages.sh@112 -- # get_nodes 00:03:28.673 11:26:58 -- setup/hugepages.sh@27 -- # local node 00:03:28.673 11:26:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:28.673 11:26:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:28.673 11:26:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:28.673 11:26:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:28.673 11:26:58 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:28.673 11:26:58 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:28.673 11:26:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:28.673 11:26:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:28.673 11:26:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:28.673 11:26:58 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:28.673 11:26:58 -- setup/common.sh@18 -- # local node=0 00:03:28.673 11:26:58 -- setup/common.sh@19 -- # local var val 00:03:28.673 11:26:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.673 11:26:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.673 11:26:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:28.673 11:26:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:28.673 11:26:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.673 11:26:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 22974180 kB' 'MemUsed: 9617904 kB' 'SwapCached: 16 kB' 'Active: 5870168 kB' 'Inactive: 349704 kB' 'Active(anon): 5615080 kB' 'Inactive(anon): 16 kB' 'Active(file): 255088 kB' 'Inactive(file): 349688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5833568 kB' 'Mapped: 102036 kB' 'AnonPages: 389400 kB' 'Shmem: 5228776 kB' 'KernelStack: 11896 kB' 'PageTables: 4628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347144 kB' 'Slab: 634196 kB' 'SReclaimable: 347144 kB' 'SUnreclaim: 287052 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.673 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # [[ 
MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # continue [xtrace condensed: the continue branch repeats for every node-0 meminfo key from SwapCached through SUnreclaim] 00:03:28.673 11:26:58 -- setup/common.sh@32 -- # 
continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # continue 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.674 11:26:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.674 11:26:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.674 11:26:58 -- setup/common.sh@33 -- # echo 0 00:03:28.674 11:26:58 -- setup/common.sh@33 -- # return 0 00:03:28.674 11:26:58 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:28.674 11:26:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:28.674 11:26:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:28.674 11:26:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:28.674 11:26:58 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:28.674 node0=1024 expecting 1024 00:03:28.933 11:26:58 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:28.933 00:03:28.933 real 0m6.760s 00:03:28.933 user 0m2.456s 00:03:28.933 sys 0m4.389s 00:03:28.933 11:26:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:28.933 11:26:58 -- common/autotest_common.sh@10 -- # set +x 00:03:28.933 ************************************ 00:03:28.933 END TEST no_shrink_alloc 00:03:28.933 ************************************ 00:03:28.933 11:26:58 -- 
setup/hugepages.sh@217 -- # clear_hp 00:03:28.933 11:26:58 -- setup/hugepages.sh@37 -- # local node hp 00:03:28.933 11:26:58 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:28.933 11:26:58 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:28.933 11:26:58 -- setup/hugepages.sh@41 -- # echo 0 00:03:28.933 11:26:58 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:28.933 11:26:58 -- setup/hugepages.sh@41 -- # echo 0 00:03:28.933 11:26:58 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:28.933 11:26:58 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:28.933 11:26:58 -- setup/hugepages.sh@41 -- # echo 0 00:03:28.933 11:26:58 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:28.933 11:26:58 -- setup/hugepages.sh@41 -- # echo 0 00:03:28.933 11:26:58 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:28.933 11:26:58 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:28.933 00:03:28.933 real 0m26.371s 00:03:28.933 user 0m9.018s 00:03:28.933 sys 0m16.261s 00:03:28.933 11:26:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:28.933 11:26:58 -- common/autotest_common.sh@10 -- # set +x 00:03:28.933 ************************************ 00:03:28.933 END TEST hugepages 00:03:28.933 ************************************ 00:03:28.933 11:26:58 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:28.933 11:26:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:28.933 11:26:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:28.933 11:26:58 -- common/autotest_common.sh@10 -- # set +x 00:03:28.933 ************************************ 00:03:28.933 START TEST driver 00:03:28.933 ************************************ 00:03:28.933 11:26:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:28.933 * Looking for test storage... 
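Everything the hugepages tests above print comes from one helper pattern: read the target meminfo file field by field with IFS=': ', take the continue branch for every key except the one requested, and echo the value on a match; clear_hp then resets the counters by writing 0 into each node's nr_hugepages files. A minimal standalone sketch of both steps, assuming bash — the function names are illustrative, not the exact setup/common.sh code:

get_meminfo_sketch() {
  # Split each "Key:   value kB" line on ':' and blanks, and print the value
  # of the requested key. The logged helper additionally strips the
  # "Node <n> " prefix when it reads /sys/devices/system/node/node<n>/meminfo.
  local key=$1 var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$key" ]] && { echo "$val"; return 0; }
  done </proc/meminfo
  return 1
}

clear_hp_sketch() {
  # Zero every per-node hugepage count, mirroring the clear_hp loop above.
  local hp
  for hp in /sys/devices/system/node/node*/hugepages/hugepages-*/nr_hugepages; do
    echo 0 >"$hp"
  done
}

get_meminfo_sketch HugePages_Total   # printed 1024 on this runner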
00:03:28.933 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:28.933 11:26:58 -- setup/driver.sh@68 -- # setup reset 00:03:28.933 11:26:58 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:28.933 11:26:58 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:34.201 11:27:02 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:34.201 11:27:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:34.201 11:27:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:34.201 11:27:02 -- common/autotest_common.sh@10 -- # set +x 00:03:34.201 ************************************ 00:03:34.201 START TEST guess_driver 00:03:34.201 ************************************ 00:03:34.201 11:27:02 -- common/autotest_common.sh@1104 -- # guess_driver 00:03:34.201 11:27:02 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:34.201 11:27:02 -- setup/driver.sh@47 -- # local fail=0 00:03:34.201 11:27:02 -- setup/driver.sh@49 -- # pick_driver 00:03:34.201 11:27:02 -- setup/driver.sh@36 -- # vfio 00:03:34.201 11:27:02 -- setup/driver.sh@21 -- # local iommu_grups 00:03:34.201 11:27:02 -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:34.201 11:27:02 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:34.201 11:27:02 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:34.201 11:27:02 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:34.201 11:27:02 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:03:34.201 11:27:02 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:34.201 11:27:02 -- setup/driver.sh@14 -- # mod vfio_pci 00:03:34.201 11:27:02 -- setup/driver.sh@12 -- # dep vfio_pci 00:03:34.201 11:27:02 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:34.201 11:27:02 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:34.201 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:34.201 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:34.201 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:34.201 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:34.201 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:34.201 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:34.201 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:34.201 11:27:02 -- setup/driver.sh@30 -- # return 0 00:03:34.201 11:27:02 -- setup/driver.sh@37 -- # echo vfio-pci 00:03:34.201 11:27:02 -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:34.201 11:27:02 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:34.201 11:27:02 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:34.201 Looking for driver=vfio-pci 00:03:34.201 11:27:02 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:34.201 11:27:02 -- setup/driver.sh@45 -- # setup output config 00:03:34.201 11:27:02 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.201 11:27:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:36.733 11:27:05 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:36.733 11:27:05 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:03:36.733 11:27:05 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver [xtrace condensed: the same marker check ("[[ -> == \-\> ]]", "[[ vfio-pci == vfio-pci ]]", read) repeats for each remaining config output line] 00:03:38.111 11:27:07 -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:38.111 11:27:07 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:38.111 11:27:07 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:38.111 11:27:07 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:03:38.111 11:27:07 -- setup/driver.sh@65 -- # setup reset 00:03:38.111 11:27:07 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:38.111 11:27:07 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:43.384 00:03:43.384 real 0m9.451s 00:03:43.384 user 0m2.252s 00:03:43.384 sys 0m4.870s 00:03:43.384 11:27:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:43.384 11:27:12 -- common/autotest_common.sh@10 -- # set +x 00:03:43.384 ************************************ 00:03:43.384 END TEST guess_driver 00:03:43.384 ************************************ 00:03:43.384 00:03:43.384 real 0m14.249s 00:03:43.384 user 0m3.527s 00:03:43.384 sys 0m7.631s 00:03:43.384 11:27:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:43.384 11:27:12 -- common/autotest_common.sh@10 -- # set +x 00:03:43.384 ************************************ 00:03:43.384 END TEST driver 00:03:43.384 ************************************ 00:03:43.384 11:27:12 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:43.384 11:27:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:43.384 11:27:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:43.384 11:27:12 -- common/autotest_common.sh@10 -- # set +x 00:03:43.384 ************************************ 00:03:43.384 START TEST devices 00:03:43.384 ************************************ 00:03:43.384 11:27:12 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:43.384 * Looking for test storage... 
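The guess_driver run that just finished reduces to one decision: if the kernel exposes at least one IOMMU group and vfio_pci's module dependencies resolve, pick vfio-pci. A hedged sketch of that probe — the uio_pci_generic fallback is an assumption, only the vfio-pci branch is exercised in this log:

pick_driver_sketch() {
  local groups=(/sys/kernel/iommu_groups/*)
  if (( ${#groups[@]} > 0 )) && modprobe --show-depends vfio_pci &>/dev/null; then
    echo vfio-pci            # this runner had 176 IOMMU groups
  elif modprobe --show-depends uio_pci_generic &>/dev/null; then
    echo uio_pci_generic     # assumed fallback, not taken in this run
  else
    echo 'No valid driver found'
  fi
}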
00:03:43.384 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:43.384 11:27:12 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:03:43.384 11:27:12 -- setup/devices.sh@192 -- # setup reset 00:03:43.384 11:27:12 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:43.385 11:27:12 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:47.679 11:27:16 -- setup/devices.sh@194 -- # get_zoned_devs 00:03:47.679 11:27:16 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:47.679 11:27:16 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:47.679 11:27:16 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:47.679 11:27:16 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:47.679 11:27:16 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:47.679 11:27:16 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:47.679 11:27:16 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:47.679 11:27:16 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:47.679 11:27:16 -- setup/devices.sh@196 -- # blocks=() 00:03:47.679 11:27:16 -- setup/devices.sh@196 -- # declare -a blocks 00:03:47.679 11:27:16 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:03:47.679 11:27:16 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:03:47.679 11:27:16 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:03:47.679 11:27:16 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:03:47.679 11:27:16 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:03:47.679 11:27:16 -- setup/devices.sh@201 -- # ctrl=nvme0 00:03:47.679 11:27:16 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:03:47.679 11:27:16 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:47.679 11:27:16 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:03:47.679 11:27:16 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:03:47.679 11:27:16 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:03:47.679 No valid GPT data, bailing 00:03:47.679 11:27:16 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:47.679 11:27:16 -- scripts/common.sh@393 -- # pt= 00:03:47.679 11:27:16 -- scripts/common.sh@394 -- # return 1 00:03:47.679 11:27:16 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:03:47.679 11:27:16 -- setup/common.sh@76 -- # local dev=nvme0n1 00:03:47.679 11:27:16 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:03:47.679 11:27:16 -- setup/common.sh@80 -- # echo 1600321314816 00:03:47.679 11:27:16 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:03:47.679 11:27:16 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:03:47.679 11:27:16 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:03:47.679 11:27:16 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:03:47.679 11:27:16 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:03:47.679 11:27:16 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:03:47.679 11:27:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:47.679 11:27:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:47.679 11:27:16 -- common/autotest_common.sh@10 -- # set +x 00:03:47.679 ************************************ 00:03:47.679 START TEST nvme_mount 00:03:47.679 ************************************ 00:03:47.679 
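The disk selection logged above combines three checks before nvme0n1 is accepted as test_disk: the device must not be zoned (queue/zoned reads "none"), must carry no partition table (blkid reports no PTTYPE, which is why spdk-gpt.py bails), and must be at least min_disk_size = 3221225472 bytes (3 GiB). A standalone sketch under those assumptions (helper name illustrative):

is_candidate_disk() {
  local dev=$1 sectors
  [[ $(cat "/sys/block/$dev/queue/zoned" 2>/dev/null) == none ]] || return 1
  [[ -z $(blkid -s PTTYPE -o value "/dev/$dev") ]] || return 1
  sectors=$(cat "/sys/block/$dev/size")    # 512-byte units
  (( sectors * 512 >= 3221225472 ))        # nvme0n1: 1600321314816 bytes, passes
}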
11:27:16 -- common/autotest_common.sh@1104 -- # nvme_mount 00:03:47.679 11:27:16 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:03:47.679 11:27:16 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:03:47.679 11:27:16 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:47.679 11:27:16 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:47.679 11:27:16 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:03:47.679 11:27:16 -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:47.680 11:27:16 -- setup/common.sh@40 -- # local part_no=1 00:03:47.680 11:27:16 -- setup/common.sh@41 -- # local size=1073741824 00:03:47.680 11:27:16 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:47.680 11:27:16 -- setup/common.sh@44 -- # parts=() 00:03:47.680 11:27:16 -- setup/common.sh@44 -- # local parts 00:03:47.680 11:27:16 -- setup/common.sh@46 -- # (( part = 1 )) 00:03:47.680 11:27:16 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:47.680 11:27:16 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:47.680 11:27:16 -- setup/common.sh@46 -- # (( part++ )) 00:03:47.680 11:27:16 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:47.680 11:27:16 -- setup/common.sh@51 -- # (( size /= 512 )) 00:03:47.680 11:27:16 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:03:47.680 11:27:16 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:03:48.249 Creating new GPT entries in memory. 00:03:48.249 GPT data structures destroyed! You may now partition the disk using fdisk or 00:03:48.249 other utilities. 00:03:48.249 11:27:17 -- setup/common.sh@57 -- # (( part = 1 )) 00:03:48.249 11:27:17 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:48.249 11:27:17 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:03:48.249 11:27:17 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:48.249 11:27:17 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:03:49.187 Creating new GPT entries in memory. 00:03:49.187 The operation has completed successfully. 
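The sector bounds in the sgdisk call above follow from the 1 GiB test size: 1073741824 / 512 = 2097152 sectors, so a partition starting at sector 2048 ends at 2048 + 2097152 - 1 = 2099199. The same steps re-created standalone (destructive; /dev/nvme0n1 is this runner's disk, substitute your own):

disk=/dev/nvme0n1
size=1073741824                                # 1 GiB
(( sectors = size / 512 ))                     # 2097152
(( start = 2048, end = start + sectors - 1 ))  # 2099199
sgdisk "$disk" --zap-all                       # wipe any existing GPT/MBR
flock "$disk" sgdisk "$disk" --new=1:"$start":"$end"   # serialized exactly as logged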
00:03:49.187 11:27:18 -- setup/common.sh@57 -- # (( part++ )) 00:03:49.187 11:27:18 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:49.187 11:27:18 -- setup/common.sh@62 -- # wait 2020253 00:03:49.187 11:27:18 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:49.187 11:27:18 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:03:49.187 11:27:18 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:49.187 11:27:18 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:03:49.187 11:27:18 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:03:49.187 11:27:18 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:49.446 11:27:18 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:49.446 11:27:18 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:49.446 11:27:18 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:03:49.446 11:27:18 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:49.446 11:27:18 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:49.446 11:27:18 -- setup/devices.sh@53 -- # local found=0 00:03:49.446 11:27:18 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:49.446 11:27:18 -- setup/devices.sh@56 -- # : 00:03:49.446 11:27:18 -- setup/devices.sh@59 -- # local pci status 00:03:49.446 11:27:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.446 11:27:18 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:49.446 11:27:18 -- setup/devices.sh@47 -- # setup output config 00:03:49.446 11:27:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:49.446 11:27:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:03:52.730 11:27:21 -- setup/devices.sh@63 -- # found=1 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:21 -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:52.730 11:27:21 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:03:52.730 11:27:21 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:52.730 11:27:21 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:52.730 11:27:21 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:52.730 11:27:21 -- setup/devices.sh@110 -- # cleanup_nvme 00:03:52.730 11:27:21 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:52.730 11:27:21 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:52.730 11:27:21 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:03:52.730 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:52.730 11:27:21 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:52.730 11:27:21 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:52.730 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:03:52.730 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:03:52.730 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:03:52.730 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:03:52.730 11:27:22 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:03:52.730 11:27:22 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:03:52.730 11:27:22 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:52.730 11:27:22 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:03:52.730 11:27:22 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:03:52.730 11:27:22 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:52.730 11:27:22 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:52.730 11:27:22 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:52.730 11:27:22 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:03:52.730 11:27:22 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:52.730 11:27:22 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:52.730 11:27:22 -- setup/devices.sh@53 -- # local found=0 00:03:52.730 11:27:22 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:52.730 11:27:22 -- setup/devices.sh@56 -- # : 00:03:52.730 11:27:22 -- setup/devices.sh@59 -- # local pci status 00:03:52.730 11:27:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:52.730 11:27:22 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:52.730 11:27:22 -- setup/devices.sh@47 -- # setup output config 00:03:52.730 11:27:22 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:52.730 11:27:22 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:03:56.016 11:27:25 -- setup/devices.sh@63 -- # found=1 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:56.016 11:27:25 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:03:56.016 11:27:25 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:56.016 11:27:25 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:56.016 11:27:25 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:56.016 11:27:25 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:56.016 11:27:25 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:03:56.016 11:27:25 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:56.016 11:27:25 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:03:56.016 11:27:25 -- setup/devices.sh@50 -- # local mount_point= 00:03:56.016 11:27:25 -- setup/devices.sh@51 -- # local test_file= 00:03:56.016 11:27:25 -- setup/devices.sh@53 -- # local found=0 00:03:56.016 11:27:25 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:03:56.016 11:27:25 -- setup/devices.sh@59 -- # local pci status 00:03:56.016 11:27:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.016 11:27:25 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:56.016 11:27:25 -- setup/devices.sh@47 -- # setup output config 00:03:56.016 11:27:25 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:56.016 11:27:25 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:03:59.302 11:27:28 -- setup/devices.sh@63 -- # found=1 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.302 11:27:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:59.302 11:27:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.560 11:27:28 -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:59.560 11:27:28 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:03:59.560 11:27:28 -- setup/devices.sh@68 -- # return 0 00:03:59.560 11:27:28 -- setup/devices.sh@128 -- # cleanup_nvme 00:03:59.560 11:27:28 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:59.560 11:27:28 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:59.560 11:27:28 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:59.560 11:27:28 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:59.560 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:59.560 00:03:59.560 real 0m12.262s 00:03:59.560 user 0m3.448s 00:03:59.560 sys 0m6.725s 00:03:59.560 11:27:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:59.560 11:27:28 -- common/autotest_common.sh@10 -- # set +x 00:03:59.560 ************************************ 00:03:59.560 END TEST nvme_mount 00:03:59.560 ************************************ 00:03:59.560 11:27:28 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:03:59.560 11:27:28 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:59.560 11:27:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:59.561 11:27:28 -- common/autotest_common.sh@10 -- # set +x 00:03:59.561 ************************************ 00:03:59.561 START TEST dm_mount 00:03:59.561 ************************************ 00:03:59.561 11:27:28 -- common/autotest_common.sh@1104 -- # dm_mount 00:03:59.561 11:27:28 -- setup/devices.sh@144 -- # pv=nvme0n1 00:03:59.561 11:27:28 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:03:59.561 11:27:28 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:03:59.561 11:27:28 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:03:59.561 11:27:28 -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:59.561 11:27:28 -- setup/common.sh@40 -- # local part_no=2 00:03:59.561 11:27:28 -- setup/common.sh@41 -- # local size=1073741824 00:03:59.561 11:27:28 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:59.561 11:27:28 -- setup/common.sh@44 -- # parts=() 00:03:59.561 11:27:28 -- setup/common.sh@44 -- # local parts 00:03:59.561 11:27:28 -- setup/common.sh@46 -- # (( part = 1 )) 00:03:59.561 11:27:28 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:59.561 11:27:28 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:59.561 11:27:28 -- setup/common.sh@46 -- # (( part++ )) 00:03:59.561 11:27:28 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:59.561 11:27:28 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:59.561 11:27:28 -- setup/common.sh@46 -- # (( part++ )) 00:03:59.561 11:27:28 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:59.561 11:27:28 -- setup/common.sh@51 -- # (( size /= 512 )) 00:03:59.561 11:27:28 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:03:59.561 11:27:28 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:00.511 Creating new GPT entries in memory. 00:04:00.511 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:00.511 other utilities. 00:04:00.511 11:27:29 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:00.511 11:27:29 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:00.511 11:27:29 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:00.511 11:27:29 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:00.511 11:27:29 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:01.446 Creating new GPT entries in memory. 00:04:01.446 The operation has completed successfully. 
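For reference, the partition_drive step traced above reduces to a few sgdisk calls; the device path and sector ranges below are taken directly from this trace (1073741824 bytes / 512-byte sectors = 2097152 sectors per partition, so partition 1 spans sectors 2048-2099199 and partition 2, created next, spans 2099200-4196351):

disk=/dev/nvme0n1                                     # per this trace
sgdisk "$disk" --zap-all                              # wipe GPT/MBR metadata
flock "$disk" sgdisk "$disk" --new=1:2048:2099199     # 1 GiB partition 1
flock "$disk" sgdisk "$disk" --new=2:2099200:4196351  # 1 GiB partition 2
# the harness then waits for the partition uevents via sync_dev_uevents.sh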
00:04:01.446 11:27:30 -- setup/common.sh@57 -- # (( part++ )) 00:04:01.446 11:27:30 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:01.446 11:27:30 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:01.446 11:27:30 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:01.446 11:27:30 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:02.820 The operation has completed successfully. 00:04:02.820 11:27:31 -- setup/common.sh@57 -- # (( part++ )) 00:04:02.820 11:27:31 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:02.820 11:27:31 -- setup/common.sh@62 -- # wait 2024752 00:04:02.820 11:27:31 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:02.820 11:27:31 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:02.820 11:27:31 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:02.820 11:27:31 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:02.820 11:27:31 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:02.820 11:27:31 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:02.820 11:27:31 -- setup/devices.sh@161 -- # break 00:04:02.820 11:27:31 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:02.820 11:27:31 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:02.820 11:27:31 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:02.820 11:27:31 -- setup/devices.sh@166 -- # dm=dm-0 00:04:02.820 11:27:31 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:02.820 11:27:31 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:02.820 11:27:31 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:02.820 11:27:31 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:02.820 11:27:31 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:02.820 11:27:31 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:02.820 11:27:31 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:02.820 11:27:31 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:02.821 11:27:31 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:02.821 11:27:31 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:02.821 11:27:32 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:02.821 11:27:32 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:02.821 11:27:32 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:02.821 11:27:32 -- setup/devices.sh@53 -- # local found=0 00:04:02.821 11:27:32 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:02.821 11:27:32 -- setup/devices.sh@56 -- # : 
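The trace then layers a device-mapper node named nvme_dm_test over the two partitions before formatting and mounting it. The exact dm table is not captured in this trace, so the table below is only an illustrative linear concatenation; the node name, the /dev/dm-0 resolution, and the holders check match the log:

p1=$(blockdev --getsz /dev/nvme0n1p1)   # partition sizes in 512-byte sectors
p2=$(blockdev --getsz /dev/nvme0n1p2)
dmsetup create nvme_dm_test <<EOF
0 $p1 linear /dev/nvme0n1p1 0
$p1 $p2 linear /dev/nvme0n1p2 0
EOF
readlink -f /dev/mapper/nvme_dm_test    # resolves to /dev/dm-0 here
ls /sys/class/block/nvme0n1p1/holders   # contains dm-0 once the map exists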
00:04:02.821 11:27:32 -- setup/devices.sh@59 -- # local pci status 00:04:02.821 11:27:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:02.821 11:27:32 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:02.821 11:27:32 -- setup/devices.sh@47 -- # setup output config 00:04:02.821 11:27:32 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:02.821 11:27:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:06.111 11:27:34 -- setup/devices.sh@63 -- # found=1 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:34 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.111 11:27:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:35 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:06.111 11:27:35 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:06.111 11:27:35 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:06.111 11:27:35 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:06.111 11:27:35 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:06.111 11:27:35 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:06.111 11:27:35 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:06.111 11:27:35 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:06.111 11:27:35 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:06.111 11:27:35 -- setup/devices.sh@50 -- # local mount_point= 00:04:06.111 11:27:35 -- setup/devices.sh@51 -- # local test_file= 00:04:06.111 11:27:35 -- setup/devices.sh@53 -- # local found=0 00:04:06.111 11:27:35 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:06.111 11:27:35 -- setup/devices.sh@59 -- # local pci status 00:04:06.111 11:27:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.111 11:27:35 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:06.111 11:27:35 -- setup/devices.sh@47 -- # setup output config 00:04:06.111 11:27:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:06.111 11:27:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:09.396 11:27:38 -- setup/devices.sh@63 -- # found=1 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:09.396 11:27:38 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:09.396 11:27:38 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:09.396 11:27:38 -- setup/devices.sh@68 -- # return 0 00:04:09.396 11:27:38 -- setup/devices.sh@187 -- # cleanup_dm 00:04:09.396 11:27:38 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:09.396 11:27:38 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:09.396 11:27:38 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:09.396 11:27:38 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:09.396 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:09.396 11:27:38 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:09.396 00:04:09.396 real 0m9.649s 00:04:09.396 user 0m2.244s 00:04:09.396 sys 0m4.467s 00:04:09.396 11:27:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:09.396 11:27:38 -- common/autotest_common.sh@10 -- # set +x 00:04:09.396 ************************************ 00:04:09.396 END TEST dm_mount 00:04:09.396 ************************************ 00:04:09.396 11:27:38 -- setup/devices.sh@1 -- # cleanup 00:04:09.396 11:27:38 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:09.396 11:27:38 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:09.396 11:27:38 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:09.396 11:27:38 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:09.396 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:09.396 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:09.396 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:09.396 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:09.396 11:27:38 -- setup/devices.sh@12 -- # cleanup_dm 00:04:09.396 11:27:38 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:09.396 11:27:38 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:09.396 11:27:38 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:09.396 11:27:38 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:09.396 00:04:09.396 real 0m26.309s 00:04:09.396 user 0m7.096s 00:04:09.396 sys 0m14.104s 00:04:09.396 11:27:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:09.396 11:27:38 -- common/autotest_common.sh@10 -- # set +x 00:04:09.396 ************************************ 00:04:09.396 END TEST devices 00:04:09.396 ************************************ 00:04:09.655 00:04:09.655 real 1m30.757s 00:04:09.655 user 0m26.870s 00:04:09.655 sys 0m52.787s 00:04:09.655 11:27:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:09.655 11:27:38 -- common/autotest_common.sh@10 -- # set +x 00:04:09.655 ************************************ 00:04:09.655 END TEST setup.sh 00:04:09.655 ************************************ 00:04:09.655 11:27:38 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:12.938 Hugepages 00:04:12.938 node hugesize free / total 00:04:12.938 node0 1048576kB 0 / 0 00:04:12.938 node0 2048kB 2048 / 2048 00:04:12.938 node1 1048576kB 0 / 0 00:04:12.938 node1 2048kB 0 / 0 00:04:12.938 00:04:12.938 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:12.938 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:12.938 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:12.938 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:12.938 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:12.938 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:12.938 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:12.938 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:12.938 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:12.938 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:12.938 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:12.938 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:12.938 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:12.938 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:12.938 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:12.938 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:12.938 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:12.938 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:12.938 11:27:42 -- spdk/autotest.sh@141 -- # uname -s 00:04:12.938 11:27:42 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:04:12.938 11:27:42 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:04:12.938 11:27:42 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:16.229 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:16.229 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:16.229 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:16.229 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:04:16.230 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:16.230 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:16.487 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:16.487 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:17.866 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:17.866 11:27:47 -- common/autotest_common.sh@1517 -- # sleep 1 00:04:19.241 11:27:48 -- common/autotest_common.sh@1518 -- # bdfs=() 00:04:19.241 11:27:48 -- common/autotest_common.sh@1518 -- # local bdfs 00:04:19.241 11:27:48 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:04:19.241 11:27:48 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:04:19.241 11:27:48 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:19.241 11:27:48 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:19.241 11:27:48 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:19.241 11:27:48 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:19.241 11:27:48 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:19.241 11:27:48 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:04:19.241 11:27:48 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:04:19.241 11:27:48 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:22.531 Waiting for block devices as requested 00:04:22.531 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:22.531 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:22.531 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:22.531 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:22.531 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:22.531 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:22.790 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:22.790 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:22.790 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:23.049 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:23.049 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:23.049 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:23.308 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:23.308 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:23.308 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:23.567 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:23.567 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:04:23.826 11:27:52 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:04:23.826 11:27:52 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:04:23.826 11:27:52 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:04:23.826 11:27:52 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:04:23.826 11:27:53 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:23.826 11:27:53 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:04:23.826 11:27:53 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:23.826 11:27:53 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:04:23.826 11:27:53 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:04:23.826 11:27:53 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:04:23.826 11:27:53 -- common/autotest_common.sh@1530 -- # grep oacs 00:04:23.826 11:27:53 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:04:23.826 11:27:53 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:04:23.826 11:27:53 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:04:23.826 11:27:53 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:04:23.826 11:27:53 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:04:23.826 11:27:53 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:04:23.826 11:27:53 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:04:23.826 11:27:53 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:04:23.826 11:27:53 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:04:23.826 11:27:53 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:04:23.826 11:27:53 -- common/autotest_common.sh@1542 -- # continue 00:04:23.826 11:27:53 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:04:23.826 11:27:53 -- common/autotest_common.sh@718 -- # xtrace_disable 00:04:23.826 11:27:53 -- common/autotest_common.sh@10 -- # set +x 00:04:23.826 11:27:53 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:04:23.826 11:27:53 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:23.826 11:27:53 -- common/autotest_common.sh@10 -- # set +x 00:04:23.826 11:27:53 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:27.206 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:27.206 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:28.615 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:28.615 11:27:57 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:04:28.615 11:27:57 -- common/autotest_common.sh@718 -- # xtrace_disable 00:04:28.615 11:27:57 -- common/autotest_common.sh@10 -- # set +x 00:04:28.615 11:27:57 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:04:28.615 11:27:57 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:04:28.615 11:27:57 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:04:28.615 11:27:57 -- common/autotest_common.sh@1562 -- # bdfs=() 00:04:28.615 11:27:57 -- common/autotest_common.sh@1562 -- # local bdfs 00:04:28.615 11:27:57 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:04:28.615 11:27:57 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:28.615 11:27:57 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:28.615 11:27:57 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:28.615 11:27:57 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:28.615 11:27:57 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:28.874 11:27:58 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:04:28.874 11:27:58 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:04:28.874 11:27:58 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:04:28.874 11:27:58 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:04:28.874 11:27:58 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:04:28.874 11:27:58 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:28.874 11:27:58 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:04:28.874 11:27:58 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:04:28.874 11:27:58 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:04:28.874 11:27:58 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=2034446 00:04:28.874 11:27:58 -- common/autotest_common.sh@1583 -- # waitforlisten 2034446 00:04:28.874 11:27:58 -- common/autotest_common.sh@819 -- # '[' -z 2034446 ']' 00:04:28.874 11:27:58 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:28.874 11:27:58 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:28.874 11:27:58 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:28.874 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:28.874 11:27:58 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:28.874 11:27:58 -- common/autotest_common.sh@10 -- # set +x 00:04:28.874 11:27:58 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:28.874 [2024-07-21 11:27:58.110963] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
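For reference, the NVMe bdf discovery traced a few records above reduces to the following; the paths and the 0x0a54 device-id filter are taken from this trace:

rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
printf '%s\n' "${bdfs[@]}"              # -> 0000:d8:00.0 on this host
for bdf in "${bdfs[@]}"; do             # keep only device id 0x0a54, as opal_revert_cleanup does
  [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && echo "$bdf"
done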
00:04:28.874 [2024-07-21 11:27:58.111059] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2034446 ] 00:04:28.874 EAL: No free 2048 kB hugepages reported on node 1 00:04:28.874 [2024-07-21 11:27:58.181766] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:28.874 [2024-07-21 11:27:58.222322] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:28.874 [2024-07-21 11:27:58.222439] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:29.810 11:27:58 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:29.810 11:27:58 -- common/autotest_common.sh@852 -- # return 0 00:04:29.810 11:27:58 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:04:29.810 11:27:58 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:04:29.810 11:27:58 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:04:33.094 nvme0n1 00:04:33.094 11:28:01 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:04:33.094 [2024-07-21 11:28:02.019875] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:04:33.094 request: 00:04:33.094 { 00:04:33.094 "nvme_ctrlr_name": "nvme0", 00:04:33.094 "password": "test", 00:04:33.094 "method": "bdev_nvme_opal_revert", 00:04:33.094 "req_id": 1 00:04:33.094 } 00:04:33.094 Got JSON-RPC error response 00:04:33.094 response: 00:04:33.094 { 00:04:33.094 "code": -32602, 00:04:33.094 "message": "Invalid parameters" 00:04:33.094 } 00:04:33.094 11:28:02 -- common/autotest_common.sh@1589 -- # true 00:04:33.094 11:28:02 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:04:33.094 11:28:02 -- common/autotest_common.sh@1593 -- # killprocess 2034446 00:04:33.094 11:28:02 -- common/autotest_common.sh@926 -- # '[' -z 2034446 ']' 00:04:33.094 11:28:02 -- common/autotest_common.sh@930 -- # kill -0 2034446 00:04:33.094 11:28:02 -- common/autotest_common.sh@931 -- # uname 00:04:33.094 11:28:02 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:04:33.094 11:28:02 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2034446 00:04:33.094 11:28:02 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:04:33.094 11:28:02 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:04:33.094 11:28:02 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2034446' 00:04:33.094 killing process with pid 2034446 00:04:33.094 11:28:02 -- common/autotest_common.sh@945 -- # kill 2034446 00:04:33.094 11:28:02 -- common/autotest_common.sh@950 -- # wait 2034446 00:04:34.998 11:28:04 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:04:34.998 11:28:04 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:04:34.998 11:28:04 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:04:34.998 11:28:04 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:04:34.998 11:28:04 -- spdk/autotest.sh@173 -- # timing_enter lib 00:04:34.998 11:28:04 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:34.998 11:28:04 -- common/autotest_common.sh@10 -- # set +x 00:04:34.998 11:28:04 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:34.998 
11:28:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:34.998 11:28:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:34.998 11:28:04 -- common/autotest_common.sh@10 -- # set +x 00:04:34.998 ************************************ 00:04:34.998 START TEST env 00:04:34.998 ************************************ 00:04:34.998 11:28:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:34.998 * Looking for test storage... 00:04:34.998 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:04:34.998 11:28:04 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:34.998 11:28:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:34.998 11:28:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:34.998 11:28:04 -- common/autotest_common.sh@10 -- # set +x 00:04:34.998 ************************************ 00:04:34.998 START TEST env_memory 00:04:34.998 ************************************ 00:04:34.998 11:28:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:34.998 00:04:34.998 00:04:34.998 CUnit - A unit testing framework for C - Version 2.1-3 00:04:34.998 http://cunit.sourceforge.net/ 00:04:34.998 00:04:34.998 00:04:34.998 Suite: memory 00:04:34.998 Test: alloc and free memory map ...[2024-07-21 11:28:04.385312] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:04:34.998 passed 00:04:34.998 Test: mem map translation ...[2024-07-21 11:28:04.399096] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:04:34.998 [2024-07-21 11:28:04.399113] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:04:34.998 [2024-07-21 11:28:04.399147] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:04:34.998 [2024-07-21 11:28:04.399156] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:04:34.998 passed 00:04:34.998 Test: mem map registration ...[2024-07-21 11:28:04.421698] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:04:34.998 [2024-07-21 11:28:04.421716] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:04:35.258 passed 00:04:35.258 Test: mem map adjacent registrations ...passed 00:04:35.258 00:04:35.258 Run Summary: Type Total Ran Passed Failed Inactive 00:04:35.258 suites 1 1 n/a 0 0 00:04:35.258 tests 4 4 4 0 0 00:04:35.258 asserts 152 152 152 0 n/a 00:04:35.258 00:04:35.258 Elapsed time = 0.093 seconds 00:04:35.258 00:04:35.258 real 0m0.106s 00:04:35.258 user 0m0.093s 00:04:35.258 sys 0m0.013s 00:04:35.258 11:28:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:35.258 11:28:04 -- 
common/autotest_common.sh@10 -- # set +x 00:04:35.258 ************************************ 00:04:35.258 END TEST env_memory 00:04:35.258 ************************************ 00:04:35.258 11:28:04 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:35.258 11:28:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:35.258 11:28:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:35.258 11:28:04 -- common/autotest_common.sh@10 -- # set +x 00:04:35.258 ************************************ 00:04:35.258 START TEST env_vtophys 00:04:35.258 ************************************ 00:04:35.258 11:28:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:35.258 EAL: lib.eal log level changed from notice to debug 00:04:35.258 EAL: Detected lcore 0 as core 0 on socket 0 00:04:35.258 EAL: Detected lcore 1 as core 1 on socket 0 00:04:35.258 EAL: Detected lcore 2 as core 2 on socket 0 00:04:35.258 EAL: Detected lcore 3 as core 3 on socket 0 00:04:35.258 EAL: Detected lcore 4 as core 4 on socket 0 00:04:35.258 EAL: Detected lcore 5 as core 5 on socket 0 00:04:35.258 EAL: Detected lcore 6 as core 6 on socket 0 00:04:35.258 EAL: Detected lcore 7 as core 8 on socket 0 00:04:35.258 EAL: Detected lcore 8 as core 9 on socket 0 00:04:35.258 EAL: Detected lcore 9 as core 10 on socket 0 00:04:35.258 EAL: Detected lcore 10 as core 11 on socket 0 00:04:35.258 EAL: Detected lcore 11 as core 12 on socket 0 00:04:35.258 EAL: Detected lcore 12 as core 13 on socket 0 00:04:35.258 EAL: Detected lcore 13 as core 14 on socket 0 00:04:35.258 EAL: Detected lcore 14 as core 16 on socket 0 00:04:35.258 EAL: Detected lcore 15 as core 17 on socket 0 00:04:35.258 EAL: Detected lcore 16 as core 18 on socket 0 00:04:35.258 EAL: Detected lcore 17 as core 19 on socket 0 00:04:35.258 EAL: Detected lcore 18 as core 20 on socket 0 00:04:35.258 EAL: Detected lcore 19 as core 21 on socket 0 00:04:35.258 EAL: Detected lcore 20 as core 22 on socket 0 00:04:35.258 EAL: Detected lcore 21 as core 24 on socket 0 00:04:35.258 EAL: Detected lcore 22 as core 25 on socket 0 00:04:35.258 EAL: Detected lcore 23 as core 26 on socket 0 00:04:35.258 EAL: Detected lcore 24 as core 27 on socket 0 00:04:35.258 EAL: Detected lcore 25 as core 28 on socket 0 00:04:35.258 EAL: Detected lcore 26 as core 29 on socket 0 00:04:35.258 EAL: Detected lcore 27 as core 30 on socket 0 00:04:35.258 EAL: Detected lcore 28 as core 0 on socket 1 00:04:35.258 EAL: Detected lcore 29 as core 1 on socket 1 00:04:35.258 EAL: Detected lcore 30 as core 2 on socket 1 00:04:35.258 EAL: Detected lcore 31 as core 3 on socket 1 00:04:35.258 EAL: Detected lcore 32 as core 4 on socket 1 00:04:35.258 EAL: Detected lcore 33 as core 5 on socket 1 00:04:35.258 EAL: Detected lcore 34 as core 6 on socket 1 00:04:35.258 EAL: Detected lcore 35 as core 8 on socket 1 00:04:35.258 EAL: Detected lcore 36 as core 9 on socket 1 00:04:35.258 EAL: Detected lcore 37 as core 10 on socket 1 00:04:35.258 EAL: Detected lcore 38 as core 11 on socket 1 00:04:35.258 EAL: Detected lcore 39 as core 12 on socket 1 00:04:35.258 EAL: Detected lcore 40 as core 13 on socket 1 00:04:35.258 EAL: Detected lcore 41 as core 14 on socket 1 00:04:35.258 EAL: Detected lcore 42 as core 16 on socket 1 00:04:35.258 EAL: Detected lcore 43 as core 17 on socket 1 00:04:35.258 EAL: Detected lcore 44 as core 18 on socket 1 00:04:35.258 EAL: Detected lcore 45 as core 19 on 
socket 1 00:04:35.258 EAL: Detected lcore 46 as core 20 on socket 1 00:04:35.258 EAL: Detected lcore 47 as core 21 on socket 1 00:04:35.258 EAL: Detected lcore 48 as core 22 on socket 1 00:04:35.259 EAL: Detected lcore 49 as core 24 on socket 1 00:04:35.259 EAL: Detected lcore 50 as core 25 on socket 1 00:04:35.259 EAL: Detected lcore 51 as core 26 on socket 1 00:04:35.259 EAL: Detected lcore 52 as core 27 on socket 1 00:04:35.259 EAL: Detected lcore 53 as core 28 on socket 1 00:04:35.259 EAL: Detected lcore 54 as core 29 on socket 1 00:04:35.259 EAL: Detected lcore 55 as core 30 on socket 1 00:04:35.259 EAL: Detected lcore 56 as core 0 on socket 0 00:04:35.259 EAL: Detected lcore 57 as core 1 on socket 0 00:04:35.259 EAL: Detected lcore 58 as core 2 on socket 0 00:04:35.259 EAL: Detected lcore 59 as core 3 on socket 0 00:04:35.259 EAL: Detected lcore 60 as core 4 on socket 0 00:04:35.259 EAL: Detected lcore 61 as core 5 on socket 0 00:04:35.259 EAL: Detected lcore 62 as core 6 on socket 0 00:04:35.259 EAL: Detected lcore 63 as core 8 on socket 0 00:04:35.259 EAL: Detected lcore 64 as core 9 on socket 0 00:04:35.259 EAL: Detected lcore 65 as core 10 on socket 0 00:04:35.259 EAL: Detected lcore 66 as core 11 on socket 0 00:04:35.259 EAL: Detected lcore 67 as core 12 on socket 0 00:04:35.259 EAL: Detected lcore 68 as core 13 on socket 0 00:04:35.259 EAL: Detected lcore 69 as core 14 on socket 0 00:04:35.259 EAL: Detected lcore 70 as core 16 on socket 0 00:04:35.259 EAL: Detected lcore 71 as core 17 on socket 0 00:04:35.259 EAL: Detected lcore 72 as core 18 on socket 0 00:04:35.259 EAL: Detected lcore 73 as core 19 on socket 0 00:04:35.259 EAL: Detected lcore 74 as core 20 on socket 0 00:04:35.259 EAL: Detected lcore 75 as core 21 on socket 0 00:04:35.259 EAL: Detected lcore 76 as core 22 on socket 0 00:04:35.259 EAL: Detected lcore 77 as core 24 on socket 0 00:04:35.259 EAL: Detected lcore 78 as core 25 on socket 0 00:04:35.259 EAL: Detected lcore 79 as core 26 on socket 0 00:04:35.259 EAL: Detected lcore 80 as core 27 on socket 0 00:04:35.259 EAL: Detected lcore 81 as core 28 on socket 0 00:04:35.259 EAL: Detected lcore 82 as core 29 on socket 0 00:04:35.259 EAL: Detected lcore 83 as core 30 on socket 0 00:04:35.259 EAL: Detected lcore 84 as core 0 on socket 1 00:04:35.259 EAL: Detected lcore 85 as core 1 on socket 1 00:04:35.259 EAL: Detected lcore 86 as core 2 on socket 1 00:04:35.259 EAL: Detected lcore 87 as core 3 on socket 1 00:04:35.259 EAL: Detected lcore 88 as core 4 on socket 1 00:04:35.259 EAL: Detected lcore 89 as core 5 on socket 1 00:04:35.259 EAL: Detected lcore 90 as core 6 on socket 1 00:04:35.259 EAL: Detected lcore 91 as core 8 on socket 1 00:04:35.259 EAL: Detected lcore 92 as core 9 on socket 1 00:04:35.259 EAL: Detected lcore 93 as core 10 on socket 1 00:04:35.259 EAL: Detected lcore 94 as core 11 on socket 1 00:04:35.259 EAL: Detected lcore 95 as core 12 on socket 1 00:04:35.259 EAL: Detected lcore 96 as core 13 on socket 1 00:04:35.259 EAL: Detected lcore 97 as core 14 on socket 1 00:04:35.259 EAL: Detected lcore 98 as core 16 on socket 1 00:04:35.259 EAL: Detected lcore 99 as core 17 on socket 1 00:04:35.259 EAL: Detected lcore 100 as core 18 on socket 1 00:04:35.259 EAL: Detected lcore 101 as core 19 on socket 1 00:04:35.259 EAL: Detected lcore 102 as core 20 on socket 1 00:04:35.259 EAL: Detected lcore 103 as core 21 on socket 1 00:04:35.259 EAL: Detected lcore 104 as core 22 on socket 1 00:04:35.259 EAL: Detected lcore 105 as core 24 on socket 1 00:04:35.259 EAL: 
Detected lcore 106 as core 25 on socket 1 00:04:35.259 EAL: Detected lcore 107 as core 26 on socket 1 00:04:35.259 EAL: Detected lcore 108 as core 27 on socket 1 00:04:35.259 EAL: Detected lcore 109 as core 28 on socket 1 00:04:35.259 EAL: Detected lcore 110 as core 29 on socket 1 00:04:35.259 EAL: Detected lcore 111 as core 30 on socket 1 00:04:35.259 EAL: Maximum logical cores by configuration: 128 00:04:35.259 EAL: Detected CPU lcores: 112 00:04:35.259 EAL: Detected NUMA nodes: 2 00:04:35.259 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:04:35.259 EAL: Checking presence of .so 'librte_eal.so.24' 00:04:35.259 EAL: Checking presence of .so 'librte_eal.so' 00:04:35.259 EAL: Detected static linkage of DPDK 00:04:35.259 EAL: No shared files mode enabled, IPC will be disabled 00:04:35.259 EAL: Bus pci wants IOVA as 'DC' 00:04:35.259 EAL: Buses did not request a specific IOVA mode. 00:04:35.259 EAL: IOMMU is available, selecting IOVA as VA mode. 00:04:35.259 EAL: Selected IOVA mode 'VA' 00:04:35.259 EAL: No free 2048 kB hugepages reported on node 1 00:04:35.259 EAL: Probing VFIO support... 00:04:35.259 EAL: IOMMU type 1 (Type 1) is supported 00:04:35.259 EAL: IOMMU type 7 (sPAPR) is not supported 00:04:35.259 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:04:35.259 EAL: VFIO support initialized 00:04:35.259 EAL: Ask a virtual area of 0x2e000 bytes 00:04:35.259 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:04:35.259 EAL: Setting up physically contiguous memory... 00:04:35.259 EAL: Setting maximum number of open files to 524288 00:04:35.259 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:04:35.259 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:04:35.259 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:04:35.259 EAL: Ask a virtual area of 0x61000 bytes 00:04:35.259 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:04:35.259 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:35.259 EAL: Ask a virtual area of 0x400000000 bytes 00:04:35.259 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:04:35.259 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:04:35.259 EAL: Ask a virtual area of 0x61000 bytes 00:04:35.259 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:04:35.259 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:35.259 EAL: Ask a virtual area of 0x400000000 bytes 00:04:35.259 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:04:35.259 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:04:35.259 EAL: Ask a virtual area of 0x61000 bytes 00:04:35.259 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:04:35.259 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:35.259 EAL: Ask a virtual area of 0x400000000 bytes 00:04:35.259 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:04:35.259 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:04:35.259 EAL: Ask a virtual area of 0x61000 bytes 00:04:35.259 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:04:35.259 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:35.259 EAL: Ask a virtual area of 0x400000000 bytes 00:04:35.259 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:04:35.259 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:04:35.259 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:04:35.259 
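A quick sanity check on the reservation sizes above: each memseg list holds n_segs:8192 segments of hugepage_sz:2097152 bytes, which is exactly the 0x400000000-byte virtual area requested per list, plus a small 0x61000-byte metadata area; four such lists are created per NUMA node, and the same pattern repeats below for socket 1:

echo $(( 8192 * 2097152 ))              # 17179869184 bytes per list
printf '0x%x\n' $(( 8192 * 2097152 ))   # 0x400000000, matching the trace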
EAL: Ask a virtual area of 0x61000 bytes 00:04:35.259 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:04:35.259 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:35.259 EAL: Ask a virtual area of 0x400000000 bytes 00:04:35.259 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:04:35.259 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:04:35.259 EAL: Ask a virtual area of 0x61000 bytes 00:04:35.259 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:04:35.259 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:35.259 EAL: Ask a virtual area of 0x400000000 bytes 00:04:35.259 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:04:35.259 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:04:35.259 EAL: Ask a virtual area of 0x61000 bytes 00:04:35.259 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:04:35.259 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:35.259 EAL: Ask a virtual area of 0x400000000 bytes 00:04:35.259 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:04:35.259 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:04:35.259 EAL: Ask a virtual area of 0x61000 bytes 00:04:35.259 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:04:35.259 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:35.259 EAL: Ask a virtual area of 0x400000000 bytes 00:04:35.259 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:04:35.259 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:04:35.259 EAL: Hugepages will be freed exactly as allocated. 00:04:35.259 EAL: No shared files mode enabled, IPC is disabled 00:04:35.259 EAL: No shared files mode enabled, IPC is disabled 00:04:35.259 EAL: TSC frequency is ~2500000 KHz 00:04:35.259 EAL: Main lcore 0 is ready (tid=7f39de8bea00;cpuset=[0]) 00:04:35.259 EAL: Trying to obtain current memory policy. 00:04:35.259 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:35.259 EAL: Restoring previous memory policy: 0 00:04:35.259 EAL: request: mp_malloc_sync 00:04:35.259 EAL: No shared files mode enabled, IPC is disabled 00:04:35.259 EAL: Heap on socket 0 was expanded by 2MB 00:04:35.259 EAL: No shared files mode enabled, IPC is disabled 00:04:35.259 EAL: Mem event callback 'spdk:(nil)' registered 00:04:35.259 00:04:35.259 00:04:35.259 CUnit - A unit testing framework for C - Version 2.1-3 00:04:35.259 http://cunit.sourceforge.net/ 00:04:35.259 00:04:35.259 00:04:35.259 Suite: components_suite 00:04:35.259 Test: vtophys_malloc_test ...passed 00:04:35.259 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:04:35.259 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:35.259 EAL: Restoring previous memory policy: 4 00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)' 00:04:35.259 EAL: request: mp_malloc_sync 00:04:35.259 EAL: No shared files mode enabled, IPC is disabled 00:04:35.259 EAL: Heap on socket 0 was expanded by 4MB 00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)' 00:04:35.259 EAL: request: mp_malloc_sync 00:04:35.259 EAL: No shared files mode enabled, IPC is disabled 00:04:35.259 EAL: Heap on socket 0 was shrunk by 4MB 00:04:35.259 EAL: Trying to obtain current memory policy. 
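Everything up to this point is standard DPDK EAL bring-up, driven by SPDK's env wrapper before the CUnit suites run. A minimal sketch of the initialization these env test binaries perform, assuming SPDK's public spdk/env.h API (the fields set here are the commonly used ones; this is not the test binary itself):

```c
#include <stdio.h>
#include "spdk/env.h"

/* Minimal SPDK env bring-up, mirroring the EAL log above. */
int main(void)
{
	struct spdk_env_opts opts;

	spdk_env_opts_init(&opts);
	opts.name = "env_example"; /* process name / file prefix */
	opts.core_mask = "0x1";    /* matches the -c 0x1 the tests pass */

	if (spdk_env_init(&opts) < 0) {
		fprintf(stderr, "spdk_env_init() failed\n");
		return 1;
	}
	/* EAL is now up: lcores detected, memseg lists reserved, VFIO probed. */
	return 0;
}
```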
00:04:35.259
00:04:35.259
00:04:35.259 CUnit - A unit testing framework for C - Version 2.1-3
00:04:35.259 http://cunit.sourceforge.net/
00:04:35.259
00:04:35.259
00:04:35.259 Suite: components_suite
00:04:35.259 Test: vtophys_malloc_test ...passed
00:04:35.259 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:04:35.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.259 EAL: Restoring previous memory policy: 4
00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.259 EAL: request: mp_malloc_sync
00:04:35.259 EAL: No shared files mode enabled, IPC is disabled
00:04:35.259 EAL: Heap on socket 0 was expanded by 4MB
00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.259 EAL: request: mp_malloc_sync
00:04:35.259 EAL: No shared files mode enabled, IPC is disabled
00:04:35.259 EAL: Heap on socket 0 was shrunk by 4MB
00:04:35.259 EAL: Trying to obtain current memory policy.
00:04:35.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.259 EAL: Restoring previous memory policy: 4
00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.259 EAL: request: mp_malloc_sync
00:04:35.259 EAL: No shared files mode enabled, IPC is disabled
00:04:35.259 EAL: Heap on socket 0 was expanded by 6MB
00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.259 EAL: request: mp_malloc_sync
00:04:35.259 EAL: No shared files mode enabled, IPC is disabled
00:04:35.259 EAL: Heap on socket 0 was shrunk by 6MB
00:04:35.259 EAL: Trying to obtain current memory policy.
00:04:35.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.259 EAL: Restoring previous memory policy: 4
00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.259 EAL: request: mp_malloc_sync
00:04:35.259 EAL: No shared files mode enabled, IPC is disabled
00:04:35.259 EAL: Heap on socket 0 was expanded by 10MB
00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.259 EAL: request: mp_malloc_sync
00:04:35.259 EAL: No shared files mode enabled, IPC is disabled
00:04:35.259 EAL: Heap on socket 0 was shrunk by 10MB
00:04:35.259 EAL: Trying to obtain current memory policy.
00:04:35.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.259 EAL: Restoring previous memory policy: 4
00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.259 EAL: request: mp_malloc_sync
00:04:35.259 EAL: No shared files mode enabled, IPC is disabled
00:04:35.259 EAL: Heap on socket 0 was expanded by 18MB
00:04:35.259 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.259 EAL: request: mp_malloc_sync
00:04:35.259 EAL: No shared files mode enabled, IPC is disabled
00:04:35.260 EAL: Heap on socket 0 was shrunk by 18MB
00:04:35.260 EAL: Trying to obtain current memory policy.
00:04:35.260 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.260 EAL: Restoring previous memory policy: 4
00:04:35.260 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.260 EAL: request: mp_malloc_sync
00:04:35.260 EAL: No shared files mode enabled, IPC is disabled
00:04:35.260 EAL: Heap on socket 0 was expanded by 34MB
00:04:35.260 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.260 EAL: request: mp_malloc_sync
00:04:35.260 EAL: No shared files mode enabled, IPC is disabled
00:04:35.260 EAL: Heap on socket 0 was shrunk by 34MB
00:04:35.260 EAL: Trying to obtain current memory policy.
00:04:35.260 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.260 EAL: Restoring previous memory policy: 4
00:04:35.260 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.260 EAL: request: mp_malloc_sync
00:04:35.260 EAL: No shared files mode enabled, IPC is disabled
00:04:35.260 EAL: Heap on socket 0 was expanded by 66MB
00:04:35.260 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.260 EAL: request: mp_malloc_sync
00:04:35.260 EAL: No shared files mode enabled, IPC is disabled
00:04:35.260 EAL: Heap on socket 0 was shrunk by 66MB
00:04:35.260 EAL: Trying to obtain current memory policy.
00:04:35.260 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.260 EAL: Restoring previous memory policy: 4
00:04:35.260 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.260 EAL: request: mp_malloc_sync
00:04:35.260 EAL: No shared files mode enabled, IPC is disabled
00:04:35.260 EAL: Heap on socket 0 was expanded by 130MB
00:04:35.260 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.519 EAL: request: mp_malloc_sync
00:04:35.519 EAL: No shared files mode enabled, IPC is disabled
00:04:35.519 EAL: Heap on socket 0 was shrunk by 130MB
00:04:35.519 EAL: Trying to obtain current memory policy.
00:04:35.519 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.519 EAL: Restoring previous memory policy: 4
00:04:35.519 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.519 EAL: request: mp_malloc_sync
00:04:35.519 EAL: No shared files mode enabled, IPC is disabled
00:04:35.519 EAL: Heap on socket 0 was expanded by 258MB
00:04:35.519 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.519 EAL: request: mp_malloc_sync
00:04:35.519 EAL: No shared files mode enabled, IPC is disabled
00:04:35.519 EAL: Heap on socket 0 was shrunk by 258MB
00:04:35.519 EAL: Trying to obtain current memory policy.
00:04:35.519 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:35.519 EAL: Restoring previous memory policy: 4
00:04:35.519 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.519 EAL: request: mp_malloc_sync
00:04:35.519 EAL: No shared files mode enabled, IPC is disabled
00:04:35.519 EAL: Heap on socket 0 was expanded by 514MB
00:04:35.777 EAL: Calling mem event callback 'spdk:(nil)'
00:04:35.777 EAL: request: mp_malloc_sync
00:04:35.777 EAL: No shared files mode enabled, IPC is disabled
00:04:35.777 EAL: Heap on socket 0 was shrunk by 514MB
00:04:35.777 EAL: Trying to obtain current memory policy.
00:04:35.777 EAL: Setting policy MPOL_PREFERRED for socket 0
00:04:36.036 EAL: Restoring previous memory policy: 4
00:04:36.036 EAL: Calling mem event callback 'spdk:(nil)'
00:04:36.036 EAL: request: mp_malloc_sync
00:04:36.036 EAL: No shared files mode enabled, IPC is disabled
00:04:36.036 EAL: Heap on socket 0 was expanded by 1026MB
00:04:36.036 EAL: Calling mem event callback 'spdk:(nil)'
00:04:36.295 EAL: request: mp_malloc_sync
00:04:36.295 EAL: No shared files mode enabled, IPC is disabled
00:04:36.295 EAL: Heap on socket 0 was shrunk by 1026MB
00:04:36.295 passed
00:04:36.295
00:04:36.295 Run Summary: Type Total Ran Passed Failed Inactive
00:04:36.295 suites 1 1 n/a 0 0
00:04:36.295 tests 2 2 2 0 0
00:04:36.295 asserts 497 497 497 0 n/a
00:04:36.295
00:04:36.295 Elapsed time = 0.964 seconds
00:04:36.295 EAL: Calling mem event callback 'spdk:(nil)'
00:04:36.295 EAL: request: mp_malloc_sync
00:04:36.295 EAL: No shared files mode enabled, IPC is disabled
00:04:36.295 EAL: Heap on socket 0 was shrunk by 2MB
00:04:36.295 EAL: No shared files mode enabled, IPC is disabled
00:04:36.295 EAL: No shared files mode enabled, IPC is disabled
00:04:36.295 EAL: No shared files mode enabled, IPC is disabled
00:04:36.295
00:04:36.295 real 0m1.078s
00:04:36.295 user 0m0.629s
00:04:36.295 sys 0m0.425s
00:04:36.295 11:28:05 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:36.295 11:28:05 -- common/autotest_common.sh@10 -- # set +x
00:04:36.295 ************************************
00:04:36.295 END TEST env_vtophys
00:04:36.295 ************************************
00:04:36.295 11:28:05 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut
00:04:36.295 11:28:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:36.295 11:28:05 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:36.295 11:28:05 -- common/autotest_common.sh@10 -- # set +x
00:04:36.295 ************************************
00:04:36.295 START TEST env_pci
00:04:36.295 ************************************
00:04:36.295 11:28:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut
00:04:36.295
00:04:36.295
00:04:36.295 CUnit - A unit testing framework for C - Version 2.1-3
00:04:36.295 http://cunit.sourceforge.net/
00:04:36.295
00:04:36.295
00:04:36.295 Suite: pci
00:04:36.295 Test: pci_hook ...[2024-07-21 11:28:05.644781] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2035806 has claimed it
00:04:36.295 EAL: Cannot find device (10000:00:01.0)
00:04:36.295 EAL: Failed to attach device on primary process
00:04:36.295 passed
00:04:36.295
00:04:36.295 Run Summary: Type Total Ran Passed Failed Inactive
00:04:36.295 suites 1 1 n/a 0 0
00:04:36.295 tests 1 1 1 0 0
00:04:36.295 asserts 25 25 25 0 n/a
00:04:36.295
00:04:36.295 Elapsed time = 0.036 seconds
00:04:36.295
00:04:36.295 real 0m0.056s
00:04:36.295 user 0m0.015s
00:04:36.295 sys 0m0.041s
00:04:36.295 11:28:05 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:36.295 11:28:05 -- common/autotest_common.sh@10 -- # set +x
00:04:36.295 ************************************
00:04:36.295 END TEST env_pci
00:04:36.295 ************************************
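The expand/shrink pairs above are vtophys_spdk_malloc_test growing and releasing the DPDK heap through SPDK's registered mem event callback, while vtophys itself verifies virtual-to-physical translation. A sketch of the kind of check env_vtophys exercises, assuming the two-argument spdk_vtophys() of recent SPDK trees (older trees took only the buffer pointer):

```c
#include <assert.h>
#include "spdk/env.h"

/* Allocate DMA-safe memory and confirm it translates. Requires a
 * previously initialized SPDK env (see the earlier sketch). */
static void vtophys_check(void)
{
	uint64_t phys = 0, len = 4096;
	void *buf = spdk_dma_malloc(4096, 0x1000, &phys);

	assert(buf != NULL);
	/* Translation must succeed for registered, pinned memory. */
	assert(spdk_vtophys(buf, &len) != SPDK_VTOPHYS_ERROR);
	spdk_dma_free(buf);
}
```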
00:04:36.554 11:28:05 -- env/env.sh@14 -- # argv='-c 0x1 '
00:04:36.554 11:28:05 -- env/env.sh@15 -- # uname
00:04:36.554 11:28:05 -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:04:36.554 11:28:05 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:04:36.554 11:28:05 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:04:36.554 11:28:05 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']'
00:04:36.554 11:28:05 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:36.554 11:28:05 -- common/autotest_common.sh@10 -- # set +x
00:04:36.554 ************************************
00:04:36.554 START TEST env_dpdk_post_init
00:04:36.554 ************************************
00:04:36.554 11:28:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:04:36.554 EAL: Detected CPU lcores: 112
00:04:36.554 EAL: Detected NUMA nodes: 2
00:04:36.554 EAL: Detected static linkage of DPDK
00:04:36.554 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:04:36.554 EAL: Selected IOVA mode 'VA'
00:04:36.554 EAL: No free 2048 kB hugepages reported on node 1
00:04:36.554 EAL: VFIO support initialized
00:04:36.554 TELEMETRY: No legacy callbacks, legacy socket not created
00:04:36.554 EAL: Using IOMMU type 1 (Type 1)
00:04:37.491 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1)
00:04:40.769 EAL: Releasing PCI mapped resource for 0000:d8:00.0
00:04:40.769 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000
00:04:41.335 Starting DPDK initialization...
00:04:41.335 Starting SPDK post initialization...
00:04:41.335 SPDK NVMe probe
00:04:41.335 Attaching to 0000:d8:00.0
00:04:41.335 Attached to 0000:d8:00.0
00:04:41.335 Cleaning up...
00:04:41.335
00:04:41.335 real 0m4.734s
00:04:41.335 user 0m3.515s
00:04:41.335 sys 0m0.466s
00:04:41.335 11:28:10 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:41.335 11:28:10 -- common/autotest_common.sh@10 -- # set +x
00:04:41.335 ************************************
00:04:41.335 END TEST env_dpdk_post_init
00:04:41.335 ************************************
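The "Attaching to 0000:d8:00.0 / Attached to 0000:d8:00.0" lines come from SPDK's standard NVMe enumeration path. A skeleton of that flow using the long-standing spdk_nvme_probe() entry point (callback bodies here are illustrative only, not the test's):

```c
#include <stdbool.h>
#include <stdio.h>
#include "spdk/nvme.h"

/* Called once per controller found during the PCIe bus scan;
 * returning true asks SPDK to attach to it. */
static bool probe_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
		     struct spdk_nvme_ctrlr_opts *opts)
{
	printf("Attaching to %s\n", trid->traddr);
	return true;
}

/* Called after the controller has been initialized and attached. */
static void attach_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
		      struct spdk_nvme_ctrlr *ctrlr,
		      const struct spdk_nvme_ctrlr_opts *opts)
{
	printf("Attached to %s\n", trid->traddr);
}

/* A NULL transport ID scans the local PCIe bus, as in the log.
 * Requires an initialized SPDK env. */
int scan_bus(void)
{
	return spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL);
}
```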
00:04:41.335 11:28:10 -- env/env.sh@26 -- # uname
00:04:41.335 11:28:10 -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:04:41.335 11:28:10 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:04:41.336 11:28:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:41.336 11:28:10 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:41.336 11:28:10 -- common/autotest_common.sh@10 -- # set +x
00:04:41.336 ************************************
00:04:41.336 START TEST env_mem_callbacks
00:04:41.336 ************************************
00:04:41.336 11:28:10 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:04:41.336 EAL: Detected CPU lcores: 112
00:04:41.336 EAL: Detected NUMA nodes: 2
00:04:41.336 EAL: Detected static linkage of DPDK
00:04:41.336 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:04:41.336 EAL: Selected IOVA mode 'VA'
00:04:41.336 EAL: No free 2048 kB hugepages reported on node 1
00:04:41.336 EAL: VFIO support initialized
00:04:41.336 TELEMETRY: No legacy callbacks, legacy socket not created
00:04:41.336
00:04:41.336
00:04:41.336 CUnit - A unit testing framework for C - Version 2.1-3
00:04:41.336 http://cunit.sourceforge.net/
00:04:41.336
00:04:41.336
00:04:41.336 Suite: memory
00:04:41.336 Test: test ...
00:04:41.336 register 0x200000200000 2097152
00:04:41.336 malloc 3145728
00:04:41.336 register 0x200000400000 4194304
00:04:41.336 buf 0x200000500000 len 3145728 PASSED
00:04:41.336 malloc 64
00:04:41.336 buf 0x2000004fff40 len 64 PASSED
00:04:41.336 malloc 4194304
00:04:41.336 register 0x200000800000 6291456
00:04:41.336 buf 0x200000a00000 len 4194304 PASSED
00:04:41.336 free 0x200000500000 3145728
00:04:41.336 free 0x2000004fff40 64
00:04:41.336 unregister 0x200000400000 4194304 PASSED
00:04:41.336 free 0x200000a00000 4194304
00:04:41.336 unregister 0x200000800000 6291456 PASSED
00:04:41.336 malloc 8388608
00:04:41.336 register 0x200000400000 10485760
00:04:41.336 buf 0x200000600000 len 8388608 PASSED
00:04:41.336 free 0x200000600000 8388608
00:04:41.336 unregister 0x200000400000 10485760 PASSED
00:04:41.336 passed
00:04:41.336
00:04:41.336 Run Summary: Type Total Ran Passed Failed Inactive
00:04:41.336 suites 1 1 n/a 0 0
00:04:41.336 tests 1 1 1 0 0
00:04:41.336 asserts 15 15 15 0 n/a
00:04:41.336
00:04:41.336 Elapsed time = 0.005 seconds
00:04:41.336
00:04:41.336 real 0m0.065s
00:04:41.336 user 0m0.022s
00:04:41.336 sys 0m0.043s
00:04:41.336 11:28:10 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:41.336 11:28:10 -- common/autotest_common.sh@10 -- # set +x
00:04:41.336 ************************************
00:04:41.336 END TEST env_mem_callbacks
00:04:41.336 ************************************
00:04:41.336
00:04:41.336 real 0m6.403s
00:04:41.336 user 0m4.408s
00:04:41.336 sys 0m1.272s
00:04:41.336 11:28:10 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:41.336 11:28:10 -- common/autotest_common.sh@10 -- # set +x
00:04:41.336 ************************************
00:04:41.336 END TEST env
00:04:41.336 ************************************
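The register/unregister lines above are the CUnit memory suite pushing regions in and out of SPDK's memory maps, which is what fires the registered mem event callbacks. A rough sketch of the same calls; note that real spdk_mem_register() needs suitably aligned, pinned memory, so treat this as illustrative rather than a working standalone test:

```c
#include <stdlib.h>
#include "spdk/env.h"

/* Register an application buffer with SPDK's memory maps, then
 * unregister it. 2MB alignment/size follows the hugepage granularity
 * visible in the log ("register 0x... 2097152"). */
static void register_example(void)
{
	size_t len = 2 * 1024 * 1024;
	void *buf = NULL;

	if (posix_memalign(&buf, len, len) != 0) {
		return;
	}
	if (spdk_mem_register(buf, len) == 0) {
		/* Mem event callbacks have now been notified of the region. */
		spdk_mem_unregister(buf, len);
	}
	free(buf);
}
```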
00:04:41.336 11:28:10 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:04:41.336 11:28:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:41.336 11:28:10 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:41.336 11:28:10 -- common/autotest_common.sh@10 -- # set +x
00:04:41.336 ************************************
00:04:41.336 START TEST rpc
00:04:41.336 ************************************
00:04:41.336 11:28:10 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:04:41.595 * Looking for test storage...
00:04:41.595 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:04:41.595 11:28:10 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:04:41.595 11:28:10 -- rpc/rpc.sh@65 -- # spdk_pid=2036899
00:04:41.595 11:28:10 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:04:41.595 11:28:10 -- rpc/rpc.sh@67 -- # waitforlisten 2036899
00:04:41.595 11:28:10 -- common/autotest_common.sh@819 -- # '[' -z 2036899 ']'
00:04:41.595 11:28:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock
00:04:41.595 11:28:10 -- common/autotest_common.sh@824 -- # local max_retries=100
00:04:41.595 11:28:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:04:41.595 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:04:41.595 11:28:10 -- common/autotest_common.sh@828 -- # xtrace_disable
00:04:41.595 11:28:10 -- common/autotest_common.sh@10 -- # set +x
00:04:41.595 [2024-07-21 11:28:10.792547] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:04:41.595 [2024-07-21 11:28:10.792620] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2036899 ]
00:04:41.595 EAL: No free 2048 kB hugepages reported on node 1
00:04:41.595 [2024-07-21 11:28:10.858994] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:04:41.595 [2024-07-21 11:28:10.895891] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:04:41.595 [2024-07-21 11:28:10.896008] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified.
00:04:41.595 [2024-07-21 11:28:10.896021] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2036899' to capture a snapshot of events at runtime.
00:04:41.595 [2024-07-21 11:28:10.896030] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2036899 for offline analysis/debug.
00:04:41.595 [2024-07-21 11:28:10.896050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:04:42.531 11:28:11 -- common/autotest_common.sh@848 -- # (( i == 0 ))
00:04:42.531 11:28:11 -- common/autotest_common.sh@852 -- # return 0
00:04:42.531 11:28:11 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:04:42.531 11:28:11 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:04:42.531 11:28:11 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd
00:04:42.531 11:28:11 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity
00:04:42.531 11:28:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:42.531 11:28:11 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:42.531 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.531 ************************************
00:04:42.531 START TEST rpc_integrity
00:04:42.531 ************************************
00:04:42.531 11:28:11 -- common/autotest_common.sh@1104 -- # rpc_integrity
00:04:42.531 11:28:11 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:04:42.531 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.531 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.531 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.531 11:28:11 -- rpc/rpc.sh@12 -- # bdevs='[]'
00:04:42.531 11:28:11 -- rpc/rpc.sh@13 -- # jq length
00:04:42.531 11:28:11 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']'
00:04:42.531 11:28:11 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512
00:04:42.531 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.531 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.531 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.531 11:28:11 -- rpc/rpc.sh@15 -- # malloc=Malloc0
00:04:42.531 11:28:11 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs
00:04:42.531 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.531 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.531 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.531 11:28:11 -- rpc/rpc.sh@16 -- # bdevs='[
00:04:42.531 {
00:04:42.531 "name": "Malloc0",
00:04:42.531 "aliases": [
00:04:42.531 "1f8d6637-9b44-4d35-92a8-b262b20cbaa5"
00:04:42.531 ],
00:04:42.531 "product_name": "Malloc disk",
00:04:42.531 "block_size": 512,
00:04:42.531 "num_blocks": 16384,
00:04:42.531 "uuid": "1f8d6637-9b44-4d35-92a8-b262b20cbaa5",
00:04:42.531 "assigned_rate_limits": {
00:04:42.531 "rw_ios_per_sec": 0,
00:04:42.531 "rw_mbytes_per_sec": 0,
00:04:42.531 "r_mbytes_per_sec": 0,
00:04:42.531 "w_mbytes_per_sec": 0
00:04:42.531 },
00:04:42.531 "claimed": false,
00:04:42.531 "zoned": false,
00:04:42.531 "supported_io_types": {
00:04:42.531 "read": true,
00:04:42.531 "write": true,
00:04:42.531 "unmap": true,
00:04:42.531 "write_zeroes": true,
00:04:42.531 "flush": true,
00:04:42.531 "reset": true,
00:04:42.531 "compare": false,
00:04:42.531 "compare_and_write": false,
00:04:42.531 "abort": true,
00:04:42.531 "nvme_admin": false,
00:04:42.531 "nvme_io": false
00:04:42.531 },
00:04:42.531 "memory_domains": [
00:04:42.531 {
00:04:42.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:04:42.531 "dma_device_type": 2
00:04:42.531 }
00:04:42.531 ],
00:04:42.531 "driver_specific": {}
00:04:42.531 }
00:04:42.531 ]'
00:04:42.531 11:28:11 -- rpc/rpc.sh@17 -- # jq length
00:04:42.531 11:28:11 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']'
00:04:42.531 11:28:11 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0
00:04:42.531 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.531 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.531 [2024-07-21 11:28:11.723346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0
00:04:42.531 [2024-07-21 11:28:11.723382] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:04:42.531 [2024-07-21 11:28:11.723399] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4dcb490
00:04:42.531 [2024-07-21 11:28:11.723408] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:04:42.531 [2024-07-21 11:28:11.724249] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:04:42.531 [2024-07-21 11:28:11.724273] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0
00:04:42.531 Passthru0
00:04:42.531 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.531 11:28:11 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs
00:04:42.531 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.531 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.531 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.531 11:28:11 -- rpc/rpc.sh@20 -- # bdevs='[
00:04:42.531 {
00:04:42.531 "name": "Malloc0",
00:04:42.531 "aliases": [
00:04:42.531 "1f8d6637-9b44-4d35-92a8-b262b20cbaa5"
00:04:42.531 ],
00:04:42.531 "product_name": "Malloc disk",
00:04:42.531 "block_size": 512,
00:04:42.531 "num_blocks": 16384,
00:04:42.531 "uuid": "1f8d6637-9b44-4d35-92a8-b262b20cbaa5",
00:04:42.531 "assigned_rate_limits": {
00:04:42.531 "rw_ios_per_sec": 0,
00:04:42.531 "rw_mbytes_per_sec": 0,
00:04:42.531 "r_mbytes_per_sec": 0,
00:04:42.531 "w_mbytes_per_sec": 0
00:04:42.531 },
00:04:42.531 "claimed": true,
00:04:42.531 "claim_type": "exclusive_write",
00:04:42.531 "zoned": false,
00:04:42.531 "supported_io_types": {
00:04:42.531 "read": true,
00:04:42.531 "write": true,
00:04:42.531 "unmap": true,
00:04:42.531 "write_zeroes": true,
00:04:42.531 "flush": true,
00:04:42.531 "reset": true,
00:04:42.531 "compare": false,
00:04:42.531 "compare_and_write": false,
00:04:42.531 "abort": true,
00:04:42.531 "nvme_admin": false,
00:04:42.531 "nvme_io": false
00:04:42.531 },
00:04:42.531 "memory_domains": [
00:04:42.531 {
00:04:42.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:04:42.531 "dma_device_type": 2
00:04:42.531 }
00:04:42.531 ],
00:04:42.531 "driver_specific": {}
00:04:42.531 },
00:04:42.531 {
00:04:42.531 "name": "Passthru0",
00:04:42.531 "aliases": [
00:04:42.532 "4e482b8c-6b82-56b8-b914-3ab37c4204f1"
00:04:42.532 ],
00:04:42.532 "product_name": "passthru",
00:04:42.532 "block_size": 512,
00:04:42.532 "num_blocks": 16384,
00:04:42.532 "uuid": "4e482b8c-6b82-56b8-b914-3ab37c4204f1",
00:04:42.532 "assigned_rate_limits": {
00:04:42.532 "rw_ios_per_sec": 0,
00:04:42.532 "rw_mbytes_per_sec": 0,
00:04:42.532 "r_mbytes_per_sec": 0,
00:04:42.532 "w_mbytes_per_sec": 0
00:04:42.532 },
00:04:42.532 "claimed": false,
00:04:42.532 "zoned": false,
00:04:42.532 "supported_io_types": {
00:04:42.532 "read": true,
00:04:42.532 "write": true,
00:04:42.532 "unmap": true,
00:04:42.532 "write_zeroes": true,
00:04:42.532 "flush": true,
00:04:42.532 "reset": true,
00:04:42.532 "compare": false,
00:04:42.532 "compare_and_write": false,
00:04:42.532 "abort": true,
00:04:42.532 "nvme_admin": false,
00:04:42.532 "nvme_io": false
00:04:42.532 },
00:04:42.532 "memory_domains": [
00:04:42.532 {
00:04:42.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:04:42.532 "dma_device_type": 2
00:04:42.532 }
00:04:42.532 ],
00:04:42.532 "driver_specific": {
00:04:42.532 "passthru": {
00:04:42.532 "name": "Passthru0",
00:04:42.532 "base_bdev_name": "Malloc0"
00:04:42.532 }
00:04:42.532 }
00:04:42.532 }
00:04:42.532 ]'
00:04:42.532 11:28:11 -- rpc/rpc.sh@21 -- # jq length
00:04:42.532 11:28:11 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:04:42.532 11:28:11 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:04:42.532 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.532 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.532 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.532 11:28:11 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0
00:04:42.532 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.532 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.532 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.532 11:28:11 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:04:42.532 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.532 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.532 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.532 11:28:11 -- rpc/rpc.sh@25 -- # bdevs='[]'
00:04:42.532 11:28:11 -- rpc/rpc.sh@26 -- # jq length
00:04:42.532 11:28:11 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:04:42.532
00:04:42.532 real 0m0.254s
00:04:42.532 user 0m0.142s
00:04:42.532 sys 0m0.050s
00:04:42.532 11:28:11 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:42.532 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.532 ************************************
00:04:42.532 END TEST rpc_integrity
00:04:42.532 ************************************
00:04:42.532 11:28:11 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins
00:04:42.532 11:28:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:42.532 11:28:11 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:42.532 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.532 ************************************
00:04:42.532 START TEST rpc_plugins
00:04:42.532 ************************************
00:04:42.532 11:28:11 -- common/autotest_common.sh@1104 -- # rpc_plugins
00:04:42.532 11:28:11 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc
00:04:42.532 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.532 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.532 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.532 11:28:11 -- rpc/rpc.sh@30 -- # malloc=Malloc1
00:04:42.532 11:28:11 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs
00:04:42.532 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.532 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.532 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.532 11:28:11 -- rpc/rpc.sh@31 -- # bdevs='[
00:04:42.532 {
00:04:42.532 "name": "Malloc1",
00:04:42.532 "aliases": [
00:04:42.532 "ff9dfb8c-9861-4564-ba35-c2c27eb418f9"
00:04:42.532 ],
00:04:42.532 "product_name": "Malloc disk",
00:04:42.532 "block_size": 4096,
00:04:42.532 "num_blocks": 256,
00:04:42.532 "uuid": "ff9dfb8c-9861-4564-ba35-c2c27eb418f9",
00:04:42.532 "assigned_rate_limits": {
00:04:42.532 "rw_ios_per_sec": 0,
00:04:42.532 "rw_mbytes_per_sec": 0,
00:04:42.532 "r_mbytes_per_sec": 0,
00:04:42.532 "w_mbytes_per_sec": 0
00:04:42.532 },
00:04:42.532 "claimed": false,
00:04:42.532 "zoned": false,
00:04:42.532 "supported_io_types": {
00:04:42.532 "read": true,
00:04:42.532 "write": true,
00:04:42.532 "unmap": true,
00:04:42.532 "write_zeroes": true,
00:04:42.532 "flush": true,
00:04:42.532 "reset": true,
00:04:42.532 "compare": false,
00:04:42.532 "compare_and_write": false,
00:04:42.532 "abort": true,
00:04:42.532 "nvme_admin": false,
00:04:42.532 "nvme_io": false
00:04:42.532 },
00:04:42.532 "memory_domains": [
00:04:42.532 {
00:04:42.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:04:42.532 "dma_device_type": 2
00:04:42.532 }
00:04:42.532 ],
00:04:42.532 "driver_specific": {}
00:04:42.532 }
00:04:42.532 ]'
00:04:42.532 11:28:11 -- rpc/rpc.sh@32 -- # jq length
00:04:42.791 11:28:11 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']'
00:04:42.791 11:28:11 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1
00:04:42.791 11:28:11 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.791 11:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:42.791 11:28:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.791 11:28:12 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs
00:04:42.791 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.791 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:42.791 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.791 11:28:12 -- rpc/rpc.sh@35 -- # bdevs='[]'
00:04:42.791 11:28:12 -- rpc/rpc.sh@36 -- # jq length
00:04:42.791 11:28:12 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']'
00:04:42.791
00:04:42.791 real 0m0.131s
00:04:42.791 user 0m0.078s
00:04:42.791 sys 0m0.019s
00:04:42.791 11:28:12 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:42.791 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:42.791 ************************************
00:04:42.791 END TEST rpc_plugins
00:04:42.791 ************************************
00:04:42.791 11:28:12 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test
00:04:42.791 11:28:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:42.791 11:28:12 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:42.791 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:42.791 ************************************
00:04:42.791 START TEST rpc_trace_cmd_test
00:04:42.791 ************************************
00:04:42.791 11:28:12 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test
00:04:42.791 11:28:12 -- rpc/rpc.sh@40 -- # local info
00:04:42.791 11:28:12 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info
00:04:42.791 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:42.791 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:42.791 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:42.791 11:28:12 -- rpc/rpc.sh@42 -- # info='{
00:04:42.791 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2036899",
00:04:42.791 "tpoint_group_mask": "0x8",
00:04:42.791 "iscsi_conn": {
00:04:42.791 "mask": "0x2",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "scsi": {
00:04:42.791 "mask": "0x4",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "bdev": {
00:04:42.791 "mask": "0x8",
00:04:42.791 "tpoint_mask": "0xffffffffffffffff"
00:04:42.791 },
00:04:42.791 "nvmf_rdma": {
00:04:42.791 "mask": "0x10",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "nvmf_tcp": {
00:04:42.791 "mask": "0x20",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "ftl": {
00:04:42.791 "mask": "0x40",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "blobfs": {
00:04:42.791 "mask": "0x80",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "dsa": {
00:04:42.791 "mask": "0x200",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "thread": {
00:04:42.791 "mask": "0x400",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "nvme_pcie": {
00:04:42.791 "mask": "0x800",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "iaa": {
00:04:42.791 "mask": "0x1000",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "nvme_tcp": {
00:04:42.791 "mask": "0x2000",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 },
00:04:42.791 "bdev_nvme": {
00:04:42.791 "mask": "0x4000",
00:04:42.791 "tpoint_mask": "0x0"
00:04:42.791 }
00:04:42.791 }'
00:04:42.791 11:28:12 -- rpc/rpc.sh@43 -- # jq length
00:04:42.791 11:28:12 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']'
00:04:42.791 11:28:12 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")'
00:04:42.791 11:28:12 -- rpc/rpc.sh@44 -- # '[' true = true ']'
00:04:42.791 11:28:12 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")'
00:04:42.791 11:28:12 -- rpc/rpc.sh@45 -- # '[' true = true ']'
00:04:42.791 11:28:12 -- rpc/rpc.sh@46 -- # jq 'has("bdev")'
00:04:42.791 11:28:12 -- rpc/rpc.sh@46 -- # '[' true = true ']'
00:04:43.049 11:28:12 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask
00:04:43.049 11:28:12 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']'
00:04:43.049
00:04:43.049 real 0m0.182s
00:04:43.049 user 0m0.147s
00:04:43.049 sys 0m0.029s
00:04:43.049 11:28:12 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:43.049 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.049 ************************************
00:04:43.049 END TEST rpc_trace_cmd_test
00:04:43.049 ************************************
00:04:43.049 11:28:12 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]]
00:04:43.049 11:28:12 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd
00:04:43.049 11:28:12 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity
00:04:43.049 11:28:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:43.049 11:28:12 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:43.049 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.050 ************************************
00:04:43.050 START TEST rpc_daemon_integrity
00:04:43.050 ************************************
00:04:43.050 11:28:12 -- common/autotest_common.sh@1104 -- # rpc_integrity
00:04:43.050 11:28:12 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:04:43.050 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:43.050 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.050 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:43.050 11:28:12 -- rpc/rpc.sh@12 -- # bdevs='[]'
00:04:43.050 11:28:12 -- rpc/rpc.sh@13 -- # jq length
00:04:43.050 11:28:12 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']'
00:04:43.050 11:28:12 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512
00:04:43.050 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:43.050 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.050 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:43.050 11:28:12 -- rpc/rpc.sh@15 -- # malloc=Malloc2
00:04:43.050 11:28:12 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs
00:04:43.050 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:43.050 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.050 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:43.050 11:28:12 -- rpc/rpc.sh@16 -- # bdevs='[
00:04:43.050 {
00:04:43.050 "name": "Malloc2",
00:04:43.050 "aliases": [
00:04:43.050 "bd92ca7a-cafc-4caa-ac8c-6b8f30d5b3d6"
00:04:43.050 ],
00:04:43.050 "product_name": "Malloc disk",
00:04:43.050 "block_size": 512,
00:04:43.050 "num_blocks": 16384,
00:04:43.050 "uuid": "bd92ca7a-cafc-4caa-ac8c-6b8f30d5b3d6",
00:04:43.050 "assigned_rate_limits": {
00:04:43.050 "rw_ios_per_sec": 0,
00:04:43.050 "rw_mbytes_per_sec": 0,
00:04:43.050 "r_mbytes_per_sec": 0,
00:04:43.050 "w_mbytes_per_sec": 0
00:04:43.050 },
00:04:43.050 "claimed": false,
00:04:43.050 "zoned": false,
00:04:43.050 "supported_io_types": {
00:04:43.050 "read": true,
00:04:43.050 "write": true,
00:04:43.050 "unmap": true,
00:04:43.050 "write_zeroes": true,
00:04:43.050 "flush": true,
00:04:43.050 "reset": true,
00:04:43.050 "compare": false,
00:04:43.050 "compare_and_write": false,
00:04:43.050 "abort": true,
00:04:43.050 "nvme_admin": false,
00:04:43.050 "nvme_io": false
00:04:43.050 },
00:04:43.050 "memory_domains": [
00:04:43.050 {
00:04:43.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:04:43.050 "dma_device_type": 2
00:04:43.050 }
00:04:43.050 ],
00:04:43.050 "driver_specific": {}
00:04:43.050 }
00:04:43.050 ]'
00:04:43.050 11:28:12 -- rpc/rpc.sh@17 -- # jq length
00:04:43.050 11:28:12 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']'
00:04:43.050 11:28:12 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0
00:04:43.050 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:43.050 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.050 [2024-07-21 11:28:12.437206] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2
00:04:43.050 [2024-07-21 11:28:12.437237] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:04:43.050 [2024-07-21 11:28:12.437254] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4dcaed0
00:04:43.050 [2024-07-21 11:28:12.437263] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:04:43.050 [2024-07-21 11:28:12.437950] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:04:43.050 [2024-07-21 11:28:12.437973] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0
00:04:43.050 Passthru0
00:04:43.050 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:43.050 11:28:12 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs
00:04:43.050 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:43.050 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.311 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:43.311 11:28:12 -- rpc/rpc.sh@20 -- # bdevs='[
00:04:43.311 {
00:04:43.311 "name": "Malloc2",
00:04:43.311 "aliases": [
00:04:43.311 "bd92ca7a-cafc-4caa-ac8c-6b8f30d5b3d6"
00:04:43.311 ],
00:04:43.311 "product_name": "Malloc disk",
00:04:43.311 "block_size": 512,
00:04:43.311 "num_blocks": 16384,
00:04:43.311 "uuid": "bd92ca7a-cafc-4caa-ac8c-6b8f30d5b3d6",
00:04:43.311 "assigned_rate_limits": {
00:04:43.311 "rw_ios_per_sec": 0,
00:04:43.311 "rw_mbytes_per_sec": 0,
00:04:43.311 "r_mbytes_per_sec": 0,
00:04:43.311 "w_mbytes_per_sec": 0
00:04:43.311 },
00:04:43.311 "claimed": true,
00:04:43.311 "claim_type": "exclusive_write",
00:04:43.311 "zoned": false,
00:04:43.311 "supported_io_types": {
00:04:43.311 "read": true,
00:04:43.311 "write": true,
00:04:43.311 "unmap": true,
00:04:43.311 "write_zeroes": true,
00:04:43.311 "flush": true,
00:04:43.311 "reset": true,
00:04:43.311 "compare": false,
00:04:43.311 "compare_and_write": false,
00:04:43.311 "abort": true,
00:04:43.311 "nvme_admin": false,
00:04:43.311 "nvme_io": false
00:04:43.311 },
00:04:43.311 "memory_domains": [
00:04:43.311 {
00:04:43.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:04:43.311 "dma_device_type": 2
00:04:43.311 }
00:04:43.311 ],
00:04:43.311 "driver_specific": {}
00:04:43.311 },
00:04:43.311 {
00:04:43.311 "name": "Passthru0",
00:04:43.311 "aliases": [
00:04:43.311 "b82cef23-5d5d-5c53-94c1-aeb5c5154153"
00:04:43.311 ],
00:04:43.311 "product_name": "passthru",
00:04:43.311 "block_size": 512,
00:04:43.311 "num_blocks": 16384,
00:04:43.311 "uuid": "b82cef23-5d5d-5c53-94c1-aeb5c5154153",
00:04:43.311 "assigned_rate_limits": {
00:04:43.311 "rw_ios_per_sec": 0,
00:04:43.311 "rw_mbytes_per_sec": 0,
00:04:43.311 "r_mbytes_per_sec": 0,
00:04:43.311 "w_mbytes_per_sec": 0
00:04:43.311 },
00:04:43.311 "claimed": false,
00:04:43.311 "zoned": false,
00:04:43.311 "supported_io_types": {
00:04:43.311 "read": true,
00:04:43.311 "write": true,
00:04:43.311 "unmap": true,
00:04:43.311 "write_zeroes": true,
00:04:43.311 "flush": true,
00:04:43.311 "reset": true,
00:04:43.311 "compare": false,
00:04:43.311 "compare_and_write": false,
00:04:43.311 "abort": true,
00:04:43.311 "nvme_admin": false,
00:04:43.311 "nvme_io": false
00:04:43.311 },
00:04:43.311 "memory_domains": [
00:04:43.311 {
00:04:43.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:04:43.311 "dma_device_type": 2
00:04:43.311 }
00:04:43.311 ],
00:04:43.311 "driver_specific": {
00:04:43.311 "passthru": {
00:04:43.311 "name": "Passthru0",
00:04:43.311 "base_bdev_name": "Malloc2"
00:04:43.311 }
00:04:43.311 }
00:04:43.311 }
00:04:43.311 ]'
00:04:43.311 11:28:12 -- rpc/rpc.sh@21 -- # jq length
00:04:43.311 11:28:12 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:04:43.311 11:28:12 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:04:43.311 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:43.311 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.311 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:43.311 11:28:12 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2
00:04:43.311 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:43.311 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.311 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:43.311 11:28:12 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:04:43.311 11:28:12 -- common/autotest_common.sh@551 -- # xtrace_disable
00:04:43.311 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.311 11:28:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:04:43.311 11:28:12 -- rpc/rpc.sh@25 -- # bdevs='[]'
00:04:43.311 11:28:12 -- rpc/rpc.sh@26 -- # jq length
00:04:43.311 11:28:12 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:04:43.311
00:04:43.311 real 0m0.282s
00:04:43.311 user 0m0.182s
00:04:43.311 sys 0m0.038s
00:04:43.311 11:28:12 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:43.311 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.311 ************************************
00:04:43.311 END TEST rpc_daemon_integrity
00:04:43.311 ************************************
00:04:43.311 11:28:12 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT
00:04:43.311 11:28:12 -- rpc/rpc.sh@84 -- # killprocess 2036899
00:04:43.311 11:28:12 -- common/autotest_common.sh@926 -- # '[' -z 2036899 ']'
00:04:43.311 11:28:12 -- common/autotest_common.sh@930 -- # kill -0 2036899
00:04:43.311 11:28:12 -- common/autotest_common.sh@931 -- # uname
00:04:43.311 11:28:12 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']'
00:04:43.311 11:28:12 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2036899
00:04:43.311 11:28:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0
00:04:43.311 11:28:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']'
00:04:43.311 11:28:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2036899'
00:04:43.311 killing process with pid 2036899
00:04:43.311 11:28:12 -- common/autotest_common.sh@945 -- # kill 2036899
00:04:43.311 11:28:12 -- common/autotest_common.sh@950 -- # wait 2036899
00:04:43.570
00:04:43.570 real 0m2.288s
00:04:43.570 user 0m2.846s
00:04:43.570 sys 0m0.691s
00:04:43.570 11:28:12 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:43.570 11:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:43.570 ************************************
00:04:43.570 END TEST rpc
00:04:43.570 ************************************
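Each "Run Summary" table in this log is printed by a small CUnit harness. A skeleton of such a harness is below; the suite and test names are placeholders, not SPDK's actual registrations:

```c
#include <CUnit/Basic.h>

/* Placeholder test body. */
static void sample_test(void)
{
	CU_ASSERT(1 + 1 == 2);
}

int main(void)
{
	unsigned int failures;
	CU_pSuite suite;

	if (CU_initialize_registry() != CUE_SUCCESS) {
		return CU_get_error();
	}
	suite = CU_add_suite("components_suite", NULL, NULL);
	CU_add_test(suite, "sample_test", sample_test);

	CU_basic_set_mode(CU_BRM_VERBOSE);
	CU_basic_run_tests(); /* emits the Run Summary table seen above */
	failures = CU_get_number_of_failures();
	CU_cleanup_registry();
	return failures ? 1 : 0;
}
```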
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:04:43.829 * Looking for test storage... 00:04:43.829 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:04:43.829 11:28:13 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:04:43.829 OK 00:04:43.829 11:28:13 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:04:43.829 00:04:43.829 real 0m0.112s 00:04:43.829 user 0m0.053s 00:04:43.829 sys 0m0.068s 00:04:43.829 11:28:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:43.829 11:28:13 -- common/autotest_common.sh@10 -- # set +x 00:04:43.829 ************************************ 00:04:43.829 END TEST rpc_client 00:04:43.829 ************************************ 00:04:43.829 11:28:13 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:04:43.829 11:28:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:43.829 11:28:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:43.829 11:28:13 -- common/autotest_common.sh@10 -- # set +x 00:04:43.829 ************************************ 00:04:43.829 START TEST json_config 00:04:43.829 ************************************ 00:04:43.829 11:28:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:04:43.829 11:28:13 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:43.829 11:28:13 -- nvmf/common.sh@7 -- # uname -s 00:04:44.088 11:28:13 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:44.088 11:28:13 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:44.088 11:28:13 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:44.088 11:28:13 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:44.088 11:28:13 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:44.088 11:28:13 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:44.088 11:28:13 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:44.088 11:28:13 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:44.088 11:28:13 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:44.088 11:28:13 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:44.088 11:28:13 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:44.088 11:28:13 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:44.088 11:28:13 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:44.088 11:28:13 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:44.088 11:28:13 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:44.088 11:28:13 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:44.088 11:28:13 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:44.088 11:28:13 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:44.088 11:28:13 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:44.088 11:28:13 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:44.088 11:28:13 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:44.088 11:28:13 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:44.088 11:28:13 -- paths/export.sh@5 -- # export PATH 00:04:44.088 11:28:13 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:44.088 11:28:13 -- nvmf/common.sh@46 -- # : 0 00:04:44.088 11:28:13 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:04:44.088 11:28:13 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:04:44.088 11:28:13 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:04:44.088 11:28:13 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:44.088 11:28:13 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:44.088 11:28:13 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:04:44.088 11:28:13 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:04:44.088 11:28:13 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:04:44.088 11:28:13 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:04:44.088 11:28:13 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:04:44.088 11:28:13 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:04:44.088 11:28:13 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:04:44.088 11:28:13 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:04:44.088 WARNING: No tests are enabled so not running JSON configuration tests 00:04:44.088 11:28:13 -- json_config/json_config.sh@27 -- # exit 0 00:04:44.088 00:04:44.088 real 0m0.104s 00:04:44.088 user 0m0.043s 00:04:44.088 sys 0m0.062s 00:04:44.088 11:28:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:44.088 11:28:13 -- common/autotest_common.sh@10 -- # set +x 00:04:44.088 ************************************ 00:04:44.088 END TEST json_config 00:04:44.088 ************************************ 00:04:44.088 11:28:13 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:04:44.088 11:28:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:44.088 11:28:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:44.088 11:28:13 -- common/autotest_common.sh@10 -- # set +x 00:04:44.088 ************************************ 00:04:44.088 START TEST json_config_extra_key 00:04:44.088 ************************************ 00:04:44.088 11:28:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:04:44.088 11:28:13 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:44.089 11:28:13 -- nvmf/common.sh@7 -- # uname -s 00:04:44.089 11:28:13 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:44.089 11:28:13 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:44.089 11:28:13 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:44.089 11:28:13 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:44.089 11:28:13 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:44.089 11:28:13 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:44.089 11:28:13 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:44.089 11:28:13 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:44.089 11:28:13 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:44.089 11:28:13 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:44.089 11:28:13 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:44.089 11:28:13 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:44.089 11:28:13 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:44.089 11:28:13 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:44.089 11:28:13 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:44.089 11:28:13 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:44.089 11:28:13 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:44.089 11:28:13 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:44.089 11:28:13 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:44.089 11:28:13 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:44.089 11:28:13 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:44.089 11:28:13 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:44.089 11:28:13 -- paths/export.sh@5 -- # export PATH 00:04:44.089 11:28:13 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:44.089 11:28:13 -- nvmf/common.sh@46 -- # : 0 00:04:44.089 11:28:13 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:04:44.089 11:28:13 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:04:44.089 11:28:13 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:04:44.089 11:28:13 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:44.089 11:28:13 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:44.089 11:28:13 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:04:44.089 11:28:13 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:04:44.089 11:28:13 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:04:44.089 INFO: launching applications... 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@25 -- # shift 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=2037532 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:04:44.089 Waiting for target to run... 
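Note: each time paths/export.sh is sourced, the trace above prepends the same /opt/go, /opt/golangci and /opt/protoc directories again, so PATH accumulates duplicate entries. A minimal standalone dedup sketch (an illustration only, not part of the test scripts):

    # Collapse duplicate PATH entries while preserving first-seen order.
    PATH=$(printf '%s' "$PATH" | awk -v RS=: -v ORS=: '!seen[$0]++')
    PATH=${PATH%:}   # drop the trailing colon left by ORS
    export PATH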
00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 2037532 /var/tmp/spdk_tgt.sock 00:04:44.089 11:28:13 -- common/autotest_common.sh@819 -- # '[' -z 2037532 ']' 00:04:44.089 11:28:13 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:04:44.089 11:28:13 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:04:44.089 11:28:13 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:44.089 11:28:13 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:04:44.089 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:04:44.089 11:28:13 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:44.089 11:28:13 -- common/autotest_common.sh@10 -- # set +x 00:04:44.089 [2024-07-21 11:28:13.436489] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:04:44.089 [2024-07-21 11:28:13.436547] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2037532 ] 00:04:44.089 EAL: No free 2048 kB hugepages reported on node 1 00:04:44.348 [2024-07-21 11:28:13.717603] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:44.348 [2024-07-21 11:28:13.737906] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:44.348 [2024-07-21 11:28:13.738007] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:44.915 11:28:14 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:44.915 11:28:14 -- common/autotest_common.sh@852 -- # return 0 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:04:44.915 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:04:44.915 INFO: shutting down applications... 
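Note: the trace above is the core json_config_extra_key pattern: start spdk_tgt in the background with a JSON config and a private RPC socket, then poll until it answers. A condensed sketch of that flow under the repository layout used in this run (the polling loop is an illustrative stand-in for autotest_common.sh's waitforlisten, not its exact source):

    # Launch the target with a dedicated RPC socket and a JSON config.
    build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
        --json test/json_config/extra_key.json &
    tgt_pid=$!
    # Poll until the target answers on its RPC socket, or bail if it dies.
    for ((i = 0; i < 100; i++)); do
        kill -0 "$tgt_pid" 2>/dev/null || exit 1         # process gone
        scripts/rpc.py -t 1 -s /var/tmp/spdk_tgt.sock rpc_get_methods \
            &>/dev/null && break                         # target is listening
        sleep 0.5
    done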
00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 2037532 ]] 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 2037532 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2037532 00:04:44.915 11:28:14 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:04:45.484 11:28:14 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:04:45.484 11:28:14 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:04:45.484 11:28:14 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2037532 00:04:45.484 11:28:14 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:04:45.484 11:28:14 -- json_config/json_config_extra_key.sh@52 -- # break 00:04:45.484 11:28:14 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:04:45.484 11:28:14 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:04:45.484 SPDK target shutdown done 00:04:45.484 11:28:14 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:04:45.484 Success 00:04:45.484 00:04:45.484 real 0m1.448s 00:04:45.484 user 0m1.185s 00:04:45.484 sys 0m0.382s 00:04:45.484 11:28:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:45.484 11:28:14 -- common/autotest_common.sh@10 -- # set +x 00:04:45.484 ************************************ 00:04:45.484 END TEST json_config_extra_key 00:04:45.484 ************************************ 00:04:45.484 11:28:14 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:04:45.484 11:28:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:45.484 11:28:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:45.484 11:28:14 -- common/autotest_common.sh@10 -- # set +x 00:04:45.484 ************************************ 00:04:45.484 START TEST alias_rpc 00:04:45.484 ************************************ 00:04:45.484 11:28:14 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:04:45.743 * Looking for test storage... 00:04:45.743 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:04:45.743 11:28:14 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:04:45.743 11:28:14 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2037832 00:04:45.743 11:28:14 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2037832 00:04:45.743 11:28:14 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:45.743 11:28:14 -- common/autotest_common.sh@819 -- # '[' -z 2037832 ']' 00:04:45.743 11:28:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:45.743 11:28:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:45.743 11:28:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:04:45.743 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:45.743 11:28:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:45.743 11:28:14 -- common/autotest_common.sh@10 -- # set +x 00:04:45.743 [2024-07-21 11:28:14.946054] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:04:45.743 [2024-07-21 11:28:14.946148] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2037832 ] 00:04:45.743 EAL: No free 2048 kB hugepages reported on node 1 00:04:45.743 [2024-07-21 11:28:15.016358] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:45.743 [2024-07-21 11:28:15.052269] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:45.743 [2024-07-21 11:28:15.052394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:46.676 11:28:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:46.676 11:28:15 -- common/autotest_common.sh@852 -- # return 0 00:04:46.676 11:28:15 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:04:46.676 11:28:15 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2037832 00:04:46.676 11:28:15 -- common/autotest_common.sh@926 -- # '[' -z 2037832 ']' 00:04:46.676 11:28:15 -- common/autotest_common.sh@930 -- # kill -0 2037832 00:04:46.676 11:28:15 -- common/autotest_common.sh@931 -- # uname 00:04:46.676 11:28:15 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:04:46.676 11:28:15 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2037832 00:04:46.676 11:28:15 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:04:46.676 11:28:15 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:04:46.676 11:28:15 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2037832' 00:04:46.676 killing process with pid 2037832 00:04:46.676 11:28:15 -- common/autotest_common.sh@945 -- # kill 2037832 00:04:46.676 11:28:15 -- common/autotest_common.sh@950 -- # wait 2037832 00:04:46.934 00:04:46.934 real 0m1.465s 00:04:46.934 user 0m1.533s 00:04:46.934 sys 0m0.451s 00:04:46.934 11:28:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:46.934 11:28:16 -- common/autotest_common.sh@10 -- # set +x 00:04:46.934 ************************************ 00:04:46.934 END TEST alias_rpc 00:04:46.934 ************************************ 00:04:46.934 11:28:16 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:04:46.934 11:28:16 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:04:46.934 11:28:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:46.934 11:28:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:46.934 11:28:16 -- common/autotest_common.sh@10 -- # set +x 00:04:46.934 ************************************ 00:04:46.934 START TEST spdkcli_tcp 00:04:46.934 ************************************ 00:04:46.934 11:28:16 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:04:47.193 * Looking for test storage... 
00:04:47.193 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:04:47.193 11:28:16 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:04:47.193 11:28:16 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:04:47.193 11:28:16 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:04:47.193 11:28:16 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:04:47.193 11:28:16 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:04:47.193 11:28:16 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:04:47.193 11:28:16 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:04:47.193 11:28:16 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:47.193 11:28:16 -- common/autotest_common.sh@10 -- # set +x 00:04:47.193 11:28:16 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2038153 00:04:47.193 11:28:16 -- spdkcli/tcp.sh@27 -- # waitforlisten 2038153 00:04:47.193 11:28:16 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:04:47.193 11:28:16 -- common/autotest_common.sh@819 -- # '[' -z 2038153 ']' 00:04:47.193 11:28:16 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:47.193 11:28:16 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:47.193 11:28:16 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:47.193 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:47.193 11:28:16 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:47.193 11:28:16 -- common/autotest_common.sh@10 -- # set +x 00:04:47.193 [2024-07-21 11:28:16.461217] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:04:47.193 [2024-07-21 11:28:16.461307] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2038153 ] 00:04:47.193 EAL: No free 2048 kB hugepages reported on node 1 00:04:47.193 [2024-07-21 11:28:16.531302] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:04:47.193 [2024-07-21 11:28:16.568469] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:47.193 [2024-07-21 11:28:16.568665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:47.193 [2024-07-21 11:28:16.568668] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:48.128 11:28:17 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:48.128 11:28:17 -- common/autotest_common.sh@852 -- # return 0 00:04:48.128 11:28:17 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:04:48.128 11:28:17 -- spdkcli/tcp.sh@31 -- # socat_pid=2038329 00:04:48.128 11:28:17 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:04:48.128 [ 00:04:48.128 "spdk_get_version", 00:04:48.128 "rpc_get_methods", 00:04:48.128 "trace_get_info", 00:04:48.128 "trace_get_tpoint_group_mask", 00:04:48.128 "trace_disable_tpoint_group", 00:04:48.128 "trace_enable_tpoint_group", 00:04:48.128 "trace_clear_tpoint_mask", 00:04:48.128 "trace_set_tpoint_mask", 00:04:48.128 "vfu_tgt_set_base_path", 00:04:48.128 "framework_get_pci_devices", 00:04:48.128 "framework_get_config", 00:04:48.128 "framework_get_subsystems", 00:04:48.128 "iobuf_get_stats", 00:04:48.128 "iobuf_set_options", 00:04:48.128 "sock_set_default_impl", 00:04:48.128 "sock_impl_set_options", 00:04:48.128 "sock_impl_get_options", 00:04:48.128 "vmd_rescan", 00:04:48.128 "vmd_remove_device", 00:04:48.128 "vmd_enable", 00:04:48.128 "accel_get_stats", 00:04:48.128 "accel_set_options", 00:04:48.128 "accel_set_driver", 00:04:48.128 "accel_crypto_key_destroy", 00:04:48.128 "accel_crypto_keys_get", 00:04:48.128 "accel_crypto_key_create", 00:04:48.128 "accel_assign_opc", 00:04:48.128 "accel_get_module_info", 00:04:48.128 "accel_get_opc_assignments", 00:04:48.128 "notify_get_notifications", 00:04:48.128 "notify_get_types", 00:04:48.128 "bdev_get_histogram", 00:04:48.128 "bdev_enable_histogram", 00:04:48.128 "bdev_set_qos_limit", 00:04:48.128 "bdev_set_qd_sampling_period", 00:04:48.128 "bdev_get_bdevs", 00:04:48.128 "bdev_reset_iostat", 00:04:48.128 "bdev_get_iostat", 00:04:48.128 "bdev_examine", 00:04:48.128 "bdev_wait_for_examine", 00:04:48.128 "bdev_set_options", 00:04:48.128 "scsi_get_devices", 00:04:48.128 "thread_set_cpumask", 00:04:48.128 "framework_get_scheduler", 00:04:48.128 "framework_set_scheduler", 00:04:48.128 "framework_get_reactors", 00:04:48.128 "thread_get_io_channels", 00:04:48.128 "thread_get_pollers", 00:04:48.128 "thread_get_stats", 00:04:48.128 "framework_monitor_context_switch", 00:04:48.128 "spdk_kill_instance", 00:04:48.128 "log_enable_timestamps", 00:04:48.128 "log_get_flags", 00:04:48.128 "log_clear_flag", 00:04:48.128 "log_set_flag", 00:04:48.128 "log_get_level", 00:04:48.128 "log_set_level", 00:04:48.128 "log_get_print_level", 00:04:48.128 "log_set_print_level", 00:04:48.128 "framework_enable_cpumask_locks", 00:04:48.128 "framework_disable_cpumask_locks", 00:04:48.128 "framework_wait_init", 00:04:48.128 
"framework_start_init", 00:04:48.128 "virtio_blk_create_transport", 00:04:48.128 "virtio_blk_get_transports", 00:04:48.128 "vhost_controller_set_coalescing", 00:04:48.128 "vhost_get_controllers", 00:04:48.128 "vhost_delete_controller", 00:04:48.128 "vhost_create_blk_controller", 00:04:48.128 "vhost_scsi_controller_remove_target", 00:04:48.128 "vhost_scsi_controller_add_target", 00:04:48.128 "vhost_start_scsi_controller", 00:04:48.128 "vhost_create_scsi_controller", 00:04:48.128 "ublk_recover_disk", 00:04:48.128 "ublk_get_disks", 00:04:48.128 "ublk_stop_disk", 00:04:48.128 "ublk_start_disk", 00:04:48.128 "ublk_destroy_target", 00:04:48.128 "ublk_create_target", 00:04:48.128 "nbd_get_disks", 00:04:48.128 "nbd_stop_disk", 00:04:48.128 "nbd_start_disk", 00:04:48.128 "env_dpdk_get_mem_stats", 00:04:48.128 "nvmf_subsystem_get_listeners", 00:04:48.128 "nvmf_subsystem_get_qpairs", 00:04:48.128 "nvmf_subsystem_get_controllers", 00:04:48.128 "nvmf_get_stats", 00:04:48.128 "nvmf_get_transports", 00:04:48.128 "nvmf_create_transport", 00:04:48.128 "nvmf_get_targets", 00:04:48.128 "nvmf_delete_target", 00:04:48.128 "nvmf_create_target", 00:04:48.128 "nvmf_subsystem_allow_any_host", 00:04:48.128 "nvmf_subsystem_remove_host", 00:04:48.128 "nvmf_subsystem_add_host", 00:04:48.128 "nvmf_subsystem_remove_ns", 00:04:48.128 "nvmf_subsystem_add_ns", 00:04:48.129 "nvmf_subsystem_listener_set_ana_state", 00:04:48.129 "nvmf_discovery_get_referrals", 00:04:48.129 "nvmf_discovery_remove_referral", 00:04:48.129 "nvmf_discovery_add_referral", 00:04:48.129 "nvmf_subsystem_remove_listener", 00:04:48.129 "nvmf_subsystem_add_listener", 00:04:48.129 "nvmf_delete_subsystem", 00:04:48.129 "nvmf_create_subsystem", 00:04:48.129 "nvmf_get_subsystems", 00:04:48.129 "nvmf_set_crdt", 00:04:48.129 "nvmf_set_config", 00:04:48.129 "nvmf_set_max_subsystems", 00:04:48.129 "iscsi_set_options", 00:04:48.129 "iscsi_get_auth_groups", 00:04:48.129 "iscsi_auth_group_remove_secret", 00:04:48.129 "iscsi_auth_group_add_secret", 00:04:48.129 "iscsi_delete_auth_group", 00:04:48.129 "iscsi_create_auth_group", 00:04:48.129 "iscsi_set_discovery_auth", 00:04:48.129 "iscsi_get_options", 00:04:48.129 "iscsi_target_node_request_logout", 00:04:48.129 "iscsi_target_node_set_redirect", 00:04:48.129 "iscsi_target_node_set_auth", 00:04:48.129 "iscsi_target_node_add_lun", 00:04:48.129 "iscsi_get_connections", 00:04:48.129 "iscsi_portal_group_set_auth", 00:04:48.129 "iscsi_start_portal_group", 00:04:48.129 "iscsi_delete_portal_group", 00:04:48.129 "iscsi_create_portal_group", 00:04:48.129 "iscsi_get_portal_groups", 00:04:48.129 "iscsi_delete_target_node", 00:04:48.129 "iscsi_target_node_remove_pg_ig_maps", 00:04:48.129 "iscsi_target_node_add_pg_ig_maps", 00:04:48.129 "iscsi_create_target_node", 00:04:48.129 "iscsi_get_target_nodes", 00:04:48.129 "iscsi_delete_initiator_group", 00:04:48.129 "iscsi_initiator_group_remove_initiators", 00:04:48.129 "iscsi_initiator_group_add_initiators", 00:04:48.129 "iscsi_create_initiator_group", 00:04:48.129 "iscsi_get_initiator_groups", 00:04:48.129 "vfu_virtio_create_scsi_endpoint", 00:04:48.129 "vfu_virtio_scsi_remove_target", 00:04:48.129 "vfu_virtio_scsi_add_target", 00:04:48.129 "vfu_virtio_create_blk_endpoint", 00:04:48.129 "vfu_virtio_delete_endpoint", 00:04:48.129 "iaa_scan_accel_module", 00:04:48.129 "dsa_scan_accel_module", 00:04:48.129 "ioat_scan_accel_module", 00:04:48.129 "accel_error_inject_error", 00:04:48.129 "bdev_iscsi_delete", 00:04:48.129 "bdev_iscsi_create", 00:04:48.129 "bdev_iscsi_set_options", 
00:04:48.129 "bdev_virtio_attach_controller", 00:04:48.129 "bdev_virtio_scsi_get_devices", 00:04:48.129 "bdev_virtio_detach_controller", 00:04:48.129 "bdev_virtio_blk_set_hotplug", 00:04:48.129 "bdev_ftl_set_property", 00:04:48.129 "bdev_ftl_get_properties", 00:04:48.129 "bdev_ftl_get_stats", 00:04:48.129 "bdev_ftl_unmap", 00:04:48.129 "bdev_ftl_unload", 00:04:48.129 "bdev_ftl_delete", 00:04:48.129 "bdev_ftl_load", 00:04:48.129 "bdev_ftl_create", 00:04:48.129 "bdev_aio_delete", 00:04:48.129 "bdev_aio_rescan", 00:04:48.129 "bdev_aio_create", 00:04:48.129 "blobfs_create", 00:04:48.129 "blobfs_detect", 00:04:48.129 "blobfs_set_cache_size", 00:04:48.129 "bdev_zone_block_delete", 00:04:48.129 "bdev_zone_block_create", 00:04:48.129 "bdev_delay_delete", 00:04:48.129 "bdev_delay_create", 00:04:48.129 "bdev_delay_update_latency", 00:04:48.129 "bdev_split_delete", 00:04:48.129 "bdev_split_create", 00:04:48.129 "bdev_error_inject_error", 00:04:48.129 "bdev_error_delete", 00:04:48.129 "bdev_error_create", 00:04:48.129 "bdev_raid_set_options", 00:04:48.129 "bdev_raid_remove_base_bdev", 00:04:48.129 "bdev_raid_add_base_bdev", 00:04:48.129 "bdev_raid_delete", 00:04:48.129 "bdev_raid_create", 00:04:48.129 "bdev_raid_get_bdevs", 00:04:48.129 "bdev_lvol_grow_lvstore", 00:04:48.129 "bdev_lvol_get_lvols", 00:04:48.129 "bdev_lvol_get_lvstores", 00:04:48.129 "bdev_lvol_delete", 00:04:48.129 "bdev_lvol_set_read_only", 00:04:48.129 "bdev_lvol_resize", 00:04:48.129 "bdev_lvol_decouple_parent", 00:04:48.129 "bdev_lvol_inflate", 00:04:48.129 "bdev_lvol_rename", 00:04:48.129 "bdev_lvol_clone_bdev", 00:04:48.129 "bdev_lvol_clone", 00:04:48.129 "bdev_lvol_snapshot", 00:04:48.129 "bdev_lvol_create", 00:04:48.129 "bdev_lvol_delete_lvstore", 00:04:48.129 "bdev_lvol_rename_lvstore", 00:04:48.129 "bdev_lvol_create_lvstore", 00:04:48.129 "bdev_passthru_delete", 00:04:48.129 "bdev_passthru_create", 00:04:48.129 "bdev_nvme_cuse_unregister", 00:04:48.129 "bdev_nvme_cuse_register", 00:04:48.129 "bdev_opal_new_user", 00:04:48.129 "bdev_opal_set_lock_state", 00:04:48.129 "bdev_opal_delete", 00:04:48.129 "bdev_opal_get_info", 00:04:48.129 "bdev_opal_create", 00:04:48.129 "bdev_nvme_opal_revert", 00:04:48.129 "bdev_nvme_opal_init", 00:04:48.129 "bdev_nvme_send_cmd", 00:04:48.129 "bdev_nvme_get_path_iostat", 00:04:48.129 "bdev_nvme_get_mdns_discovery_info", 00:04:48.129 "bdev_nvme_stop_mdns_discovery", 00:04:48.129 "bdev_nvme_start_mdns_discovery", 00:04:48.129 "bdev_nvme_set_multipath_policy", 00:04:48.129 "bdev_nvme_set_preferred_path", 00:04:48.129 "bdev_nvme_get_io_paths", 00:04:48.129 "bdev_nvme_remove_error_injection", 00:04:48.129 "bdev_nvme_add_error_injection", 00:04:48.129 "bdev_nvme_get_discovery_info", 00:04:48.129 "bdev_nvme_stop_discovery", 00:04:48.129 "bdev_nvme_start_discovery", 00:04:48.129 "bdev_nvme_get_controller_health_info", 00:04:48.129 "bdev_nvme_disable_controller", 00:04:48.129 "bdev_nvme_enable_controller", 00:04:48.129 "bdev_nvme_reset_controller", 00:04:48.129 "bdev_nvme_get_transport_statistics", 00:04:48.129 "bdev_nvme_apply_firmware", 00:04:48.129 "bdev_nvme_detach_controller", 00:04:48.129 "bdev_nvme_get_controllers", 00:04:48.129 "bdev_nvme_attach_controller", 00:04:48.129 "bdev_nvme_set_hotplug", 00:04:48.129 "bdev_nvme_set_options", 00:04:48.129 "bdev_null_resize", 00:04:48.129 "bdev_null_delete", 00:04:48.129 "bdev_null_create", 00:04:48.129 "bdev_malloc_delete", 00:04:48.129 "bdev_malloc_create" 00:04:48.129 ] 00:04:48.129 11:28:17 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:04:48.129 11:28:17 -- common/autotest_common.sh@718 -- # xtrace_disable 00:04:48.129 11:28:17 -- common/autotest_common.sh@10 -- # set +x 00:04:48.129 11:28:17 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:04:48.129 11:28:17 -- spdkcli/tcp.sh@38 -- # killprocess 2038153 00:04:48.129 11:28:17 -- common/autotest_common.sh@926 -- # '[' -z 2038153 ']' 00:04:48.129 11:28:17 -- common/autotest_common.sh@930 -- # kill -0 2038153 00:04:48.129 11:28:17 -- common/autotest_common.sh@931 -- # uname 00:04:48.129 11:28:17 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:04:48.129 11:28:17 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2038153 00:04:48.129 11:28:17 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:04:48.129 11:28:17 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:04:48.129 11:28:17 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2038153' 00:04:48.129 killing process with pid 2038153 00:04:48.129 11:28:17 -- common/autotest_common.sh@945 -- # kill 2038153 00:04:48.129 11:28:17 -- common/autotest_common.sh@950 -- # wait 2038153 00:04:48.696 00:04:48.696 real 0m1.487s 00:04:48.696 user 0m2.770s 00:04:48.696 sys 0m0.460s 00:04:48.696 11:28:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:48.696 11:28:17 -- common/autotest_common.sh@10 -- # set +x 00:04:48.696 ************************************ 00:04:48.696 END TEST spdkcli_tcp 00:04:48.696 ************************************ 00:04:48.696 11:28:17 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:04:48.696 11:28:17 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:48.696 11:28:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:48.696 11:28:17 -- common/autotest_common.sh@10 -- # set +x 00:04:48.696 ************************************ 00:04:48.696 START TEST dpdk_mem_utility 00:04:48.696 ************************************ 00:04:48.696 11:28:17 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:04:48.696 * Looking for test storage... 00:04:48.696 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:04:48.696 11:28:17 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:04:48.696 11:28:17 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2038481 00:04:48.696 11:28:17 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2038481 00:04:48.696 11:28:17 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:48.696 11:28:17 -- common/autotest_common.sh@819 -- # '[' -z 2038481 ']' 00:04:48.696 11:28:17 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:48.696 11:28:17 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:48.696 11:28:17 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:48.696 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:04:48.696 11:28:17 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:48.696 11:28:17 -- common/autotest_common.sh@10 -- # set +x 00:04:48.696 [2024-07-21 11:28:17.985393] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:04:48.696 [2024-07-21 11:28:17.985470] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2038481 ] 00:04:48.696 EAL: No free 2048 kB hugepages reported on node 1 00:04:48.696 [2024-07-21 11:28:18.053739] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:48.696 [2024-07-21 11:28:18.091421] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:48.696 [2024-07-21 11:28:18.091553] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:49.633 11:28:18 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:49.633 11:28:18 -- common/autotest_common.sh@852 -- # return 0 00:04:49.633 11:28:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:04:49.633 11:28:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:04:49.633 11:28:18 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:49.633 11:28:18 -- common/autotest_common.sh@10 -- # set +x 00:04:49.633 { 00:04:49.633 "filename": "/tmp/spdk_mem_dump.txt" 00:04:49.633 } 00:04:49.633 11:28:18 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:49.633 11:28:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:04:49.633 DPDK memory size 814.000000 MiB in 1 heap(s) 00:04:49.633 1 heaps totaling size 814.000000 MiB 00:04:49.633 size: 814.000000 MiB heap id: 0 00:04:49.633 end heaps---------- 00:04:49.633 8 mempools totaling size 598.116089 MiB 00:04:49.633 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:04:49.633 size: 158.602051 MiB name: PDU_data_out_Pool 00:04:49.633 size: 84.521057 MiB name: bdev_io_2038481 00:04:49.633 size: 51.011292 MiB name: evtpool_2038481 00:04:49.633 size: 50.003479 MiB name: msgpool_2038481 00:04:49.633 size: 21.763794 MiB name: PDU_Pool 00:04:49.633 size: 19.513306 MiB name: SCSI_TASK_Pool 00:04:49.633 size: 0.026123 MiB name: Session_Pool 00:04:49.633 end mempools------- 00:04:49.633 6 memzones totaling size 4.142822 MiB 00:04:49.633 size: 1.000366 MiB name: RG_ring_0_2038481 00:04:49.633 size: 1.000366 MiB name: RG_ring_1_2038481 00:04:49.633 size: 1.000366 MiB name: RG_ring_4_2038481 00:04:49.633 size: 1.000366 MiB name: RG_ring_5_2038481 00:04:49.633 size: 0.125366 MiB name: RG_ring_2_2038481 00:04:49.634 size: 0.015991 MiB name: RG_ring_3_2038481 00:04:49.634 end memzones------- 00:04:49.634 11:28:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:04:49.634 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:04:49.634 list of free elements. 
size: 12.519348 MiB 00:04:49.634 element at address: 0x200000400000 with size: 1.999512 MiB 00:04:49.634 element at address: 0x200018e00000 with size: 0.999878 MiB 00:04:49.634 element at address: 0x200019000000 with size: 0.999878 MiB 00:04:49.634 element at address: 0x200003e00000 with size: 0.996277 MiB 00:04:49.634 element at address: 0x200031c00000 with size: 0.994446 MiB 00:04:49.634 element at address: 0x200013800000 with size: 0.978699 MiB 00:04:49.634 element at address: 0x200007000000 with size: 0.959839 MiB 00:04:49.634 element at address: 0x200019200000 with size: 0.936584 MiB 00:04:49.634 element at address: 0x200000200000 with size: 0.841614 MiB 00:04:49.634 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:04:49.634 element at address: 0x20000b200000 with size: 0.490723 MiB 00:04:49.634 element at address: 0x200000800000 with size: 0.487793 MiB 00:04:49.634 element at address: 0x200019400000 with size: 0.485657 MiB 00:04:49.634 element at address: 0x200027e00000 with size: 0.410034 MiB 00:04:49.634 element at address: 0x200003a00000 with size: 0.355530 MiB 00:04:49.634 list of standard malloc elements. size: 199.218079 MiB 00:04:49.634 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:04:49.634 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:04:49.634 element at address: 0x200018efff80 with size: 1.000122 MiB 00:04:49.634 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:04:49.634 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:04:49.634 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:04:49.634 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:04:49.634 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:04:49.634 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:04:49.634 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:04:49.634 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:04:49.634 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200003adb300 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200003adb500 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200003affa80 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200003affb40 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:04:49.634 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:04:49.634 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:04:49.634 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:04:49.634 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:04:49.634 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:04:49.634 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:04:49.634 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200027e69040 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:04:49.634 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:04:49.634 list of memzone associated elements. size: 602.262573 MiB 00:04:49.634 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:04:49.634 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:04:49.634 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:04:49.634 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:04:49.634 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:04:49.634 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2038481_0 00:04:49.634 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:04:49.634 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2038481_0 00:04:49.634 element at address: 0x200003fff380 with size: 48.003052 MiB 00:04:49.634 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2038481_0 00:04:49.634 element at address: 0x2000195be940 with size: 20.255554 MiB 00:04:49.634 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:04:49.634 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:04:49.634 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:04:49.634 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:04:49.634 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2038481 00:04:49.634 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:04:49.634 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2038481 00:04:49.634 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:04:49.634 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2038481 00:04:49.634 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:04:49.634 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:04:49.634 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:04:49.634 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:04:49.634 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:04:49.634 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:04:49.634 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:04:49.634 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:04:49.634 element at address: 0x200003eff180 with size: 1.000488 MiB 00:04:49.634 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2038481 00:04:49.634 element at address: 0x200003affc00 with size: 1.000488 MiB 00:04:49.634 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2038481 00:04:49.634 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:04:49.634 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2038481 00:04:49.634 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:04:49.634 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2038481 00:04:49.634 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:04:49.634 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2038481 00:04:49.634 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:04:49.634 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:04:49.634 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:04:49.634 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:04:49.634 element at address: 0x20001947c540 with size: 0.250488 MiB 00:04:49.634 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:04:49.634 element at address: 0x200003adf880 with size: 0.125488 MiB 00:04:49.634 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2038481 00:04:49.634 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:04:49.634 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:04:49.634 element at address: 0x200027e69100 with size: 0.023743 MiB 00:04:49.634 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:04:49.634 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:04:49.634 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2038481 00:04:49.634 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:04:49.634 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:04:49.634 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:04:49.634 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2038481 00:04:49.634 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:04:49.634 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2038481 00:04:49.634 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:04:49.634 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:04:49.634 11:28:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:04:49.634 11:28:18 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2038481 00:04:49.634 11:28:18 -- common/autotest_common.sh@926 -- # '[' -z 2038481 ']' 00:04:49.634 11:28:18 -- common/autotest_common.sh@930 -- # kill -0 2038481 00:04:49.634 11:28:18 -- common/autotest_common.sh@931 -- # uname 00:04:49.634 11:28:18 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:04:49.634 11:28:18 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2038481 00:04:49.634 11:28:18 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:04:49.634 11:28:18 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:04:49.634 11:28:18 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2038481' 00:04:49.634 killing process with pid 2038481 00:04:49.634 11:28:18 -- common/autotest_common.sh@945 -- # kill 2038481 00:04:49.634 11:28:18 -- common/autotest_common.sh@950 -- # wait 2038481 00:04:49.894 00:04:49.894 real 0m1.393s 00:04:49.894 user 0m1.442s 00:04:49.894 sys 0m0.428s 00:04:49.894 11:28:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:49.894 11:28:19 -- common/autotest_common.sh@10 -- # set +x 00:04:49.894 ************************************ 00:04:49.894 END TEST dpdk_mem_utility 00:04:49.894 ************************************ 00:04:49.894 11:28:19 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:04:49.894 11:28:19 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:49.894 11:28:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:49.894 11:28:19 -- common/autotest_common.sh@10 -- # set +x 
00:04:49.894 ************************************ 00:04:49.894 START TEST event 00:04:49.894 ************************************ 00:04:49.894 11:28:19 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:04:50.154 * Looking for test storage... 00:04:50.154 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:04:50.154 11:28:19 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:04:50.154 11:28:19 -- bdev/nbd_common.sh@6 -- # set -e 00:04:50.154 11:28:19 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:04:50.154 11:28:19 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:04:50.154 11:28:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:50.154 11:28:19 -- common/autotest_common.sh@10 -- # set +x 00:04:50.154 ************************************ 00:04:50.154 START TEST event_perf 00:04:50.154 ************************************ 00:04:50.154 11:28:19 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:04:50.154 Running I/O for 1 seconds...[2024-07-21 11:28:19.424605] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:04:50.154 [2024-07-21 11:28:19.424704] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2038757 ] 00:04:50.154 EAL: No free 2048 kB hugepages reported on node 1 00:04:50.154 [2024-07-21 11:28:19.497245] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:04:50.154 [2024-07-21 11:28:19.535925] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:50.154 [2024-07-21 11:28:19.536020] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:04:50.154 [2024-07-21 11:28:19.536103] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:04:50.154 [2024-07-21 11:28:19.536105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:51.529 Running I/O for 1 seconds... 00:04:51.529 lcore 0: 190416 00:04:51.529 lcore 1: 190416 00:04:51.529 lcore 2: 190417 00:04:51.529 lcore 3: 190417 00:04:51.529 done. 
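Note: event_perf above pins one reactor per core in mask 0xF and prints one "lcore N: count" line per reactor after the one-second run. A small hedged helper to total those counts (illustrative only):

    # Sum the per-lcore event counts from event_perf output.
    test/event/event_perf/event_perf -m 0xF -t 1 \
        | awk '/^lcore/ {sum += $3} END {print sum, "events total"}'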
00:04:51.529 00:04:51.529 real 0m1.188s 00:04:51.529 user 0m4.095s 00:04:51.529 sys 0m0.091s 00:04:51.529 11:28:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:51.529 11:28:20 -- common/autotest_common.sh@10 -- # set +x 00:04:51.529 ************************************ 00:04:51.529 END TEST event_perf 00:04:51.529 ************************************ 00:04:51.529 11:28:20 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:04:51.529 11:28:20 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:04:51.529 11:28:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:51.529 11:28:20 -- common/autotest_common.sh@10 -- # set +x 00:04:51.529 ************************************ 00:04:51.529 START TEST event_reactor 00:04:51.529 ************************************ 00:04:51.529 11:28:20 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:04:51.529 [2024-07-21 11:28:20.658256] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:04:51.529 [2024-07-21 11:28:20.658351] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2039013 ] 00:04:51.529 EAL: No free 2048 kB hugepages reported on node 1 00:04:51.529 [2024-07-21 11:28:20.730271] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:51.529 [2024-07-21 11:28:20.764532] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:52.464 test_start 00:04:52.464 oneshot 00:04:52.464 tick 100 00:04:52.464 tick 100 00:04:52.464 tick 250 00:04:52.464 tick 100 00:04:52.464 tick 100 00:04:52.464 tick 100 00:04:52.464 tick 500 00:04:52.464 tick 250 00:04:52.464 tick 100 00:04:52.464 tick 100 00:04:52.464 tick 250 00:04:52.464 tick 100 00:04:52.464 tick 100 00:04:52.464 test_end 00:04:52.464 00:04:52.464 real 0m1.179s 00:04:52.464 user 0m1.085s 00:04:52.464 sys 0m0.090s 00:04:52.464 11:28:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:52.464 11:28:21 -- common/autotest_common.sh@10 -- # set +x 00:04:52.464 ************************************ 00:04:52.464 END TEST event_reactor 00:04:52.464 ************************************ 00:04:52.464 11:28:21 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:04:52.464 11:28:21 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:04:52.464 11:28:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:52.464 11:28:21 -- common/autotest_common.sh@10 -- # set +x 00:04:52.464 ************************************ 00:04:52.464 START TEST event_reactor_perf 00:04:52.464 ************************************ 00:04:52.464 11:28:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:04:52.464 [2024-07-21 11:28:21.886533] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:04:52.464 [2024-07-21 11:28:21.886636] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2039295 ] 00:04:52.723 EAL: No free 2048 kB hugepages reported on node 1 00:04:52.723 [2024-07-21 11:28:21.957270] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:52.723 [2024-07-21 11:28:21.991404] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:53.657 test_start 00:04:53.657 test_end 00:04:53.657 Performance: 891301 events per second 00:04:53.657 00:04:53.657 real 0m1.179s 00:04:53.657 user 0m1.091s 00:04:53.657 sys 0m0.084s 00:04:53.657 11:28:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:53.657 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:53.657 ************************************ 00:04:53.657 END TEST event_reactor_perf 00:04:53.657 ************************************ 00:04:53.915 11:28:23 -- event/event.sh@49 -- # uname -s 00:04:53.915 11:28:23 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:04:53.915 11:28:23 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:04:53.915 11:28:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:53.915 11:28:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:53.915 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:53.915 ************************************ 00:04:53.915 START TEST event_scheduler 00:04:53.915 ************************************ 00:04:53.915 11:28:23 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:04:53.915 * Looking for test storage... 00:04:53.915 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:04:53.915 11:28:23 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:04:53.915 11:28:23 -- scheduler/scheduler.sh@35 -- # scheduler_pid=2039602 00:04:53.915 11:28:23 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:04:53.915 11:28:23 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:04:53.915 11:28:23 -- scheduler/scheduler.sh@37 -- # waitforlisten 2039602 00:04:53.915 11:28:23 -- common/autotest_common.sh@819 -- # '[' -z 2039602 ']' 00:04:53.915 11:28:23 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:53.915 11:28:23 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:53.915 11:28:23 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:53.915 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:53.915 11:28:23 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:53.915 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:53.915 [2024-07-21 11:28:23.221288] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:04:53.915 [2024-07-21 11:28:23.221366] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2039602 ] 00:04:53.915 EAL: No free 2048 kB hugepages reported on node 1 00:04:53.915 [2024-07-21 11:28:23.288601] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:04:53.915 [2024-07-21 11:28:23.330673] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:53.915 [2024-07-21 11:28:23.330761] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:53.915 [2024-07-21 11:28:23.330845] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:04:53.915 [2024-07-21 11:28:23.330846] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:04:54.173 11:28:23 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:54.173 11:28:23 -- common/autotest_common.sh@852 -- # return 0 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:04:54.173 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.173 POWER: Env isn't set yet! 00:04:54.173 POWER: Attempting to initialise ACPI cpufreq power management... 00:04:54.173 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:04:54.173 POWER: Cannot set governor of lcore 0 to userspace 00:04:54.173 POWER: Attempting to initialise PSTAT power management... 00:04:54.173 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:04:54.173 POWER: Initialized successfully for lcore 0 power management 00:04:54.173 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:04:54.173 POWER: Initialized successfully for lcore 1 power management 00:04:54.173 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:04:54.173 POWER: Initialized successfully for lcore 2 power management 00:04:54.173 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:04:54.173 POWER: Initialized successfully for lcore 3 power management 00:04:54.173 [2024-07-21 11:28:23.454610] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:04:54.173 [2024-07-21 11:28:23.454625] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:04:54.173 [2024-07-21 11:28:23.454635] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:04:54.173 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:04:54.173 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.173 [2024-07-21 11:28:23.516665] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
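Note: once framework_start_init completes, the dynamic scheduler is driven over RPC; both methods used here appear in the rpc_get_methods list earlier in this log. A minimal check against a running target (socket path assumed to be the default):

    # Switch the active scheduler and read the setting back.
    scripts/rpc.py -s /var/tmp/spdk.sock framework_set_scheduler dynamic
    scripts/rpc.py -s /var/tmp/spdk.sock framework_get_scheduler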
00:04:54.173 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:04:54.173 11:28:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:54.173 11:28:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.173 ************************************ 00:04:54.173 START TEST scheduler_create_thread 00:04:54.173 ************************************ 00:04:54.173 11:28:23 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:04:54.173 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.173 2 00:04:54.173 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:04:54.173 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.173 3 00:04:54.173 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:04:54.173 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.173 4 00:04:54.173 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:04:54.173 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.173 5 00:04:54.173 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:04:54.173 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.173 6 00:04:54.173 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.173 11:28:23 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:04:54.173 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.173 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.432 7 00:04:54.432 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.432 11:28:23 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:04:54.432 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.432 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.432 8 00:04:54.432 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.432 11:28:23 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:04:54.432 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.432 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.432 9 00:04:54.432 
11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.432 11:28:23 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:04:54.432 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.432 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.432 10 00:04:54.432 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.432 11:28:23 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:04:54.432 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.432 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:54.432 11:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:54.432 11:28:23 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:04:54.432 11:28:23 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:04:54.432 11:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:54.432 11:28:23 -- common/autotest_common.sh@10 -- # set +x 00:04:55.078 11:28:24 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:55.335 11:28:24 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:04:55.335 11:28:24 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:55.335 11:28:24 -- common/autotest_common.sh@10 -- # set +x 00:04:56.706 11:28:25 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:56.706 11:28:25 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:04:56.706 11:28:25 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:04:56.706 11:28:25 -- common/autotest_common.sh@551 -- # xtrace_disable 00:04:56.706 11:28:25 -- common/autotest_common.sh@10 -- # set +x 00:04:57.640 11:28:26 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:04:57.640 00:04:57.640 real 0m3.382s 00:04:57.640 user 0m0.027s 00:04:57.640 sys 0m0.002s 00:04:57.640 11:28:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:57.640 11:28:26 -- common/autotest_common.sh@10 -- # set +x 00:04:57.640 ************************************ 00:04:57.640 END TEST scheduler_create_thread 00:04:57.640 ************************************ 00:04:57.640 11:28:26 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:04:57.640 11:28:26 -- scheduler/scheduler.sh@46 -- # killprocess 2039602 00:04:57.640 11:28:26 -- common/autotest_common.sh@926 -- # '[' -z 2039602 ']' 00:04:57.640 11:28:26 -- common/autotest_common.sh@930 -- # kill -0 2039602 00:04:57.640 11:28:26 -- common/autotest_common.sh@931 -- # uname 00:04:57.640 11:28:26 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:04:57.640 11:28:26 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2039602 00:04:57.640 11:28:26 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:04:57.640 11:28:27 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:04:57.640 11:28:27 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2039602' 00:04:57.640 killing process with pid 2039602 00:04:57.640 11:28:27 -- common/autotest_common.sh@945 -- # kill 2039602 00:04:57.640 11:28:27 -- common/autotest_common.sh@950 -- # wait 2039602 00:04:57.897 [2024-07-21 11:28:27.288391] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
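For reference, the scheduler_create_thread body traced above reduces to the shell sketch below. rpc_cmd is the test tree's wrapper around scripts/rpc.py; the loop and the thread_id captures are a paraphrase of the trace, not verbatim source, and the ids 11 and 12 are simply what this run happened to get.

    # four busy threads and four idle threads, one per core mask
    for mask in 0x1 0x2 0x4 0x8; do
      rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m $mask -a 100
      rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m $mask -a 0
    done
    rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
    thread_id=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
    rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active "$thread_id" 50   # id 11 in this run
    thread_id=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
    rpc_cmd --plugin scheduler_plugin scheduler_thread_delete "$thread_id"          # id 12 in this run

The -m flag pins a thread to a core mask and -a sets its claimed busy percentage, which is what gives the dynamic scheduler something to rebalance before the test shuts down.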
00:04:58.156 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:04:58.156 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:04:58.156 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:04:58.156 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:04:58.156 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:04:58.156 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:04:58.156 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:04:58.156 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:04:58.156 00:04:58.156 real 0m4.397s 00:04:58.156 user 0m7.861s 00:04:58.156 sys 0m0.324s 00:04:58.156 11:28:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:58.156 11:28:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.156 ************************************ 00:04:58.156 END TEST event_scheduler 00:04:58.156 ************************************ 00:04:58.156 11:28:27 -- event/event.sh@51 -- # modprobe -n nbd 00:04:58.156 11:28:27 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:04:58.156 11:28:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:58.156 11:28:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:58.156 11:28:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.156 ************************************ 00:04:58.156 START TEST app_repeat 00:04:58.156 ************************************ 00:04:58.156 11:28:27 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:04:58.156 11:28:27 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:58.156 11:28:27 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:58.156 11:28:27 -- event/event.sh@13 -- # local nbd_list 00:04:58.156 11:28:27 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:58.156 11:28:27 -- event/event.sh@14 -- # local bdev_list 00:04:58.156 11:28:27 -- event/event.sh@15 -- # local repeat_times=4 00:04:58.156 11:28:27 -- event/event.sh@17 -- # modprobe nbd 00:04:58.156 11:28:27 -- event/event.sh@19 -- # repeat_pid=2040469 00:04:58.156 11:28:27 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:04:58.156 11:28:27 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:04:58.156 11:28:27 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2040469' 00:04:58.156 Process app_repeat pid: 2040469 00:04:58.156 11:28:27 -- event/event.sh@23 -- # for i in {0..2} 00:04:58.156 11:28:27 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:04:58.156 spdk_app_start Round 0 00:04:58.156 11:28:27 -- event/event.sh@25 -- # waitforlisten 2040469 /var/tmp/spdk-nbd.sock 00:04:58.156 11:28:27 -- common/autotest_common.sh@819 -- # '[' -z 2040469 ']' 00:04:58.156 11:28:27 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:04:58.156 11:28:27 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:58.156 11:28:27 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:04:58.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:04:58.156 11:28:27 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:58.156 11:28:27 -- common/autotest_common.sh@10 -- # set +x 00:04:58.415 [2024-07-21 11:28:27.582835] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:04:58.415 [2024-07-21 11:28:27.582929] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2040469 ] 00:04:58.415 EAL: No free 2048 kB hugepages reported on node 1 00:04:58.415 [2024-07-21 11:28:27.653057] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:04:58.415 [2024-07-21 11:28:27.688850] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:58.415 [2024-07-21 11:28:27.688853] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:58.980 11:28:28 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:58.980 11:28:28 -- common/autotest_common.sh@852 -- # return 0 00:04:58.980 11:28:28 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:59.238 Malloc0 00:04:59.238 11:28:28 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:59.496 Malloc1 00:04:59.496 11:28:28 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@12 -- # local i 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:59.496 11:28:28 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:04:59.496 /dev/nbd0 00:04:59.754 11:28:28 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:04:59.754 11:28:28 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:04:59.754 11:28:28 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:04:59.754 11:28:28 -- common/autotest_common.sh@857 -- # local i 00:04:59.754 11:28:28 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:04:59.754 11:28:28 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:04:59.754 11:28:28 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:04:59.754 11:28:28 -- 
common/autotest_common.sh@861 -- # break 00:04:59.754 11:28:28 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:04:59.754 11:28:28 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:04:59.754 11:28:28 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:04:59.754 1+0 records in 00:04:59.754 1+0 records out 00:04:59.754 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230281 s, 17.8 MB/s 00:04:59.754 11:28:28 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:04:59.754 11:28:28 -- common/autotest_common.sh@874 -- # size=4096 00:04:59.754 11:28:28 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:04:59.754 11:28:28 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:04:59.754 11:28:28 -- common/autotest_common.sh@877 -- # return 0 00:04:59.754 11:28:28 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:59.754 11:28:28 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:59.754 11:28:28 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:04:59.754 /dev/nbd1 00:04:59.754 11:28:29 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:04:59.754 11:28:29 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:04:59.754 11:28:29 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:04:59.754 11:28:29 -- common/autotest_common.sh@857 -- # local i 00:04:59.754 11:28:29 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:04:59.754 11:28:29 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:04:59.754 11:28:29 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:04:59.754 11:28:29 -- common/autotest_common.sh@861 -- # break 00:04:59.754 11:28:29 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:04:59.754 11:28:29 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:04:59.754 11:28:29 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:04:59.754 1+0 records in 00:04:59.754 1+0 records out 00:04:59.754 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259984 s, 15.8 MB/s 00:04:59.754 11:28:29 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:04:59.754 11:28:29 -- common/autotest_common.sh@874 -- # size=4096 00:04:59.754 11:28:29 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:04:59.754 11:28:29 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:04:59.754 11:28:29 -- common/autotest_common.sh@877 -- # return 0 00:04:59.754 11:28:29 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:59.754 11:28:29 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:59.754 11:28:29 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:04:59.754 11:28:29 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:59.754 11:28:29 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:00.012 { 00:05:00.012 "nbd_device": "/dev/nbd0", 00:05:00.012 "bdev_name": "Malloc0" 00:05:00.012 }, 00:05:00.012 { 00:05:00.012 "nbd_device": 
"/dev/nbd1", 00:05:00.012 "bdev_name": "Malloc1" 00:05:00.012 } 00:05:00.012 ]' 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:00.012 { 00:05:00.012 "nbd_device": "/dev/nbd0", 00:05:00.012 "bdev_name": "Malloc0" 00:05:00.012 }, 00:05:00.012 { 00:05:00.012 "nbd_device": "/dev/nbd1", 00:05:00.012 "bdev_name": "Malloc1" 00:05:00.012 } 00:05:00.012 ]' 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:00.012 /dev/nbd1' 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:00.012 /dev/nbd1' 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@65 -- # count=2 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@95 -- # count=2 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:00.012 256+0 records in 00:05:00.012 256+0 records out 00:05:00.012 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111472 s, 94.1 MB/s 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:00.012 256+0 records in 00:05:00.012 256+0 records out 00:05:00.012 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198645 s, 52.8 MB/s 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:00.012 11:28:29 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:00.270 256+0 records in 00:05:00.270 256+0 records out 00:05:00.270 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0216003 s, 48.5 MB/s 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:00.270 11:28:29 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@51 -- # local i 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@41 -- # break 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@45 -- # return 0 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:00.270 11:28:29 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@41 -- # break 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@45 -- # return 0 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:00.529 11:28:29 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@65 -- # true 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@65 -- # count=0 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@104 -- # count=0 00:05:00.787 11:28:30 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:00.788 11:28:30 -- bdev/nbd_common.sh@109 -- # return 0 00:05:00.788 11:28:30 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
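The nbd_dd_data_verify calls in the round above boil down to one write-then-verify cycle across the device pair; a sketch, with the long workspace path shortened to $tmp (everything else mirrors the dd and cmp invocations in the trace):

    tmp=./nbdrandtest                                    # stands in for the full test path above
    dd if=/dev/urandom of=$tmp bs=4096 count=256         # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
      dd if=$tmp of=$nbd bs=4096 count=256 oflag=direct  # push the pattern through each nbd
    done
    for nbd in /dev/nbd0 /dev/nbd1; do
      cmp -b -n 1M $tmp $nbd                             # read back and compare byte-for-byte
    done
    rm $tmp

A silent cmp on both devices is what lets the round proceed to nbd_stop_disks; any mismatch would fail the test at this point.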
00:05:01.045 11:28:30 -- event/event.sh@35 -- # sleep 3 00:05:01.045 [2024-07-21 11:28:30.427212] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:01.045 [2024-07-21 11:28:30.460093] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:01.045 [2024-07-21 11:28:30.460096] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:01.304 [2024-07-21 11:28:30.501328] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:01.304 [2024-07-21 11:28:30.501370] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:03.833 11:28:33 -- event/event.sh@23 -- # for i in {0..2} 00:05:03.833 11:28:33 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:03.833 spdk_app_start Round 1 00:05:03.833 11:28:33 -- event/event.sh@25 -- # waitforlisten 2040469 /var/tmp/spdk-nbd.sock 00:05:03.833 11:28:33 -- common/autotest_common.sh@819 -- # '[' -z 2040469 ']' 00:05:03.833 11:28:33 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:03.833 11:28:33 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:03.833 11:28:33 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:03.833 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:04.091 11:28:33 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:04.091 11:28:33 -- common/autotest_common.sh@10 -- # set +x 00:05:04.091 11:28:33 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:04.091 11:28:33 -- common/autotest_common.sh@852 -- # return 0 00:05:04.091 11:28:33 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:04.349 Malloc0 00:05:04.349 11:28:33 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:04.607 Malloc1 00:05:04.607 11:28:33 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@12 -- # local i 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:04.607 
/dev/nbd0 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:04.607 11:28:33 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:04.607 11:28:33 -- common/autotest_common.sh@857 -- # local i 00:05:04.607 11:28:33 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:04.607 11:28:33 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:04.607 11:28:33 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:04.607 11:28:33 -- common/autotest_common.sh@861 -- # break 00:05:04.607 11:28:33 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:04.607 11:28:33 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:04.607 11:28:33 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:04.607 1+0 records in 00:05:04.607 1+0 records out 00:05:04.607 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266672 s, 15.4 MB/s 00:05:04.607 11:28:33 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:04.607 11:28:33 -- common/autotest_common.sh@874 -- # size=4096 00:05:04.607 11:28:33 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:04.607 11:28:33 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:04.607 11:28:33 -- common/autotest_common.sh@877 -- # return 0 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:04.607 11:28:33 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:04.864 /dev/nbd1 00:05:04.864 11:28:34 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:04.864 11:28:34 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:04.864 11:28:34 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:04.864 11:28:34 -- common/autotest_common.sh@857 -- # local i 00:05:04.864 11:28:34 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:04.864 11:28:34 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:04.864 11:28:34 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:04.864 11:28:34 -- common/autotest_common.sh@861 -- # break 00:05:04.864 11:28:34 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:04.864 11:28:34 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:04.864 11:28:34 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:04.864 1+0 records in 00:05:04.864 1+0 records out 00:05:04.864 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000185262 s, 22.1 MB/s 00:05:04.864 11:28:34 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:04.864 11:28:34 -- common/autotest_common.sh@874 -- # size=4096 00:05:04.864 11:28:34 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:04.864 11:28:34 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:04.864 11:28:34 -- common/autotest_common.sh@877 -- # return 0 00:05:04.864 11:28:34 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:04.864 11:28:34 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:05:04.864 11:28:34 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:04.864 11:28:34 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:04.864 11:28:34 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:05.122 { 00:05:05.122 "nbd_device": "/dev/nbd0", 00:05:05.122 "bdev_name": "Malloc0" 00:05:05.122 }, 00:05:05.122 { 00:05:05.122 "nbd_device": "/dev/nbd1", 00:05:05.122 "bdev_name": "Malloc1" 00:05:05.122 } 00:05:05.122 ]' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:05.122 { 00:05:05.122 "nbd_device": "/dev/nbd0", 00:05:05.122 "bdev_name": "Malloc0" 00:05:05.122 }, 00:05:05.122 { 00:05:05.122 "nbd_device": "/dev/nbd1", 00:05:05.122 "bdev_name": "Malloc1" 00:05:05.122 } 00:05:05.122 ]' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:05.122 /dev/nbd1' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:05.122 /dev/nbd1' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@65 -- # count=2 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@95 -- # count=2 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:05.122 256+0 records in 00:05:05.122 256+0 records out 00:05:05.122 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114166 s, 91.8 MB/s 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:05.122 256+0 records in 00:05:05.122 256+0 records out 00:05:05.122 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202862 s, 51.7 MB/s 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:05.122 256+0 records in 00:05:05.122 256+0 records out 00:05:05.122 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0215126 s, 48.7 MB/s 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:05.122 11:28:34 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@51 -- # local i 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:05.123 11:28:34 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:05.380 11:28:34 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:05.380 11:28:34 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:05.380 11:28:34 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:05.380 11:28:34 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:05.380 11:28:34 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:05.381 11:28:34 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:05.381 11:28:34 -- bdev/nbd_common.sh@41 -- # break 00:05:05.381 11:28:34 -- bdev/nbd_common.sh@45 -- # return 0 00:05:05.381 11:28:34 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:05.381 11:28:34 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@41 -- # break 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@45 -- # return 0 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:05.639 11:28:34 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:05.639 11:28:35 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:05.639 11:28:35 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:05.639 11:28:35 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:05.898 11:28:35 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:05.898 11:28:35 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:05.898 11:28:35 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:05.898 11:28:35 -- bdev/nbd_common.sh@65 -- # true 00:05:05.898 11:28:35 -- bdev/nbd_common.sh@65 -- # count=0 00:05:05.898 11:28:35 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:05.898 11:28:35 -- bdev/nbd_common.sh@104 -- # count=0 00:05:05.898 11:28:35 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:05.898 11:28:35 -- bdev/nbd_common.sh@109 -- # return 0 00:05:05.898 11:28:35 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:05.898 11:28:35 -- event/event.sh@35 -- # sleep 3 00:05:06.157 [2024-07-21 11:28:35.461196] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:06.157 [2024-07-21 11:28:35.494360] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:06.157 [2024-07-21 11:28:35.494362] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:06.157 [2024-07-21 11:28:35.535131] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:06.157 [2024-07-21 11:28:35.535176] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:09.438 11:28:38 -- event/event.sh@23 -- # for i in {0..2} 00:05:09.438 11:28:38 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:09.438 spdk_app_start Round 2 00:05:09.438 11:28:38 -- event/event.sh@25 -- # waitforlisten 2040469 /var/tmp/spdk-nbd.sock 00:05:09.438 11:28:38 -- common/autotest_common.sh@819 -- # '[' -z 2040469 ']' 00:05:09.438 11:28:38 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:09.438 11:28:38 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:09.438 11:28:38 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:09.438 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
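waitfornbd, exercised once per device in every round of these traces, polls /proc/partitions and then proves the device actually serves I/O with a single direct read. A minimal sketch under two stated assumptions: the sleep interval is not visible in the log, and the retry loop around the read is collapsed to one attempt here.

    waitfornbd() {
      local nbd_name=$1 i
      for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1                      # interval assumed; the trace only shows the loop bounds
      done
      dd if=/dev/$nbd_name of=/tmp/nbdtest bs=4096 count=1 iflag=direct
      local size=$(stat -c %s /tmp/nbdtest)
      rm -f /tmp/nbdtest
      [ "$size" != 0 ]                 # a non-empty 4 KiB read means the device is live
    }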
00:05:09.438 11:28:38 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:09.438 11:28:38 -- common/autotest_common.sh@10 -- # set +x 00:05:09.438 11:28:38 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:09.438 11:28:38 -- common/autotest_common.sh@852 -- # return 0 00:05:09.438 11:28:38 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:09.438 Malloc0 00:05:09.438 11:28:38 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:09.438 Malloc1 00:05:09.438 11:28:38 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@12 -- # local i 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:09.438 11:28:38 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:09.697 /dev/nbd0 00:05:09.697 11:28:38 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:09.697 11:28:38 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:09.697 11:28:38 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:09.697 11:28:38 -- common/autotest_common.sh@857 -- # local i 00:05:09.697 11:28:38 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:09.697 11:28:38 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:09.697 11:28:38 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:09.697 11:28:38 -- common/autotest_common.sh@861 -- # break 00:05:09.697 11:28:38 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:09.697 11:28:38 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:09.697 11:28:38 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:09.697 1+0 records in 00:05:09.697 1+0 records out 00:05:09.697 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228315 s, 17.9 MB/s 00:05:09.697 11:28:38 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:09.697 11:28:39 -- common/autotest_common.sh@874 -- # size=4096 00:05:09.697 11:28:39 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:09.697 11:28:39 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:09.697 11:28:39 -- common/autotest_common.sh@877 -- # return 0 00:05:09.697 11:28:39 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:09.697 11:28:39 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:09.697 11:28:39 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:09.955 /dev/nbd1 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:09.956 11:28:39 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:09.956 11:28:39 -- common/autotest_common.sh@857 -- # local i 00:05:09.956 11:28:39 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:09.956 11:28:39 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:09.956 11:28:39 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:09.956 11:28:39 -- common/autotest_common.sh@861 -- # break 00:05:09.956 11:28:39 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:09.956 11:28:39 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:09.956 11:28:39 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:09.956 1+0 records in 00:05:09.956 1+0 records out 00:05:09.956 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241885 s, 16.9 MB/s 00:05:09.956 11:28:39 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:09.956 11:28:39 -- common/autotest_common.sh@874 -- # size=4096 00:05:09.956 11:28:39 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:09.956 11:28:39 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:09.956 11:28:39 -- common/autotest_common.sh@877 -- # return 0 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:09.956 { 00:05:09.956 "nbd_device": "/dev/nbd0", 00:05:09.956 "bdev_name": "Malloc0" 00:05:09.956 }, 00:05:09.956 { 00:05:09.956 "nbd_device": "/dev/nbd1", 00:05:09.956 "bdev_name": "Malloc1" 00:05:09.956 } 00:05:09.956 ]' 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:09.956 { 00:05:09.956 "nbd_device": "/dev/nbd0", 00:05:09.956 "bdev_name": "Malloc0" 00:05:09.956 }, 00:05:09.956 { 00:05:09.956 "nbd_device": "/dev/nbd1", 00:05:09.956 "bdev_name": "Malloc1" 00:05:09.956 } 00:05:09.956 ]' 00:05:09.956 11:28:39 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:10.214 /dev/nbd1' 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:10.214 /dev/nbd1' 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@65 -- # count=2 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:10.214 11:28:39 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:10.214 256+0 records in 00:05:10.214 256+0 records out 00:05:10.214 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114383 s, 91.7 MB/s 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:10.214 256+0 records in 00:05:10.214 256+0 records out 00:05:10.214 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201978 s, 51.9 MB/s 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:10.214 256+0 records in 00:05:10.214 256+0 records out 00:05:10.214 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214002 s, 49.0 MB/s 00:05:10.214 11:28:39 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@51 -- # local i 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:10.215 11:28:39 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@41 -- # break 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@45 -- # return 0 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:10.473 11:28:39 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:10.732 11:28:39 -- bdev/nbd_common.sh@41 -- # break 00:05:10.732 11:28:39 -- bdev/nbd_common.sh@45 -- # return 0 00:05:10.732 11:28:39 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:10.732 11:28:39 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:10.732 11:28:39 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@65 -- # true 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@65 -- # count=0 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@104 -- # count=0 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:10.732 11:28:40 -- bdev/nbd_common.sh@109 -- # return 0 00:05:10.732 11:28:40 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:10.992 11:28:40 -- event/event.sh@35 -- # sleep 3 00:05:11.251 [2024-07-21 11:28:40.498133] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:11.251 [2024-07-21 11:28:40.531088] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:11.251 [2024-07-21 11:28:40.531090] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:11.251 [2024-07-21 11:28:40.571894] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:11.251 [2024-07-21 11:28:40.571939] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
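Teardown below goes through killprocess, the same helper already seen after the scheduler test. Reconstructed from the xtrace; the branch taken when ps reports the pid as a sudo wrapper is never exercised in this log, so it is reduced to a bail-out here.

    killprocess() {
      [ -n "$1" ] || return 1
      kill -0 "$1" || return 1                     # pid already gone: nothing to do
      local process_name
      [ "$(uname)" = Linux ] && process_name=$(ps --no-headers -o comm= "$1")
      [ "$process_name" = sudo ] && return 1       # the real helper handles sudo; elided here
      echo "killing process with pid $1"
      kill "$1"
      wait "$1"                                    # reap it, so the next test starts clean
    }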
00:05:14.538 11:28:43 -- event/event.sh@38 -- # waitforlisten 2040469 /var/tmp/spdk-nbd.sock 00:05:14.538 11:28:43 -- common/autotest_common.sh@819 -- # '[' -z 2040469 ']' 00:05:14.538 11:28:43 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:14.538 11:28:43 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:14.538 11:28:43 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:14.538 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:14.538 11:28:43 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:14.538 11:28:43 -- common/autotest_common.sh@10 -- # set +x 00:05:14.538 11:28:43 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:14.538 11:28:43 -- common/autotest_common.sh@852 -- # return 0 00:05:14.538 11:28:43 -- event/event.sh@39 -- # killprocess 2040469 00:05:14.538 11:28:43 -- common/autotest_common.sh@926 -- # '[' -z 2040469 ']' 00:05:14.538 11:28:43 -- common/autotest_common.sh@930 -- # kill -0 2040469 00:05:14.538 11:28:43 -- common/autotest_common.sh@931 -- # uname 00:05:14.538 11:28:43 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:14.538 11:28:43 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2040469 00:05:14.538 11:28:43 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:14.538 11:28:43 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:14.538 11:28:43 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2040469' 00:05:14.538 killing process with pid 2040469 00:05:14.538 11:28:43 -- common/autotest_common.sh@945 -- # kill 2040469 00:05:14.538 11:28:43 -- common/autotest_common.sh@950 -- # wait 2040469 00:05:14.538 spdk_app_start is called in Round 0. 00:05:14.538 Shutdown signal received, stop current app iteration 00:05:14.538 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:14.538 spdk_app_start is called in Round 1. 00:05:14.538 Shutdown signal received, stop current app iteration 00:05:14.538 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:14.538 spdk_app_start is called in Round 2. 00:05:14.538 Shutdown signal received, stop current app iteration 00:05:14.538 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:14.538 spdk_app_start is called in Round 3. 
00:05:14.538 Shutdown signal received, stop current app iteration 00:05:14.538 11:28:43 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:14.538 11:28:43 -- event/event.sh@42 -- # return 0 00:05:14.538 00:05:14.538 real 0m16.146s 00:05:14.538 user 0m34.234s 00:05:14.538 sys 0m3.173s 00:05:14.538 11:28:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:14.538 11:28:43 -- common/autotest_common.sh@10 -- # set +x 00:05:14.538 ************************************ 00:05:14.538 END TEST app_repeat 00:05:14.538 ************************************ 00:05:14.538 11:28:43 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:14.538 11:28:43 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:14.538 11:28:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:14.538 11:28:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:14.538 11:28:43 -- common/autotest_common.sh@10 -- # set +x 00:05:14.538 ************************************ 00:05:14.538 START TEST cpu_locks 00:05:14.538 ************************************ 00:05:14.538 11:28:43 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:14.538 * Looking for test storage... 00:05:14.538 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:14.538 11:28:43 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:14.538 11:28:43 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:14.538 11:28:43 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:14.538 11:28:43 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:14.538 11:28:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:14.538 11:28:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:14.538 11:28:43 -- common/autotest_common.sh@10 -- # set +x 00:05:14.538 ************************************ 00:05:14.538 START TEST default_locks 00:05:14.538 ************************************ 00:05:14.538 11:28:43 -- common/autotest_common.sh@1104 -- # default_locks 00:05:14.538 11:28:43 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=2043407 00:05:14.538 11:28:43 -- event/cpu_locks.sh@47 -- # waitforlisten 2043407 00:05:14.538 11:28:43 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:14.538 11:28:43 -- common/autotest_common.sh@819 -- # '[' -z 2043407 ']' 00:05:14.538 11:28:43 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:14.538 11:28:43 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:14.538 11:28:43 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:14.538 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:14.538 11:28:43 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:14.538 11:28:43 -- common/autotest_common.sh@10 -- # set +x 00:05:14.538 [2024-07-21 11:28:43.886727] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:14.538 [2024-07-21 11:28:43.886823] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2043407 ] 00:05:14.538 EAL: No free 2048 kB hugepages reported on node 1 00:05:14.538 [2024-07-21 11:28:43.955671] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:14.796 [2024-07-21 11:28:43.992675] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:14.796 [2024-07-21 11:28:43.992800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:15.361 11:28:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:15.361 11:28:44 -- common/autotest_common.sh@852 -- # return 0 00:05:15.361 11:28:44 -- event/cpu_locks.sh@49 -- # locks_exist 2043407 00:05:15.361 11:28:44 -- event/cpu_locks.sh@22 -- # lslocks -p 2043407 00:05:15.361 11:28:44 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:15.926 lslocks: write error 00:05:15.926 11:28:45 -- event/cpu_locks.sh@50 -- # killprocess 2043407 00:05:15.926 11:28:45 -- common/autotest_common.sh@926 -- # '[' -z 2043407 ']' 00:05:15.926 11:28:45 -- common/autotest_common.sh@930 -- # kill -0 2043407 00:05:15.926 11:28:45 -- common/autotest_common.sh@931 -- # uname 00:05:15.926 11:28:45 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:15.926 11:28:45 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2043407 00:05:16.185 11:28:45 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:16.185 11:28:45 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:16.185 11:28:45 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2043407' 00:05:16.185 killing process with pid 2043407 00:05:16.185 11:28:45 -- common/autotest_common.sh@945 -- # kill 2043407 00:05:16.185 11:28:45 -- common/autotest_common.sh@950 -- # wait 2043407 00:05:16.444 11:28:45 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 2043407 00:05:16.444 11:28:45 -- common/autotest_common.sh@640 -- # local es=0 00:05:16.444 11:28:45 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2043407 00:05:16.444 11:28:45 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:16.444 11:28:45 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:16.444 11:28:45 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:16.444 11:28:45 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:16.444 11:28:45 -- common/autotest_common.sh@643 -- # waitforlisten 2043407 00:05:16.444 11:28:45 -- common/autotest_common.sh@819 -- # '[' -z 2043407 ']' 00:05:16.444 11:28:45 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:16.444 11:28:45 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:16.444 11:28:45 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:16.445 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
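The locks_exist check above boils down to reading the kernel's advisory-lock table: a target started with -m 0x1 should hold a lock on a per-core spdk_cpu_lock file, and the stray "lslocks: write error" is harmless stderr noise that does not affect the grep result. A minimal recreation of the check, assuming the same lock-file naming seen in the trace:

  pid=$(pgrep -f spdk_tgt)
  lslocks -p "$pid" | grep -q spdk_cpu_lock && echo "core lock held by $pid"

The waitforlisten now in progress is the negative half of default_locks: the pid was just killed, so the wait must fail, and the NOT wrapper converts that failure into a passing assertion.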
00:05:16.445 11:28:45 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:16.445 11:28:45 -- common/autotest_common.sh@10 -- # set +x 00:05:16.445 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2043407) - No such process 00:05:16.445 ERROR: process (pid: 2043407) is no longer running 00:05:16.445 11:28:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:16.445 11:28:45 -- common/autotest_common.sh@852 -- # return 1 00:05:16.445 11:28:45 -- common/autotest_common.sh@643 -- # es=1 00:05:16.445 11:28:45 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:16.445 11:28:45 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:16.445 11:28:45 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:16.445 11:28:45 -- event/cpu_locks.sh@54 -- # no_locks 00:05:16.445 11:28:45 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:16.445 11:28:45 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:16.445 11:28:45 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:16.445 00:05:16.445 real 0m1.834s 00:05:16.445 user 0m1.925s 00:05:16.445 sys 0m0.716s 00:05:16.445 11:28:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:16.445 11:28:45 -- common/autotest_common.sh@10 -- # set +x 00:05:16.445 ************************************ 00:05:16.445 END TEST default_locks 00:05:16.445 ************************************ 00:05:16.445 11:28:45 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:16.445 11:28:45 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:16.445 11:28:45 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:16.445 11:28:45 -- common/autotest_common.sh@10 -- # set +x 00:05:16.445 ************************************ 00:05:16.445 START TEST default_locks_via_rpc 00:05:16.445 ************************************ 00:05:16.445 11:28:45 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:05:16.445 11:28:45 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:16.445 11:28:45 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=2043797 00:05:16.445 11:28:45 -- event/cpu_locks.sh@63 -- # waitforlisten 2043797 00:05:16.445 11:28:45 -- common/autotest_common.sh@819 -- # '[' -z 2043797 ']' 00:05:16.445 11:28:45 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:16.445 11:28:45 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:16.445 11:28:45 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:16.445 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:16.445 11:28:45 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:16.445 11:28:45 -- common/autotest_common.sh@10 -- # set +x 00:05:16.445 [2024-07-21 11:28:45.764608] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
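The target coming up here belongs to default_locks_via_rpc, which exercises the same lock lifecycle over JSON-RPC instead of process start/stop: framework_disable_cpumask_locks releases the per-core locks and framework_enable_cpumask_locks re-acquires them before the lslocks check runs. Driven directly with SPDK's bundled rpc.py it would look roughly like this (a sketch; the trace goes through the rpc_cmd wrapper):

  ./scripts/rpc.py -s /var/tmp/spdk.sock framework_disable_cpumask_locks   # drop the per-core file locks
  ./scripts/rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks    # take them back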
00:05:16.445 [2024-07-21 11:28:45.764682] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2043797 ] 00:05:16.445 EAL: No free 2048 kB hugepages reported on node 1 00:05:16.445 [2024-07-21 11:28:45.834153] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:16.704 [2024-07-21 11:28:45.871720] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:16.704 [2024-07-21 11:28:45.871840] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:17.270 11:28:46 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:17.270 11:28:46 -- common/autotest_common.sh@852 -- # return 0 00:05:17.270 11:28:46 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:17.270 11:28:46 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.270 11:28:46 -- common/autotest_common.sh@10 -- # set +x 00:05:17.270 11:28:46 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.270 11:28:46 -- event/cpu_locks.sh@67 -- # no_locks 00:05:17.270 11:28:46 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:17.270 11:28:46 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:17.270 11:28:46 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:17.270 11:28:46 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:17.270 11:28:46 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.270 11:28:46 -- common/autotest_common.sh@10 -- # set +x 00:05:17.270 11:28:46 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.270 11:28:46 -- event/cpu_locks.sh@71 -- # locks_exist 2043797 00:05:17.270 11:28:46 -- event/cpu_locks.sh@22 -- # lslocks -p 2043797 00:05:17.270 11:28:46 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:17.529 11:28:46 -- event/cpu_locks.sh@73 -- # killprocess 2043797 00:05:17.529 11:28:46 -- common/autotest_common.sh@926 -- # '[' -z 2043797 ']' 00:05:17.529 11:28:46 -- common/autotest_common.sh@930 -- # kill -0 2043797 00:05:17.529 11:28:46 -- common/autotest_common.sh@931 -- # uname 00:05:17.529 11:28:46 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:17.529 11:28:46 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2043797 00:05:17.529 11:28:46 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:17.529 11:28:46 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:17.529 11:28:46 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2043797' 00:05:17.529 killing process with pid 2043797 00:05:17.529 11:28:46 -- common/autotest_common.sh@945 -- # kill 2043797 00:05:17.529 11:28:46 -- common/autotest_common.sh@950 -- # wait 2043797 00:05:17.787 00:05:17.787 real 0m1.417s 00:05:17.787 user 0m1.476s 00:05:17.787 sys 0m0.479s 00:05:17.787 11:28:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.787 11:28:47 -- common/autotest_common.sh@10 -- # set +x 00:05:17.787 ************************************ 00:05:17.787 END TEST default_locks_via_rpc 00:05:17.787 ************************************ 00:05:17.787 11:28:47 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:17.787 11:28:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:17.787 11:28:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:17.787 11:28:47 -- 
common/autotest_common.sh@10 -- # set +x 00:05:18.047 ************************************ 00:05:18.047 START TEST non_locking_app_on_locked_coremask 00:05:18.047 ************************************ 00:05:18.047 11:28:47 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:05:18.047 11:28:47 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=2044063 00:05:18.047 11:28:47 -- event/cpu_locks.sh@81 -- # waitforlisten 2044063 /var/tmp/spdk.sock 00:05:18.047 11:28:47 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:18.047 11:28:47 -- common/autotest_common.sh@819 -- # '[' -z 2044063 ']' 00:05:18.047 11:28:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:18.047 11:28:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:18.047 11:28:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:18.047 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:18.047 11:28:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:18.047 11:28:47 -- common/autotest_common.sh@10 -- # set +x 00:05:18.047 [2024-07-21 11:28:47.238689] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:18.047 [2024-07-21 11:28:47.238767] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044063 ] 00:05:18.047 EAL: No free 2048 kB hugepages reported on node 1 00:05:18.047 [2024-07-21 11:28:47.307786] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:18.047 [2024-07-21 11:28:47.345402] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:18.047 [2024-07-21 11:28:47.345523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:18.985 11:28:48 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:18.985 11:28:48 -- common/autotest_common.sh@852 -- # return 0 00:05:18.985 11:28:48 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:18.985 11:28:48 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=2044276 00:05:18.985 11:28:48 -- event/cpu_locks.sh@85 -- # waitforlisten 2044276 /var/tmp/spdk2.sock 00:05:18.985 11:28:48 -- common/autotest_common.sh@819 -- # '[' -z 2044276 ']' 00:05:18.985 11:28:48 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:18.985 11:28:48 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:18.985 11:28:48 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:18.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:18.985 11:28:48 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:18.985 11:28:48 -- common/autotest_common.sh@10 -- # set +x 00:05:18.985 [2024-07-21 11:28:48.064623] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
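The two launches above are the crux of non_locking_app_on_locked_coremask: a second target is started on the very core the first one has locked, but with --disable-cpumask-locks and its own RPC socket, so it never attempts the claim and both instances run side by side. The pair, sketched:

  spdk_tgt -m 0x1 &                                                 # first instance, holds the core 0 lock
  spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &  # same core, lock deliberately skipped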
00:05:18.985 [2024-07-21 11:28:48.064673] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044276 ] 00:05:18.985 EAL: No free 2048 kB hugepages reported on node 1 00:05:18.985 [2024-07-21 11:28:48.155969] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:18.985 [2024-07-21 11:28:48.155997] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:18.985 [2024-07-21 11:28:48.230104] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:18.985 [2024-07-21 11:28:48.230222] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:19.553 11:28:48 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:19.553 11:28:48 -- common/autotest_common.sh@852 -- # return 0 00:05:19.553 11:28:48 -- event/cpu_locks.sh@87 -- # locks_exist 2044063 00:05:19.553 11:28:48 -- event/cpu_locks.sh@22 -- # lslocks -p 2044063 00:05:19.553 11:28:48 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:20.123 lslocks: write error 00:05:20.123 11:28:49 -- event/cpu_locks.sh@89 -- # killprocess 2044063 00:05:20.123 11:28:49 -- common/autotest_common.sh@926 -- # '[' -z 2044063 ']' 00:05:20.123 11:28:49 -- common/autotest_common.sh@930 -- # kill -0 2044063 00:05:20.123 11:28:49 -- common/autotest_common.sh@931 -- # uname 00:05:20.123 11:28:49 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:20.123 11:28:49 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2044063 00:05:20.123 11:28:49 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:20.123 11:28:49 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:20.123 11:28:49 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2044063' 00:05:20.123 killing process with pid 2044063 00:05:20.123 11:28:49 -- common/autotest_common.sh@945 -- # kill 2044063 00:05:20.123 11:28:49 -- common/autotest_common.sh@950 -- # wait 2044063 00:05:21.061 11:28:50 -- event/cpu_locks.sh@90 -- # killprocess 2044276 00:05:21.061 11:28:50 -- common/autotest_common.sh@926 -- # '[' -z 2044276 ']' 00:05:21.061 11:28:50 -- common/autotest_common.sh@930 -- # kill -0 2044276 00:05:21.061 11:28:50 -- common/autotest_common.sh@931 -- # uname 00:05:21.061 11:28:50 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:21.061 11:28:50 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2044276 00:05:21.061 11:28:50 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:21.061 11:28:50 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:21.061 11:28:50 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2044276' 00:05:21.061 killing process with pid 2044276 00:05:21.061 11:28:50 -- common/autotest_common.sh@945 -- # kill 2044276 00:05:21.061 11:28:50 -- common/autotest_common.sh@950 -- # wait 2044276 00:05:21.061 00:05:21.061 real 0m3.244s 00:05:21.061 user 0m3.409s 00:05:21.061 sys 0m1.043s 00:05:21.061 11:28:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:21.061 11:28:50 -- common/autotest_common.sh@10 -- # set +x 00:05:21.061 ************************************ 00:05:21.061 END TEST non_locking_app_on_locked_coremask 00:05:21.061 ************************************ 00:05:21.321 11:28:50 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask 
locking_app_on_unlocked_coremask 00:05:21.321 11:28:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:21.321 11:28:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:21.321 11:28:50 -- common/autotest_common.sh@10 -- # set +x 00:05:21.321 ************************************ 00:05:21.321 START TEST locking_app_on_unlocked_coremask 00:05:21.321 ************************************ 00:05:21.321 11:28:50 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:05:21.321 11:28:50 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=2044743 00:05:21.321 11:28:50 -- event/cpu_locks.sh@99 -- # waitforlisten 2044743 /var/tmp/spdk.sock 00:05:21.321 11:28:50 -- common/autotest_common.sh@819 -- # '[' -z 2044743 ']' 00:05:21.321 11:28:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:21.321 11:28:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:21.321 11:28:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:21.321 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:21.321 11:28:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:21.321 11:28:50 -- common/autotest_common.sh@10 -- # set +x 00:05:21.321 11:28:50 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:21.321 [2024-07-21 11:28:50.524768] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:21.321 [2024-07-21 11:28:50.524836] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044743 ] 00:05:21.321 EAL: No free 2048 kB hugepages reported on node 1 00:05:21.321 [2024-07-21 11:28:50.592481] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:21.321 [2024-07-21 11:28:50.592508] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:21.321 [2024-07-21 11:28:50.630386] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:21.321 [2024-07-21 11:28:50.630513] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:22.259 11:28:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:22.259 11:28:51 -- common/autotest_common.sh@852 -- # return 0 00:05:22.259 11:28:51 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=2044854 00:05:22.259 11:28:51 -- event/cpu_locks.sh@103 -- # waitforlisten 2044854 /var/tmp/spdk2.sock 00:05:22.259 11:28:51 -- common/autotest_common.sh@819 -- # '[' -z 2044854 ']' 00:05:22.259 11:28:51 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:22.259 11:28:51 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:22.259 11:28:51 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:22.259 11:28:51 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:22.259 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
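locking_app_on_unlocked_coremask inverts the previous case: the first target (pid 2044743 above) runs with --disable-cpumask-locks, leaving core 0 unclaimed, and the second target now being launched runs with locking enabled and must be able to take the core 0 lock even though another app is already scheduled there. Sketch of the arrangement:

  spdk_tgt -m 0x1 --disable-cpumask-locks &   # leaves core 0 unclaimed
  spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock &    # expected to claim the core 0 lock successfully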
00:05:22.259 11:28:51 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:22.259 11:28:51 -- common/autotest_common.sh@10 -- # set +x 00:05:22.259 [2024-07-21 11:28:51.336954] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:22.260 [2024-07-21 11:28:51.337013] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044854 ] 00:05:22.260 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.260 [2024-07-21 11:28:51.423929] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:22.260 [2024-07-21 11:28:51.499524] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:22.260 [2024-07-21 11:28:51.499649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:22.828 11:28:52 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:22.828 11:28:52 -- common/autotest_common.sh@852 -- # return 0 00:05:22.828 11:28:52 -- event/cpu_locks.sh@105 -- # locks_exist 2044854 00:05:22.828 11:28:52 -- event/cpu_locks.sh@22 -- # lslocks -p 2044854 00:05:22.828 11:28:52 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:23.850 lslocks: write error 00:05:23.850 11:28:53 -- event/cpu_locks.sh@107 -- # killprocess 2044743 00:05:23.850 11:28:53 -- common/autotest_common.sh@926 -- # '[' -z 2044743 ']' 00:05:23.850 11:28:53 -- common/autotest_common.sh@930 -- # kill -0 2044743 00:05:23.850 11:28:53 -- common/autotest_common.sh@931 -- # uname 00:05:23.850 11:28:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:23.850 11:28:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2044743 00:05:24.108 11:28:53 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:24.108 11:28:53 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:24.108 11:28:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2044743' 00:05:24.108 killing process with pid 2044743 00:05:24.108 11:28:53 -- common/autotest_common.sh@945 -- # kill 2044743 00:05:24.108 11:28:53 -- common/autotest_common.sh@950 -- # wait 2044743 00:05:24.674 11:28:53 -- event/cpu_locks.sh@108 -- # killprocess 2044854 00:05:24.674 11:28:53 -- common/autotest_common.sh@926 -- # '[' -z 2044854 ']' 00:05:24.674 11:28:53 -- common/autotest_common.sh@930 -- # kill -0 2044854 00:05:24.674 11:28:53 -- common/autotest_common.sh@931 -- # uname 00:05:24.674 11:28:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:24.674 11:28:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2044854 00:05:24.674 11:28:53 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:24.674 11:28:53 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:24.674 11:28:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2044854' 00:05:24.674 killing process with pid 2044854 00:05:24.674 11:28:53 -- common/autotest_common.sh@945 -- # kill 2044854 00:05:24.674 11:28:53 -- common/autotest_common.sh@950 -- # wait 2044854 00:05:24.932 00:05:24.932 real 0m3.710s 00:05:24.932 user 0m3.913s 00:05:24.932 sys 0m1.255s 00:05:24.932 11:28:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:24.932 11:28:54 -- common/autotest_common.sh@10 -- # set +x 00:05:24.932 ************************************ 00:05:24.932 END TEST locking_app_on_unlocked_coremask 
00:05:24.932 ************************************ 00:05:24.932 11:28:54 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:24.932 11:28:54 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:24.932 11:28:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:24.932 11:28:54 -- common/autotest_common.sh@10 -- # set +x 00:05:24.932 ************************************ 00:05:24.932 START TEST locking_app_on_locked_coremask 00:05:24.932 ************************************ 00:05:24.932 11:28:54 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:05:24.932 11:28:54 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=2045434 00:05:24.932 11:28:54 -- event/cpu_locks.sh@116 -- # waitforlisten 2045434 /var/tmp/spdk.sock 00:05:24.932 11:28:54 -- common/autotest_common.sh@819 -- # '[' -z 2045434 ']' 00:05:24.932 11:28:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:24.932 11:28:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:24.932 11:28:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:24.932 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:24.932 11:28:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:24.932 11:28:54 -- common/autotest_common.sh@10 -- # set +x 00:05:24.932 11:28:54 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:24.932 [2024-07-21 11:28:54.277030] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:24.932 [2024-07-21 11:28:54.277099] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2045434 ] 00:05:24.932 EAL: No free 2048 kB hugepages reported on node 1 00:05:24.932 [2024-07-21 11:28:54.344379] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:25.191 [2024-07-21 11:28:54.381921] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:25.191 [2024-07-21 11:28:54.382039] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:25.757 11:28:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:25.757 11:28:55 -- common/autotest_common.sh@852 -- # return 0 00:05:25.757 11:28:55 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=2045570 00:05:25.757 11:28:55 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 2045570 /var/tmp/spdk2.sock 00:05:25.757 11:28:55 -- common/autotest_common.sh@640 -- # local es=0 00:05:25.757 11:28:55 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2045570 /var/tmp/spdk2.sock 00:05:25.757 11:28:55 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:25.757 11:28:55 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:25.757 11:28:55 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:25.757 11:28:55 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:25.757 11:28:55 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:25.757 11:28:55 -- common/autotest_common.sh@643 -- # waitforlisten 2045570 /var/tmp/spdk2.sock 00:05:25.757 11:28:55 -- 
common/autotest_common.sh@819 -- # '[' -z 2045570 ']' 00:05:25.757 11:28:55 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:25.757 11:28:55 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:25.757 11:28:55 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:25.757 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:25.757 11:28:55 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:25.757 11:28:55 -- common/autotest_common.sh@10 -- # set +x 00:05:25.757 [2024-07-21 11:28:55.097903] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:25.757 [2024-07-21 11:28:55.097961] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2045570 ] 00:05:25.757 EAL: No free 2048 kB hugepages reported on node 1 00:05:26.014 [2024-07-21 11:28:55.193199] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 2045434 has claimed it. 00:05:26.014 [2024-07-21 11:28:55.193236] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:26.579 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2045570) - No such process 00:05:26.579 ERROR: process (pid: 2045570) is no longer running 00:05:26.579 11:28:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:26.579 11:28:55 -- common/autotest_common.sh@852 -- # return 1 00:05:26.579 11:28:55 -- common/autotest_common.sh@643 -- # es=1 00:05:26.579 11:28:55 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:26.579 11:28:55 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:26.579 11:28:55 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:26.579 11:28:55 -- event/cpu_locks.sh@122 -- # locks_exist 2045434 00:05:26.579 11:28:55 -- event/cpu_locks.sh@22 -- # lslocks -p 2045434 00:05:26.579 11:28:55 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:26.579 lslocks: write error 00:05:26.579 11:28:55 -- event/cpu_locks.sh@124 -- # killprocess 2045434 00:05:26.579 11:28:55 -- common/autotest_common.sh@926 -- # '[' -z 2045434 ']' 00:05:26.579 11:28:55 -- common/autotest_common.sh@930 -- # kill -0 2045434 00:05:26.579 11:28:55 -- common/autotest_common.sh@931 -- # uname 00:05:26.579 11:28:56 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:26.837 11:28:56 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2045434 00:05:26.837 11:28:56 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:26.837 11:28:56 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:26.837 11:28:56 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2045434' 00:05:26.837 killing process with pid 2045434 00:05:26.837 11:28:56 -- common/autotest_common.sh@945 -- # kill 2045434 00:05:26.837 11:28:56 -- common/autotest_common.sh@950 -- # wait 2045434 00:05:27.094 00:05:27.094 real 0m2.087s 00:05:27.094 user 0m2.263s 00:05:27.094 sys 0m0.587s 00:05:27.094 11:28:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:27.094 11:28:56 -- common/autotest_common.sh@10 -- # set +x 00:05:27.094 ************************************ 00:05:27.094 END TEST locking_app_on_locked_coremask 00:05:27.094 
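This is the expected failure path for locking_app_on_locked_coremask: the second target cannot create the core 0 lock already claimed by pid 2045434, aborts with "Unable to acquire lock on assigned core mask", and the NOT wrapper turns the resulting dead-pid wait into a pass. The assertion shape, reduced to one line:

  NOT waitforlisten "$pid2" /var/tmp/spdk2.sock   # passes only because the second target exited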
************************************ 00:05:27.094 11:28:56 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:05:27.094 11:28:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:27.094 11:28:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:27.094 11:28:56 -- common/autotest_common.sh@10 -- # set +x 00:05:27.094 ************************************ 00:05:27.094 START TEST locking_overlapped_coremask 00:05:27.094 ************************************ 00:05:27.094 11:28:56 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:05:27.094 11:28:56 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=2045757 00:05:27.094 11:28:56 -- event/cpu_locks.sh@133 -- # waitforlisten 2045757 /var/tmp/spdk.sock 00:05:27.094 11:28:56 -- common/autotest_common.sh@819 -- # '[' -z 2045757 ']' 00:05:27.094 11:28:56 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:27.094 11:28:56 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:05:27.094 11:28:56 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:27.094 11:28:56 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:27.094 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:27.094 11:28:56 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:27.094 11:28:56 -- common/autotest_common.sh@10 -- # set +x 00:05:27.094 [2024-07-21 11:28:56.414524] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:27.094 [2024-07-21 11:28:56.414619] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2045757 ] 00:05:27.094 EAL: No free 2048 kB hugepages reported on node 1 00:05:27.094 [2024-07-21 11:28:56.483518] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:27.352 [2024-07-21 11:28:56.520355] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:27.352 [2024-07-21 11:28:56.520524] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:27.352 [2024-07-21 11:28:56.520618] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:27.352 [2024-07-21 11:28:56.520621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.919 11:28:57 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:27.919 11:28:57 -- common/autotest_common.sh@852 -- # return 0 00:05:27.920 11:28:57 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=2046014 00:05:27.920 11:28:57 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 2046014 /var/tmp/spdk2.sock 00:05:27.920 11:28:57 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:05:27.920 11:28:57 -- common/autotest_common.sh@640 -- # local es=0 00:05:27.920 11:28:57 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2046014 /var/tmp/spdk2.sock 00:05:27.920 11:28:57 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:27.920 11:28:57 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:27.920 11:28:57 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:27.920 11:28:57 -- common/autotest_common.sh@632 
-- # case "$(type -t "$arg")" in 00:05:27.920 11:28:57 -- common/autotest_common.sh@643 -- # waitforlisten 2046014 /var/tmp/spdk2.sock 00:05:27.920 11:28:57 -- common/autotest_common.sh@819 -- # '[' -z 2046014 ']' 00:05:27.920 11:28:57 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:27.920 11:28:57 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:27.920 11:28:57 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:27.920 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:27.920 11:28:57 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:27.920 11:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:27.920 [2024-07-21 11:28:57.248678] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:27.920 [2024-07-21 11:28:57.248753] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046014 ] 00:05:27.920 EAL: No free 2048 kB hugepages reported on node 1 00:05:28.178 [2024-07-21 11:28:57.345082] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2045757 has claimed it. 00:05:28.178 [2024-07-21 11:28:57.345124] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:28.759 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2046014) - No such process 00:05:28.759 ERROR: process (pid: 2046014) is no longer running 00:05:28.759 11:28:57 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:28.759 11:28:57 -- common/autotest_common.sh@852 -- # return 1 00:05:28.759 11:28:57 -- common/autotest_common.sh@643 -- # es=1 00:05:28.759 11:28:57 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:28.759 11:28:57 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:28.759 11:28:57 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:28.759 11:28:57 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:05:28.759 11:28:57 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:28.759 11:28:57 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:28.759 11:28:57 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:28.759 11:28:57 -- event/cpu_locks.sh@141 -- # killprocess 2045757 00:05:28.759 11:28:57 -- common/autotest_common.sh@926 -- # '[' -z 2045757 ']' 00:05:28.759 11:28:57 -- common/autotest_common.sh@930 -- # kill -0 2045757 00:05:28.759 11:28:57 -- common/autotest_common.sh@931 -- # uname 00:05:28.759 11:28:57 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:28.759 11:28:57 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2045757 00:05:28.759 11:28:57 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:28.759 11:28:57 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:28.759 11:28:57 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2045757' 00:05:28.759 killing process with pid 2045757 00:05:28.759 11:28:57 -- 
common/autotest_common.sh@945 -- # kill 2045757 00:05:28.759 11:28:57 -- common/autotest_common.sh@950 -- # wait 2045757 00:05:29.018 00:05:29.018 real 0m1.840s 00:05:29.018 user 0m5.261s 00:05:29.018 sys 0m0.421s 00:05:29.018 11:28:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.018 11:28:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.018 ************************************ 00:05:29.018 END TEST locking_overlapped_coremask 00:05:29.018 ************************************ 00:05:29.018 11:28:58 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:05:29.018 11:28:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.018 11:28:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.018 11:28:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.018 ************************************ 00:05:29.018 START TEST locking_overlapped_coremask_via_rpc 00:05:29.018 ************************************ 00:05:29.018 11:28:58 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:05:29.018 11:28:58 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=2046231 00:05:29.018 11:28:58 -- event/cpu_locks.sh@149 -- # waitforlisten 2046231 /var/tmp/spdk.sock 00:05:29.018 11:28:58 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:05:29.018 11:28:58 -- common/autotest_common.sh@819 -- # '[' -z 2046231 ']' 00:05:29.018 11:28:58 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:29.018 11:28:58 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:29.018 11:28:58 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:29.018 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:29.018 11:28:58 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:29.018 11:28:58 -- common/autotest_common.sh@10 -- # set +x 00:05:29.018 [2024-07-21 11:28:58.307055] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:29.018 [2024-07-21 11:28:58.307151] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046231 ] 00:05:29.018 EAL: No free 2048 kB hugepages reported on node 1 00:05:29.018 [2024-07-21 11:28:58.377141] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
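The collision in the locking_overlapped_coremask test just concluded is pure mask arithmetic: the first target claims the cores in -m 0x7, the second asks for -m 0x1c, and the one shared bit is core 2, exactly the core named in the "Cannot create lock on core 2" error above:

  0x07 = 0b00111  ->  cores 0,1,2   (first target, locks held)
  0x1c = 0b11100  ->  cores 2,3,4   (second target)
  0x07 & 0x1c = 0x04  ->  core 2 contested, so the second launch aborts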
00:05:29.018 [2024-07-21 11:28:58.377173] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:29.018 [2024-07-21 11:28:58.415660] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:29.018 [2024-07-21 11:28:58.415860] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:29.018 [2024-07-21 11:28:58.415937] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:29.018 [2024-07-21 11:28:58.415938] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.955 11:28:59 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:29.955 11:28:59 -- common/autotest_common.sh@852 -- # return 0 00:05:29.955 11:28:59 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=2046318 00:05:29.955 11:28:59 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:05:29.955 11:28:59 -- event/cpu_locks.sh@153 -- # waitforlisten 2046318 /var/tmp/spdk2.sock 00:05:29.955 11:28:59 -- common/autotest_common.sh@819 -- # '[' -z 2046318 ']' 00:05:29.955 11:28:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:29.955 11:28:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:29.955 11:28:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:29.955 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:29.955 11:28:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:29.955 11:28:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.955 [2024-07-21 11:28:59.122369] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:29.955 [2024-07-21 11:28:59.122434] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046318 ] 00:05:29.955 EAL: No free 2048 kB hugepages reported on node 1 00:05:29.955 [2024-07-21 11:28:59.215622] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:29.955 [2024-07-21 11:28:59.215652] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:29.956 [2024-07-21 11:28:59.288871] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:29.956 [2024-07-21 11:28:59.289047] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:29.956 [2024-07-21 11:28:59.292485] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:29.956 [2024-07-21 11:28:59.292486] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:05:30.522 11:28:59 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:30.522 11:28:59 -- common/autotest_common.sh@852 -- # return 0 00:05:30.522 11:28:59 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:05:30.522 11:28:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:30.522 11:28:59 -- common/autotest_common.sh@10 -- # set +x 00:05:30.780 11:28:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:30.780 11:28:59 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:30.780 11:28:59 -- common/autotest_common.sh@640 -- # local es=0 00:05:30.780 11:28:59 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:30.780 11:28:59 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:05:30.780 11:28:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:30.780 11:28:59 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:05:30.780 11:28:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:30.780 11:28:59 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:30.780 11:28:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:30.780 11:28:59 -- common/autotest_common.sh@10 -- # set +x 00:05:30.780 [2024-07-21 11:28:59.962503] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2046231 has claimed it. 00:05:30.780 request: 00:05:30.780 { 00:05:30.780 "method": "framework_enable_cpumask_locks", 00:05:30.780 "req_id": 1 00:05:30.780 } 00:05:30.780 Got JSON-RPC error response 00:05:30.780 response: 00:05:30.780 { 00:05:30.780 "code": -32603, 00:05:30.780 "message": "Failed to claim CPU core: 2" 00:05:30.780 } 00:05:30.780 11:28:59 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:05:30.780 11:28:59 -- common/autotest_common.sh@643 -- # es=1 00:05:30.780 11:28:59 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:30.780 11:28:59 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:30.780 11:28:59 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:30.780 11:28:59 -- event/cpu_locks.sh@158 -- # waitforlisten 2046231 /var/tmp/spdk.sock 00:05:30.780 11:28:59 -- common/autotest_common.sh@819 -- # '[' -z 2046231 ']' 00:05:30.780 11:28:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:30.780 11:28:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:30.780 11:28:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:30.780 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:30.780 11:28:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:30.780 11:28:59 -- common/autotest_common.sh@10 -- # set +x 00:05:30.780 11:29:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:30.780 11:29:00 -- common/autotest_common.sh@852 -- # return 0 00:05:30.780 11:29:00 -- event/cpu_locks.sh@159 -- # waitforlisten 2046318 /var/tmp/spdk2.sock 00:05:30.780 11:29:00 -- common/autotest_common.sh@819 -- # '[' -z 2046318 ']' 00:05:30.780 11:29:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:30.780 11:29:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:30.780 11:29:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:30.780 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:30.780 11:29:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:30.780 11:29:00 -- common/autotest_common.sh@10 -- # set +x 00:05:31.039 11:29:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:31.039 11:29:00 -- common/autotest_common.sh@852 -- # return 0 00:05:31.039 11:29:00 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:05:31.039 11:29:00 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:31.039 11:29:00 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:31.039 11:29:00 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:31.039 00:05:31.039 real 0m2.045s 00:05:31.039 user 0m0.799s 00:05:31.039 sys 0m0.183s 00:05:31.039 11:29:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:31.039 11:29:00 -- common/autotest_common.sh@10 -- # set +x 00:05:31.039 ************************************ 00:05:31.039 END TEST locking_overlapped_coremask_via_rpc 00:05:31.039 ************************************ 00:05:31.039 11:29:00 -- event/cpu_locks.sh@174 -- # cleanup 00:05:31.039 11:29:00 -- event/cpu_locks.sh@15 -- # [[ -z 2046231 ]] 00:05:31.039 11:29:00 -- event/cpu_locks.sh@15 -- # killprocess 2046231 00:05:31.039 11:29:00 -- common/autotest_common.sh@926 -- # '[' -z 2046231 ']' 00:05:31.039 11:29:00 -- common/autotest_common.sh@930 -- # kill -0 2046231 00:05:31.039 11:29:00 -- common/autotest_common.sh@931 -- # uname 00:05:31.039 11:29:00 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:31.039 11:29:00 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2046231 00:05:31.039 11:29:00 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:31.039 11:29:00 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:31.039 11:29:00 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2046231' 00:05:31.039 killing process with pid 2046231 00:05:31.039 11:29:00 -- common/autotest_common.sh@945 -- # kill 2046231 00:05:31.039 11:29:00 -- common/autotest_common.sh@950 -- # wait 2046231 00:05:31.604 11:29:00 -- event/cpu_locks.sh@16 -- # [[ -z 2046318 ]] 00:05:31.604 11:29:00 -- event/cpu_locks.sh@16 -- # killprocess 2046318 00:05:31.604 11:29:00 -- common/autotest_common.sh@926 -- # '[' -z 2046318 ']' 00:05:31.604 11:29:00 -- common/autotest_common.sh@930 -- # kill -0 2046318 00:05:31.604 11:29:00 -- common/autotest_common.sh@931 -- # uname 
00:05:31.604 11:29:00 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:31.604 11:29:00 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2046318 00:05:31.604 11:29:00 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:31.604 11:29:00 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:31.604 11:29:00 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2046318' 00:05:31.604 killing process with pid 2046318 00:05:31.604 11:29:00 -- common/autotest_common.sh@945 -- # kill 2046318 00:05:31.604 11:29:00 -- common/autotest_common.sh@950 -- # wait 2046318 00:05:31.863 11:29:01 -- event/cpu_locks.sh@18 -- # rm -f 00:05:31.863 11:29:01 -- event/cpu_locks.sh@1 -- # cleanup 00:05:31.863 11:29:01 -- event/cpu_locks.sh@15 -- # [[ -z 2046231 ]] 00:05:31.863 11:29:01 -- event/cpu_locks.sh@15 -- # killprocess 2046231 00:05:31.863 11:29:01 -- common/autotest_common.sh@926 -- # '[' -z 2046231 ']' 00:05:31.863 11:29:01 -- common/autotest_common.sh@930 -- # kill -0 2046231 00:05:31.863 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (2046231) - No such process 00:05:31.863 11:29:01 -- common/autotest_common.sh@953 -- # echo 'Process with pid 2046231 is not found' 00:05:31.863 Process with pid 2046231 is not found 00:05:31.863 11:29:01 -- event/cpu_locks.sh@16 -- # [[ -z 2046318 ]] 00:05:31.863 11:29:01 -- event/cpu_locks.sh@16 -- # killprocess 2046318 00:05:31.863 11:29:01 -- common/autotest_common.sh@926 -- # '[' -z 2046318 ']' 00:05:31.863 11:29:01 -- common/autotest_common.sh@930 -- # kill -0 2046318 00:05:31.863 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (2046318) - No such process 00:05:31.863 11:29:01 -- common/autotest_common.sh@953 -- # echo 'Process with pid 2046318 is not found' 00:05:31.863 Process with pid 2046318 is not found 00:05:31.863 11:29:01 -- event/cpu_locks.sh@18 -- # rm -f 00:05:31.863 00:05:31.863 real 0m17.338s 00:05:31.863 user 0m29.531s 00:05:31.863 sys 0m5.602s 00:05:31.863 11:29:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:31.863 11:29:01 -- common/autotest_common.sh@10 -- # set +x 00:05:31.863 ************************************ 00:05:31.863 END TEST cpu_locks 00:05:31.863 ************************************ 00:05:31.863 00:05:31.863 real 0m41.831s 00:05:31.863 user 1m18.026s 00:05:31.863 sys 0m9.695s 00:05:31.863 11:29:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:31.863 11:29:01 -- common/autotest_common.sh@10 -- # set +x 00:05:31.863 ************************************ 00:05:31.863 END TEST event 00:05:31.863 ************************************ 00:05:31.863 11:29:01 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:31.863 11:29:01 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:31.863 11:29:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:31.863 11:29:01 -- common/autotest_common.sh@10 -- # set +x 00:05:31.863 ************************************ 00:05:31.863 START TEST thread 00:05:31.863 ************************************ 00:05:31.863 11:29:01 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:31.863 * Looking for test storage... 
00:05:31.863 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:05:31.863 11:29:01 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:31.863 11:29:01 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:05:31.863 11:29:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:31.863 11:29:01 -- common/autotest_common.sh@10 -- # set +x 00:05:31.863 ************************************ 00:05:31.863 START TEST thread_poller_perf 00:05:31.863 ************************************ 00:05:31.864 11:29:01 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:32.122 [2024-07-21 11:29:01.297686] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:32.122 [2024-07-21 11:29:01.297786] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046828 ] 00:05:32.122 EAL: No free 2048 kB hugepages reported on node 1 00:05:32.122 [2024-07-21 11:29:01.369298] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:32.122 [2024-07-21 11:29:01.405817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:32.122 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:05:33.057 ====================================== 00:05:33.057 busy:2504568230 (cyc) 00:05:33.057 total_run_count: 807000 00:05:33.057 tsc_hz: 2500000000 (cyc) 00:05:33.057 ====================================== 00:05:33.057 poller_cost: 3103 (cyc), 1241 (nsec) 00:05:33.057 00:05:33.057 real 0m1.183s 00:05:33.057 user 0m1.085s 00:05:33.057 sys 0m0.094s 00:05:33.057 11:29:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:33.057 11:29:02 -- common/autotest_common.sh@10 -- # set +x 00:05:33.057 ************************************ 00:05:33.057 END TEST thread_poller_perf 00:05:33.057 ************************************ 00:05:33.315 11:29:02 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:33.315 11:29:02 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:05:33.315 11:29:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:33.315 11:29:02 -- common/autotest_common.sh@10 -- # set +x 00:05:33.315 ************************************ 00:05:33.315 START TEST thread_poller_perf 00:05:33.315 ************************************ 00:05:33.315 11:29:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:33.315 [2024-07-21 11:29:02.527270] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
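The poller_cost figures above follow directly from the counters: cycles per invocation is busy cycles divided by total_run_count, and the nanosecond figure divides that by the TSC rate (2.5 cycles per nanosecond at tsc_hz 2500000000):

  2504568230 cyc / 807000 runs ~= 3103 cyc per poller invocation
  3103 cyc / 2.5 cyc-per-ns    ~= 1241 nsec

The -l 1 flag gives each of the 1000 pollers a 1-microsecond period, so this run measures timed-poller overhead; the 0-period run below gives the active-poller baseline for comparison.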
00:05:33.315 [2024-07-21 11:29:02.527416] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046979 ] 00:05:33.315 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.315 [2024-07-21 11:29:02.601166] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.315 [2024-07-21 11:29:02.637264] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.315 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:05:34.691 ====================================== 00:05:34.691 busy:2501946948 (cyc) 00:05:34.691 total_run_count: 13785000 00:05:34.691 tsc_hz: 2500000000 (cyc) 00:05:34.691 ====================================== 00:05:34.691 poller_cost: 181 (cyc), 72 (nsec) 00:05:34.691 00:05:34.691 real 0m1.185s 00:05:34.691 user 0m1.087s 00:05:34.691 sys 0m0.094s 00:05:34.691 11:29:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:34.691 11:29:03 -- common/autotest_common.sh@10 -- # set +x 00:05:34.691 ************************************ 00:05:34.691 END TEST thread_poller_perf 00:05:34.691 ************************************ 00:05:34.691 11:29:03 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:05:34.691 11:29:03 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:34.691 11:29:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:34.691 11:29:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:34.691 11:29:03 -- common/autotest_common.sh@10 -- # set +x 00:05:34.691 ************************************ 00:05:34.691 START TEST thread_spdk_lock 00:05:34.691 ************************************ 00:05:34.691 11:29:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:34.691 [2024-07-21 11:29:03.754876] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
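Same arithmetic for the second run: with -l 0 the 1000 pollers are plain active pollers rather than timed ones, the run count rises to 13785000, and the per-invocation cost drops by more than an order of magnitude:

  2501946948 cyc / 13785000 runs ~= 181 cyc per invocation (~72 nsec at 2.5 GHz)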
00:05:34.691 [2024-07-21 11:29:03.754966] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047261 ] 00:05:34.691 EAL: No free 2048 kB hugepages reported on node 1 00:05:34.691 [2024-07-21 11:29:03.825189] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:34.691 [2024-07-21 11:29:03.860592] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:34.691 [2024-07-21 11:29:03.860596] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.950 [2024-07-21 11:29:04.353810] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:34.950 [2024-07-21 11:29:04.353846] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:05:34.950 [2024-07-21 11:29:04.353856] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x133de80 00:05:34.950 [2024-07-21 11:29:04.354650] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:34.950 [2024-07-21 11:29:04.354754] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:34.950 [2024-07-21 11:29:04.354772] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:35.209 Starting test contend 00:05:35.209 Worker Delay Wait us Hold us Total us 00:05:35.209 0 3 179676 187748 367425 00:05:35.209 1 5 94361 288001 382363 00:05:35.209 PASS test contend 00:05:35.209 Starting test hold_by_poller 00:05:35.209 PASS test hold_by_poller 00:05:35.209 Starting test hold_by_message 00:05:35.209 PASS test hold_by_message 00:05:35.209 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:05:35.209 100014 assertions passed 00:05:35.209 0 assertions failed 00:05:35.209 00:05:35.209 real 0m0.668s 00:05:35.209 user 0m1.068s 00:05:35.209 sys 0m0.091s 00:05:35.209 11:29:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.209 11:29:04 -- common/autotest_common.sh@10 -- # set +x 00:05:35.209 ************************************ 00:05:35.209 END TEST thread_spdk_lock 00:05:35.209 ************************************ 00:05:35.209 00:05:35.209 real 0m3.272s 00:05:35.209 user 0m3.317s 00:05:35.209 sys 0m0.473s 00:05:35.209 11:29:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.209 11:29:04 -- common/autotest_common.sh@10 -- # set +x 00:05:35.209 ************************************ 00:05:35.209 END TEST thread 00:05:35.209 ************************************ 00:05:35.209 11:29:04 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:35.209 11:29:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
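The contend table above is easiest to read as Total us being Wait us plus Hold us per worker: 179676 + 187748 = 367424, within one of the printed 367425 for worker 0, and 94361 + 288001 = 382362, within one of the printed 382363 for worker 1 (apparently rounding). Both workers spend a similar total time on the lock, but worker 0 splits it roughly evenly between waiting and holding while worker 1 spends most of its time holding.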
00:05:35.209 11:29:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.209 11:29:04 -- common/autotest_common.sh@10 -- # set +x 00:05:35.209 ************************************ 00:05:35.209 START TEST accel 00:05:35.209 ************************************ 00:05:35.209 11:29:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:35.209 * Looking for test storage... 00:05:35.209 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:05:35.209 11:29:04 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:05:35.209 11:29:04 -- accel/accel.sh@74 -- # get_expected_opcs 00:05:35.209 11:29:04 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:35.209 11:29:04 -- accel/accel.sh@59 -- # spdk_tgt_pid=2047582 00:05:35.209 11:29:04 -- accel/accel.sh@60 -- # waitforlisten 2047582 00:05:35.209 11:29:04 -- common/autotest_common.sh@819 -- # '[' -z 2047582 ']' 00:05:35.209 11:29:04 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.209 11:29:04 -- accel/accel.sh@58 -- # build_accel_config 00:05:35.209 11:29:04 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:35.209 11:29:04 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:05:35.209 11:29:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:35.209 11:29:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:35.209 11:29:04 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.209 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:35.209 11:29:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:35.209 11:29:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:35.209 11:29:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:35.209 11:29:04 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:35.209 11:29:04 -- accel/accel.sh@41 -- # local IFS=, 00:05:35.209 11:29:04 -- accel/accel.sh@42 -- # jq -r . 00:05:35.209 11:29:04 -- common/autotest_common.sh@10 -- # set +x 00:05:35.209 [2024-07-21 11:29:04.630402] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:35.209 [2024-07-21 11:29:04.630519] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047582 ] 00:05:35.468 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.468 [2024-07-21 11:29:04.699712] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.468 [2024-07-21 11:29:04.735603] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:35.468 [2024-07-21 11:29:04.735724] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.048 11:29:05 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:36.048 11:29:05 -- common/autotest_common.sh@852 -- # return 0 00:05:36.048 11:29:05 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:05:36.048 11:29:05 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:05:36.048 11:29:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.048 11:29:05 -- common/autotest_common.sh@10 -- # set +x 00:05:36.048 11:29:05 -- accel/accel.sh@62 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:05:36.048 11:29:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.048 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.048 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.048 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.048 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.048 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.048 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.048 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.048 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.048 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.048 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.306 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.306 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.306 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.306 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.306 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.306 11:29:05 
-- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.306 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.306 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.306 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.306 11:29:05 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # IFS== 00:05:36.306 11:29:05 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.307 11:29:05 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.307 11:29:05 -- accel/accel.sh@67 -- # killprocess 2047582 00:05:36.307 11:29:05 -- common/autotest_common.sh@926 -- # '[' -z 2047582 ']' 00:05:36.307 11:29:05 -- common/autotest_common.sh@930 -- # kill -0 2047582 00:05:36.307 11:29:05 -- common/autotest_common.sh@931 -- # uname 00:05:36.307 11:29:05 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:36.307 11:29:05 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2047582 00:05:36.307 11:29:05 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:36.307 11:29:05 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:36.307 11:29:05 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2047582' 00:05:36.307 killing process with pid 2047582 00:05:36.307 11:29:05 -- common/autotest_common.sh@945 -- # kill 2047582 00:05:36.307 11:29:05 -- common/autotest_common.sh@950 -- # wait 2047582 00:05:36.565 11:29:05 -- accel/accel.sh@68 -- # trap - ERR 00:05:36.565 11:29:05 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:05:36.565 11:29:05 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:05:36.565 11:29:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.565 11:29:05 -- common/autotest_common.sh@10 -- # set +x 00:05:36.565 11:29:05 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:05:36.565 11:29:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:05:36.565 11:29:05 -- accel/accel.sh@12 -- # build_accel_config 00:05:36.565 11:29:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:36.565 11:29:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:36.565 11:29:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:36.565 11:29:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:36.565 11:29:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:36.565 11:29:05 -- accel/accel.sh@41 -- # local IFS=, 00:05:36.565 11:29:05 -- accel/accel.sh@42 -- # jq -r . 
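The get_expected_opcs loop traced earlier in this stretch fills an associative array by splitting each opc=module line from the accel_get_opc_assignments RPC on '='. A standalone bash sketch of that parsing (the sample input is assumed, since the RPC's actual output is not echoed in this log):

  declare -A expected_opcs
  while IFS== read -r opc module; do
      expected_opcs["$opc"]=$module      # e.g. expected_opcs[copy]=software
  done < <(printf 'copy=software\nfill=software\ncrc32c=software\n')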
00:05:36.565 11:29:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.565 11:29:05 -- common/autotest_common.sh@10 -- # set +x 00:05:36.565 11:29:05 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:05:36.565 11:29:05 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:05:36.565 11:29:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.565 11:29:05 -- common/autotest_common.sh@10 -- # set +x 00:05:36.565 ************************************ 00:05:36.565 START TEST accel_missing_filename 00:05:36.565 ************************************ 00:05:36.565 11:29:05 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:05:36.565 11:29:05 -- common/autotest_common.sh@640 -- # local es=0 00:05:36.565 11:29:05 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:05:36.565 11:29:05 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:05:36.565 11:29:05 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:36.565 11:29:05 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:05:36.565 11:29:05 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:36.565 11:29:05 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:05:36.565 11:29:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:05:36.565 11:29:05 -- accel/accel.sh@12 -- # build_accel_config 00:05:36.565 11:29:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:36.565 11:29:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:36.565 11:29:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:36.565 11:29:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:36.565 11:29:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:36.565 11:29:05 -- accel/accel.sh@41 -- # local IFS=, 00:05:36.565 11:29:05 -- accel/accel.sh@42 -- # jq -r . 00:05:36.565 [2024-07-21 11:29:05.922282] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:36.565 [2024-07-21 11:29:05.922391] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047764 ] 00:05:36.565 EAL: No free 2048 kB hugepages reported on node 1 00:05:36.824 [2024-07-21 11:29:05.993742] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:36.824 [2024-07-21 11:29:06.029810] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.824 [2024-07-21 11:29:06.069549] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:36.824 [2024-07-21 11:29:06.129794] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:05:36.824 A filename is required. 
00:05:36.824 11:29:06 -- common/autotest_common.sh@643 -- # es=234 00:05:36.824 11:29:06 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:36.824 11:29:06 -- common/autotest_common.sh@652 -- # es=106 00:05:36.824 11:29:06 -- common/autotest_common.sh@653 -- # case "$es" in 00:05:36.824 11:29:06 -- common/autotest_common.sh@660 -- # es=1 00:05:36.824 11:29:06 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:36.824 00:05:36.824 real 0m0.290s 00:05:36.824 user 0m0.189s 00:05:36.824 sys 0m0.137s 00:05:36.824 11:29:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.824 11:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:36.824 ************************************ 00:05:36.824 END TEST accel_missing_filename 00:05:36.824 ************************************ 00:05:36.824 11:29:06 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:36.824 11:29:06 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:05:36.824 11:29:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.824 11:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:36.824 ************************************ 00:05:36.824 START TEST accel_compress_verify 00:05:36.824 ************************************ 00:05:36.824 11:29:06 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:36.824 11:29:06 -- common/autotest_common.sh@640 -- # local es=0 00:05:36.824 11:29:06 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:36.824 11:29:06 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:05:36.824 11:29:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:36.824 11:29:06 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:05:36.824 11:29:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:36.824 11:29:06 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:36.824 11:29:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:36.824 11:29:06 -- accel/accel.sh@12 -- # build_accel_config 00:05:36.824 11:29:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:36.824 11:29:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:36.824 11:29:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:36.824 11:29:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:36.824 11:29:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:36.824 11:29:06 -- accel/accel.sh@41 -- # local IFS=, 00:05:36.824 11:29:06 -- accel/accel.sh@42 -- # jq -r . 00:05:37.083 [2024-07-21 11:29:06.262481] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
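The es bookkeeping above (es=234 after the missing-filename failure) is how the NOT wrapper decides a failure was the expected kind: statuses above 128 have 128 subtracted (234 becomes 106), a case statement then collapses recognized codes to 1, and NOT inverts the result so the test passes precisely because accel_perf failed. A hedged reconstruction of that flow (the real case labels live in autotest_common.sh and are not visible in this log):

  es=234
  (( es > 128 )) && es=$(( es - 128 ))   # strip the >128 offset: 234 -> 106
  case "$es" in
      106) es=1 ;;                       # assumed label; the trace only shows es ending up as 1
  esac
  (( !es == 0 ))                         # NOT() succeeds because es is non-zero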
00:05:37.083 [2024-07-21 11:29:06.262581] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047901 ] 00:05:37.083 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.083 [2024-07-21 11:29:06.335236] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.083 [2024-07-21 11:29:06.370869] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:37.083 [2024-07-21 11:29:06.410372] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:37.083 [2024-07-21 11:29:06.470194] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:05:37.342 00:05:37.342 Compression does not support the verify option, aborting. 00:05:37.342 11:29:06 -- common/autotest_common.sh@643 -- # es=161 00:05:37.342 11:29:06 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:37.342 11:29:06 -- common/autotest_common.sh@652 -- # es=33 00:05:37.342 11:29:06 -- common/autotest_common.sh@653 -- # case "$es" in 00:05:37.342 11:29:06 -- common/autotest_common.sh@660 -- # es=1 00:05:37.342 11:29:06 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:37.342 00:05:37.342 real 0m0.293s 00:05:37.342 user 0m0.194s 00:05:37.342 sys 0m0.138s 00:05:37.342 11:29:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.342 11:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:37.342 ************************************ 00:05:37.342 END TEST accel_compress_verify 00:05:37.342 ************************************ 00:05:37.342 11:29:06 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:05:37.342 11:29:06 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:05:37.342 11:29:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.342 11:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:37.342 ************************************ 00:05:37.342 START TEST accel_wrong_workload 00:05:37.342 ************************************ 00:05:37.342 11:29:06 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:05:37.342 11:29:06 -- common/autotest_common.sh@640 -- # local es=0 00:05:37.342 11:29:06 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:05:37.342 11:29:06 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:05:37.342 11:29:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:37.342 11:29:06 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:05:37.342 11:29:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:37.342 11:29:06 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:05:37.342 11:29:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:05:37.342 11:29:06 -- accel/accel.sh@12 -- # build_accel_config 00:05:37.343 11:29:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:37.343 11:29:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:37.343 11:29:06 -- accel/accel.sh@41 -- # local IFS=, 00:05:37.343 11:29:06 -- accel/accel.sh@42 -- # jq -r . 
00:05:37.343 Unsupported workload type: foobar 00:05:37.343 [2024-07-21 11:29:06.599617] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:05:37.343 accel_perf options: 00:05:37.343 [-h help message] 00:05:37.343 [-q queue depth per core] 00:05:37.343 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:37.343 [-T number of threads per core 00:05:37.343 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:37.343 [-t time in seconds] 00:05:37.343 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:37.343 [ dif_verify, , dif_generate, dif_generate_copy 00:05:37.343 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:37.343 [-l for compress/decompress workloads, name of uncompressed input file 00:05:37.343 [-S for crc32c workload, use this seed value (default 0) 00:05:37.343 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:37.343 [-f for fill workload, use this BYTE value (default 255) 00:05:37.343 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:37.343 [-y verify result if this switch is on] 00:05:37.343 [-a tasks to allocate per core (default: same value as -q)] 00:05:37.343 Can be used to spread operations across a wider range of memory. 00:05:37.343 11:29:06 -- common/autotest_common.sh@643 -- # es=1 00:05:37.343 11:29:06 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:37.343 11:29:06 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:37.343 11:29:06 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:37.343 00:05:37.343 real 0m0.029s 00:05:37.343 user 0m0.014s 00:05:37.343 sys 0m0.015s 00:05:37.343 11:29:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.343 11:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:37.343 ************************************ 00:05:37.343 END TEST accel_wrong_workload 00:05:37.343 ************************************ 00:05:37.343 Error: writing output failed: Broken pipe 00:05:37.343 11:29:06 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:05:37.343 11:29:06 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:05:37.343 11:29:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.343 11:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:37.343 ************************************ 00:05:37.343 START TEST accel_negative_buffers 00:05:37.343 ************************************ 00:05:37.343 11:29:06 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:05:37.343 11:29:06 -- common/autotest_common.sh@640 -- # local es=0 00:05:37.343 11:29:06 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:05:37.343 11:29:06 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:05:37.343 11:29:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:37.343 11:29:06 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:05:37.343 11:29:06 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:37.343 11:29:06 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:05:37.343 11:29:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:05:37.343 11:29:06 -- accel/accel.sh@12 -- # build_accel_config 00:05:37.343 11:29:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:37.343 11:29:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:37.343 11:29:06 -- accel/accel.sh@41 -- # local IFS=, 00:05:37.343 11:29:06 -- accel/accel.sh@42 -- # jq -r . 00:05:37.343 -x option must be non-negative. 00:05:37.343 [2024-07-21 11:29:06.674141] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:05:37.343 accel_perf options: 00:05:37.343 [-h help message] 00:05:37.343 [-q queue depth per core] 00:05:37.343 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:37.343 [-T number of threads per core 00:05:37.343 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:37.343 [-t time in seconds] 00:05:37.343 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:37.343 [ dif_verify, , dif_generate, dif_generate_copy 00:05:37.343 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:37.343 [-l for compress/decompress workloads, name of uncompressed input file 00:05:37.343 [-S for crc32c workload, use this seed value (default 0) 00:05:37.343 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:37.343 [-f for fill workload, use this BYTE value (default 255) 00:05:37.343 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:37.343 [-y verify result if this switch is on] 00:05:37.343 [-a tasks to allocate per core (default: same value as -q)] 00:05:37.343 Can be used to spread operations across a wider range of memory. 
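Both negative tests above fail inside accel_perf's option parser, which is what prints this usage text twice. For contrast, the invocations that pass elsewhere in this log stick to the listed workloads and feed the JSON accel config over -c, e.g.:

  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y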
00:05:37.343 11:29:06 -- common/autotest_common.sh@643 -- # es=1 00:05:37.343 11:29:06 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:37.343 11:29:06 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:37.343 11:29:06 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:37.343 00:05:37.343 real 0m0.028s 00:05:37.343 user 0m0.021s 00:05:37.343 sys 0m0.008s 00:05:37.343 11:29:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.343 11:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:37.343 ************************************ 00:05:37.343 END TEST accel_negative_buffers 00:05:37.343 ************************************ 00:05:37.343 Error: writing output failed: Broken pipe 00:05:37.343 11:29:06 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:05:37.343 11:29:06 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:05:37.343 11:29:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.343 11:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:37.343 ************************************ 00:05:37.343 START TEST accel_crc32c 00:05:37.343 ************************************ 00:05:37.343 11:29:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:05:37.343 11:29:06 -- accel/accel.sh@16 -- # local accel_opc 00:05:37.343 11:29:06 -- accel/accel.sh@17 -- # local accel_module 00:05:37.343 11:29:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:37.343 11:29:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:05:37.343 11:29:06 -- accel/accel.sh@12 -- # build_accel_config 00:05:37.343 11:29:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:37.343 11:29:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:37.343 11:29:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:37.343 11:29:06 -- accel/accel.sh@41 -- # local IFS=, 00:05:37.343 11:29:06 -- accel/accel.sh@42 -- # jq -r . 00:05:37.343 [2024-07-21 11:29:06.750924] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:37.343 [2024-07-21 11:29:06.751015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047970 ] 00:05:37.602 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.602 [2024-07-21 11:29:06.822314] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.602 [2024-07-21 11:29:06.860066] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.990 11:29:08 -- accel/accel.sh@18 -- # out=' 00:05:38.990 SPDK Configuration: 00:05:38.990 Core mask: 0x1 00:05:38.990 00:05:38.990 Accel Perf Configuration: 00:05:38.990 Workload Type: crc32c 00:05:38.990 CRC-32C seed: 32 00:05:38.990 Transfer size: 4096 bytes 00:05:38.990 Vector count 1 00:05:38.990 Module: software 00:05:38.990 Queue depth: 32 00:05:38.990 Allocate depth: 32 00:05:38.990 # threads/core: 1 00:05:38.990 Run time: 1 seconds 00:05:38.990 Verify: Yes 00:05:38.990 00:05:38.990 Running for 1 seconds... 
00:05:38.990 00:05:38.990 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:38.990 ------------------------------------------------------------------------------------ 00:05:38.990 0,0 845984/s 3304 MiB/s 0 0 00:05:38.990 ==================================================================================== 00:05:38.990 Total 845984/s 3304 MiB/s 0 0' 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:38.990 11:29:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:05:38.990 11:29:08 -- accel/accel.sh@12 -- # build_accel_config 00:05:38.990 11:29:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:38.990 11:29:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:38.990 11:29:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:38.990 11:29:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:38.990 11:29:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:38.990 11:29:08 -- accel/accel.sh@41 -- # local IFS=, 00:05:38.990 11:29:08 -- accel/accel.sh@42 -- # jq -r . 00:05:38.990 [2024-07-21 11:29:08.042139] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:38.990 [2024-07-21 11:29:08.042235] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2048236 ] 00:05:38.990 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.990 [2024-07-21 11:29:08.111715] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.990 [2024-07-21 11:29:08.146063] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val= 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val= 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val=0x1 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val= 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val= 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val=crc32c 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val=32 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 
11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val= 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val=software 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@23 -- # accel_module=software 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val=32 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val=32 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val=1 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val=Yes 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val= 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:38.990 11:29:08 -- accel/accel.sh@21 -- # val= 00:05:38.990 11:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:38.990 11:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:39.926 11:29:09 -- accel/accel.sh@21 -- # val= 00:05:39.926 11:29:09 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # IFS=: 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # read -r var val 00:05:39.926 11:29:09 -- accel/accel.sh@21 -- # val= 00:05:39.926 11:29:09 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # IFS=: 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # read -r var val 00:05:39.926 11:29:09 -- accel/accel.sh@21 -- # val= 00:05:39.926 11:29:09 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # IFS=: 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # read -r var val 00:05:39.926 11:29:09 -- accel/accel.sh@21 -- # val= 00:05:39.926 11:29:09 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # IFS=: 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # read -r var val 00:05:39.926 11:29:09 -- accel/accel.sh@21 -- # val= 00:05:39.926 11:29:09 -- accel/accel.sh@22 -- # case "$var" in 
00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # IFS=: 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # read -r var val 00:05:39.926 11:29:09 -- accel/accel.sh@21 -- # val= 00:05:39.926 11:29:09 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # IFS=: 00:05:39.926 11:29:09 -- accel/accel.sh@20 -- # read -r var val 00:05:39.926 11:29:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:39.926 11:29:09 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:05:39.926 11:29:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:39.926 00:05:39.926 real 0m2.585s 00:05:39.926 user 0m2.326s 00:05:39.926 sys 0m0.268s 00:05:39.926 11:29:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:39.926 11:29:09 -- common/autotest_common.sh@10 -- # set +x 00:05:39.926 ************************************ 00:05:39.926 END TEST accel_crc32c 00:05:39.926 ************************************ 00:05:40.185 11:29:09 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:05:40.185 11:29:09 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:05:40.185 11:29:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:40.185 11:29:09 -- common/autotest_common.sh@10 -- # set +x 00:05:40.185 ************************************ 00:05:40.185 START TEST accel_crc32c_C2 00:05:40.185 ************************************ 00:05:40.185 11:29:09 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:05:40.185 11:29:09 -- accel/accel.sh@16 -- # local accel_opc 00:05:40.185 11:29:09 -- accel/accel.sh@17 -- # local accel_module 00:05:40.185 11:29:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:05:40.185 11:29:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:05:40.185 11:29:09 -- accel/accel.sh@12 -- # build_accel_config 00:05:40.185 11:29:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:40.185 11:29:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:40.185 11:29:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:40.185 11:29:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:40.185 11:29:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:40.185 11:29:09 -- accel/accel.sh@41 -- # local IFS=, 00:05:40.185 11:29:09 -- accel/accel.sh@42 -- # jq -r . 00:05:40.185 [2024-07-21 11:29:09.383245] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
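The bandwidth column in these accel_perf summaries is transfers per second times the 4096-byte transfer size: for the crc32c run above, 845984/s * 4096 B works out to 845984/256 = 3304 MiB/s, exactly as printed. A one-line check:

  awk 'BEGIN { printf "%d MiB/s\n", 845984 * 4096 / (1024 * 1024) }'   # prints 3304

(For the -C 2 run that follows, each operation covers two 4 KiB buffers, which appears to be why its per-core row reports twice the MiB/s of its Total row.)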
00:05:40.185 [2024-07-21 11:29:09.383341] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2048519 ] 00:05:40.185 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.185 [2024-07-21 11:29:09.452829] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.185 [2024-07-21 11:29:09.488550] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.564 11:29:10 -- accel/accel.sh@18 -- # out=' 00:05:41.564 SPDK Configuration: 00:05:41.564 Core mask: 0x1 00:05:41.564 00:05:41.564 Accel Perf Configuration: 00:05:41.564 Workload Type: crc32c 00:05:41.564 CRC-32C seed: 0 00:05:41.564 Transfer size: 4096 bytes 00:05:41.564 Vector count 2 00:05:41.564 Module: software 00:05:41.564 Queue depth: 32 00:05:41.564 Allocate depth: 32 00:05:41.564 # threads/core: 1 00:05:41.564 Run time: 1 seconds 00:05:41.564 Verify: Yes 00:05:41.564 00:05:41.564 Running for 1 seconds... 00:05:41.564 00:05:41.564 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:41.564 ------------------------------------------------------------------------------------ 00:05:41.564 0,0 619008/s 4836 MiB/s 0 0 00:05:41.564 ==================================================================================== 00:05:41.564 Total 619008/s 2418 MiB/s 0 0' 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:05:41.564 11:29:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:05:41.564 11:29:10 -- accel/accel.sh@12 -- # build_accel_config 00:05:41.564 11:29:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:41.564 11:29:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:41.564 11:29:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:41.564 11:29:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:41.564 11:29:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:41.564 11:29:10 -- accel/accel.sh@41 -- # local IFS=, 00:05:41.564 11:29:10 -- accel/accel.sh@42 -- # jq -r . 00:05:41.564 [2024-07-21 11:29:10.671198] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:41.564 [2024-07-21 11:29:10.671290] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2048711 ] 00:05:41.564 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.564 [2024-07-21 11:29:10.740973] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.564 [2024-07-21 11:29:10.775632] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val= 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val= 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val=0x1 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val= 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val= 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val=crc32c 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val=0 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val= 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val=software 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@23 -- # accel_module=software 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val=32 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val=32 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- 
accel/accel.sh@21 -- # val=1 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val=Yes 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val= 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:41.564 11:29:10 -- accel/accel.sh@21 -- # val= 00:05:41.564 11:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:41.564 11:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:42.941 11:29:11 -- accel/accel.sh@21 -- # val= 00:05:42.941 11:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:42.941 11:29:11 -- accel/accel.sh@21 -- # val= 00:05:42.941 11:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:42.941 11:29:11 -- accel/accel.sh@21 -- # val= 00:05:42.941 11:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:42.941 11:29:11 -- accel/accel.sh@21 -- # val= 00:05:42.941 11:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:42.941 11:29:11 -- accel/accel.sh@21 -- # val= 00:05:42.941 11:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:42.941 11:29:11 -- accel/accel.sh@21 -- # val= 00:05:42.941 11:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:42.941 11:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:42.941 11:29:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:42.941 11:29:11 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:05:42.941 11:29:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:42.941 00:05:42.941 real 0m2.581s 00:05:42.941 user 0m2.330s 00:05:42.941 sys 0m0.260s 00:05:42.941 11:29:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.941 11:29:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.941 ************************************ 00:05:42.941 END TEST accel_crc32c_C2 00:05:42.941 ************************************ 00:05:42.941 11:29:11 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:05:42.941 11:29:11 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:05:42.941 11:29:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.941 11:29:11 -- common/autotest_common.sh@10 -- # set +x 00:05:42.941 ************************************ 00:05:42.941 START TEST accel_copy 
00:05:42.941 ************************************ 00:05:42.941 11:29:11 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:05:42.941 11:29:11 -- accel/accel.sh@16 -- # local accel_opc 00:05:42.941 11:29:11 -- accel/accel.sh@17 -- # local accel_module 00:05:42.941 11:29:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:05:42.942 11:29:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:05:42.942 11:29:11 -- accel/accel.sh@12 -- # build_accel_config 00:05:42.942 11:29:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:42.942 11:29:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:42.942 11:29:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:42.942 11:29:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:42.942 11:29:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:42.942 11:29:11 -- accel/accel.sh@41 -- # local IFS=, 00:05:42.942 11:29:11 -- accel/accel.sh@42 -- # jq -r . 00:05:42.942 [2024-07-21 11:29:12.014177] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:42.942 [2024-07-21 11:29:12.014268] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2048897 ] 00:05:42.942 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.942 [2024-07-21 11:29:12.084527] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.942 [2024-07-21 11:29:12.119863] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.878 11:29:13 -- accel/accel.sh@18 -- # out=' 00:05:43.878 SPDK Configuration: 00:05:43.878 Core mask: 0x1 00:05:43.878 00:05:43.878 Accel Perf Configuration: 00:05:43.878 Workload Type: copy 00:05:43.878 Transfer size: 4096 bytes 00:05:43.878 Vector count 1 00:05:43.878 Module: software 00:05:43.878 Queue depth: 32 00:05:43.878 Allocate depth: 32 00:05:43.878 # threads/core: 1 00:05:43.878 Run time: 1 seconds 00:05:43.878 Verify: Yes 00:05:43.878 00:05:43.878 Running for 1 seconds... 00:05:43.878 00:05:43.878 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:43.878 ------------------------------------------------------------------------------------ 00:05:43.878 0,0 557184/s 2176 MiB/s 0 0 00:05:43.878 ==================================================================================== 00:05:43.878 Total 557184/s 2176 MiB/s 0 0' 00:05:43.878 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:43.878 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:43.878 11:29:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:05:43.878 11:29:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:05:43.878 11:29:13 -- accel/accel.sh@12 -- # build_accel_config 00:05:43.878 11:29:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:43.878 11:29:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:43.878 11:29:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:43.878 11:29:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:43.878 11:29:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:43.878 11:29:13 -- accel/accel.sh@41 -- # local IFS=, 00:05:43.878 11:29:13 -- accel/accel.sh@42 -- # jq -r . 00:05:43.878 [2024-07-21 11:29:13.299843] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:43.878 [2024-07-21 11:29:13.299933] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2049095 ] 00:05:44.136 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.136 [2024-07-21 11:29:13.368507] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.136 [2024-07-21 11:29:13.403061] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.136 11:29:13 -- accel/accel.sh@21 -- # val= 00:05:44.136 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.136 11:29:13 -- accel/accel.sh@21 -- # val= 00:05:44.136 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.136 11:29:13 -- accel/accel.sh@21 -- # val=0x1 00:05:44.136 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.136 11:29:13 -- accel/accel.sh@21 -- # val= 00:05:44.136 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.136 11:29:13 -- accel/accel.sh@21 -- # val= 00:05:44.136 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.136 11:29:13 -- accel/accel.sh@21 -- # val=copy 00:05:44.136 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.136 11:29:13 -- accel/accel.sh@24 -- # accel_opc=copy 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.136 11:29:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:44.136 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.136 11:29:13 -- accel/accel.sh@21 -- # val= 00:05:44.136 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.136 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.137 11:29:13 -- accel/accel.sh@21 -- # val=software 00:05:44.137 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.137 11:29:13 -- accel/accel.sh@23 -- # accel_module=software 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.137 11:29:13 -- accel/accel.sh@21 -- # val=32 00:05:44.137 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.137 11:29:13 -- accel/accel.sh@21 -- # val=32 00:05:44.137 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.137 11:29:13 -- accel/accel.sh@21 -- # val=1 00:05:44.137 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.137 11:29:13 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:05:44.137 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.137 11:29:13 -- accel/accel.sh@21 -- # val=Yes 00:05:44.137 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.137 11:29:13 -- accel/accel.sh@21 -- # val= 00:05:44.137 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:44.137 11:29:13 -- accel/accel.sh@21 -- # val= 00:05:44.137 11:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:44.137 11:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:45.510 11:29:14 -- accel/accel.sh@21 -- # val= 00:05:45.510 11:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.510 11:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:45.510 11:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:45.510 11:29:14 -- accel/accel.sh@21 -- # val= 00:05:45.510 11:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.510 11:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:45.510 11:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:45.510 11:29:14 -- accel/accel.sh@21 -- # val= 00:05:45.511 11:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.511 11:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:45.511 11:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:45.511 11:29:14 -- accel/accel.sh@21 -- # val= 00:05:45.511 11:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.511 11:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:45.511 11:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:45.511 11:29:14 -- accel/accel.sh@21 -- # val= 00:05:45.511 11:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.511 11:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:45.511 11:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:45.511 11:29:14 -- accel/accel.sh@21 -- # val= 00:05:45.511 11:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.511 11:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:45.511 11:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:45.511 11:29:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:45.511 11:29:14 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:05:45.511 11:29:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:45.511 00:05:45.511 real 0m2.577s 00:05:45.511 user 0m2.327s 00:05:45.511 sys 0m0.257s 00:05:45.511 11:29:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:45.511 11:29:14 -- common/autotest_common.sh@10 -- # set +x 00:05:45.511 ************************************ 00:05:45.511 END TEST accel_copy 00:05:45.511 ************************************ 00:05:45.511 11:29:14 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:45.511 11:29:14 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:05:45.511 11:29:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:45.511 11:29:14 -- common/autotest_common.sh@10 -- # set +x 00:05:45.511 ************************************ 00:05:45.511 START TEST accel_fill 00:05:45.511 ************************************ 00:05:45.511 11:29:14 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:45.511 11:29:14 -- accel/accel.sh@16 -- # local accel_opc 
00:05:45.511 11:29:14 -- accel/accel.sh@17 -- # local accel_module 00:05:45.511 11:29:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:45.511 11:29:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:45.511 11:29:14 -- accel/accel.sh@12 -- # build_accel_config 00:05:45.511 11:29:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:45.511 11:29:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:45.511 11:29:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:45.511 11:29:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:45.511 11:29:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:45.511 11:29:14 -- accel/accel.sh@41 -- # local IFS=, 00:05:45.511 11:29:14 -- accel/accel.sh@42 -- # jq -r . 00:05:45.511 [2024-07-21 11:29:14.640205] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:45.511 [2024-07-21 11:29:14.640298] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2049387 ] 00:05:45.511 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.511 [2024-07-21 11:29:14.710674] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.511 [2024-07-21 11:29:14.745762] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.886 11:29:15 -- accel/accel.sh@18 -- # out=' 00:05:46.886 SPDK Configuration: 00:05:46.886 Core mask: 0x1 00:05:46.886 00:05:46.886 Accel Perf Configuration: 00:05:46.886 Workload Type: fill 00:05:46.886 Fill pattern: 0x80 00:05:46.886 Transfer size: 4096 bytes 00:05:46.886 Vector count 1 00:05:46.886 Module: software 00:05:46.886 Queue depth: 64 00:05:46.886 Allocate depth: 64 00:05:46.886 # threads/core: 1 00:05:46.886 Run time: 1 seconds 00:05:46.886 Verify: Yes 00:05:46.886 00:05:46.886 Running for 1 seconds... 00:05:46.886 00:05:46.886 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:46.886 ------------------------------------------------------------------------------------ 00:05:46.886 0,0 952448/s 3720 MiB/s 0 0 00:05:46.886 ==================================================================================== 00:05:46.886 Total 952448/s 3720 MiB/s 0 0' 00:05:46.886 11:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:46.886 11:29:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:46.886 11:29:15 -- accel/accel.sh@12 -- # build_accel_config 00:05:46.886 11:29:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:46.886 11:29:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:46.886 11:29:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:46.886 11:29:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:46.886 11:29:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:46.886 11:29:15 -- accel/accel.sh@41 -- # local IFS=, 00:05:46.886 11:29:15 -- accel/accel.sh@42 -- # jq -r . 00:05:46.886 [2024-07-21 11:29:15.925939] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:46.886 [2024-07-21 11:29:15.926029] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2049653 ] 00:05:46.886 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.886 [2024-07-21 11:29:15.993871] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.886 [2024-07-21 11:29:16.027838] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val= 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val= 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val=0x1 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val= 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val= 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val=fill 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@24 -- # accel_opc=fill 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val=0x80 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val= 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val=software 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@23 -- # accel_module=software 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val=64 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val=64 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- 
accel/accel.sh@21 -- # val=1 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val=Yes 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val= 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:46.886 11:29:16 -- accel/accel.sh@21 -- # val= 00:05:46.886 11:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:46.886 11:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:47.819 11:29:17 -- accel/accel.sh@21 -- # val= 00:05:47.820 11:29:17 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # IFS=: 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # read -r var val 00:05:47.820 11:29:17 -- accel/accel.sh@21 -- # val= 00:05:47.820 11:29:17 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # IFS=: 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # read -r var val 00:05:47.820 11:29:17 -- accel/accel.sh@21 -- # val= 00:05:47.820 11:29:17 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # IFS=: 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # read -r var val 00:05:47.820 11:29:17 -- accel/accel.sh@21 -- # val= 00:05:47.820 11:29:17 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # IFS=: 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # read -r var val 00:05:47.820 11:29:17 -- accel/accel.sh@21 -- # val= 00:05:47.820 11:29:17 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # IFS=: 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # read -r var val 00:05:47.820 11:29:17 -- accel/accel.sh@21 -- # val= 00:05:47.820 11:29:17 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # IFS=: 00:05:47.820 11:29:17 -- accel/accel.sh@20 -- # read -r var val 00:05:47.820 11:29:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:47.820 11:29:17 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:05:47.820 11:29:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:47.820 00:05:47.820 real 0m2.576s 00:05:47.820 user 0m2.335s 00:05:47.820 sys 0m0.249s 00:05:47.820 11:29:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.820 11:29:17 -- common/autotest_common.sh@10 -- # set +x 00:05:47.820 ************************************ 00:05:47.820 END TEST accel_fill 00:05:47.820 ************************************ 00:05:47.820 11:29:17 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:05:47.820 11:29:17 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:05:47.820 11:29:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:47.820 11:29:17 -- common/autotest_common.sh@10 -- # set +x 00:05:47.820 ************************************ 00:05:47.820 START TEST 
accel_copy_crc32c 00:05:47.820 ************************************ 00:05:47.820 11:29:17 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:05:47.820 11:29:17 -- accel/accel.sh@16 -- # local accel_opc 00:05:48.078 11:29:17 -- accel/accel.sh@17 -- # local accel_module 00:05:48.078 11:29:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:05:48.078 11:29:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:05:48.078 11:29:17 -- accel/accel.sh@12 -- # build_accel_config 00:05:48.078 11:29:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:48.078 11:29:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:48.078 11:29:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:48.078 11:29:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:48.078 11:29:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:48.078 11:29:17 -- accel/accel.sh@41 -- # local IFS=, 00:05:48.078 11:29:17 -- accel/accel.sh@42 -- # jq -r . 00:05:48.078 [2024-07-21 11:29:17.264894] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:48.078 [2024-07-21 11:29:17.264977] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2049934 ] 00:05:48.078 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.078 [2024-07-21 11:29:17.334439] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.078 [2024-07-21 11:29:17.369719] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.455 11:29:18 -- accel/accel.sh@18 -- # out=' 00:05:49.455 SPDK Configuration: 00:05:49.455 Core mask: 0x1 00:05:49.455 00:05:49.455 Accel Perf Configuration: 00:05:49.455 Workload Type: copy_crc32c 00:05:49.455 CRC-32C seed: 0 00:05:49.455 Vector size: 4096 bytes 00:05:49.455 Transfer size: 4096 bytes 00:05:49.455 Vector count 1 00:05:49.455 Module: software 00:05:49.455 Queue depth: 32 00:05:49.455 Allocate depth: 32 00:05:49.455 # threads/core: 1 00:05:49.455 Run time: 1 seconds 00:05:49.455 Verify: Yes 00:05:49.455 00:05:49.455 Running for 1 seconds... 00:05:49.455 00:05:49.455 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:49.455 ------------------------------------------------------------------------------------ 00:05:49.455 0,0 415744/s 1624 MiB/s 0 0 00:05:49.455 ==================================================================================== 00:05:49.455 Total 415744/s 1624 MiB/s 0 0' 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:05:49.455 11:29:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:05:49.455 11:29:18 -- accel/accel.sh@12 -- # build_accel_config 00:05:49.455 11:29:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:49.455 11:29:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:49.455 11:29:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:49.455 11:29:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:49.455 11:29:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:49.455 11:29:18 -- accel/accel.sh@41 -- # local IFS=, 00:05:49.455 11:29:18 -- accel/accel.sh@42 -- # jq -r . 
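The -c /dev/fd/62 argument on every accel_perf invocation in this log is the footprint of bash process substitution: build_accel_config assembles the accel JSON in memory (the accel_json_cfg array and the jq -r . step traced above), and the harness hands it to the child as -c <(...), so accel_perf reads its configuration from a /dev/fd path rather than a file on disk. A minimal bash sketch of the same mechanism, with a hypothetical one-key JSON standing in for the real accel config:

    # bash process substitution: the shell exposes a pipe as /dev/fd/NN,
    # which the child process opens like an ordinary config file
    cfg='{"module": "software"}'    # hypothetical config contents
    cat <(jq -r . <<< "$cfg")       # cat stands in here for accel_perf -c /dev/fd/NN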
00:05:49.455 [2024-07-21 11:29:18.551561] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:49.455 [2024-07-21 11:29:18.551652] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2050202 ] 00:05:49.455 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.455 [2024-07-21 11:29:18.621652] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.455 [2024-07-21 11:29:18.656210] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val= 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val= 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val=0x1 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val= 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val= 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val=copy_crc32c 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val=0 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val= 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val=software 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@23 -- # accel_module=software 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val=32 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 
00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val=32 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val=1 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.455 11:29:18 -- accel/accel.sh@21 -- # val=Yes 00:05:49.455 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.455 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.456 11:29:18 -- accel/accel.sh@21 -- # val= 00:05:49.456 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.456 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.456 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:49.456 11:29:18 -- accel/accel.sh@21 -- # val= 00:05:49.456 11:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.456 11:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:49.456 11:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:50.442 11:29:19 -- accel/accel.sh@21 -- # val= 00:05:50.442 11:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:50.442 11:29:19 -- accel/accel.sh@21 -- # val= 00:05:50.442 11:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:50.442 11:29:19 -- accel/accel.sh@21 -- # val= 00:05:50.442 11:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:50.442 11:29:19 -- accel/accel.sh@21 -- # val= 00:05:50.442 11:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:50.442 11:29:19 -- accel/accel.sh@21 -- # val= 00:05:50.442 11:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:50.442 11:29:19 -- accel/accel.sh@21 -- # val= 00:05:50.442 11:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:50.442 11:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:50.442 11:29:19 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:50.442 11:29:19 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:05:50.442 11:29:19 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:50.442 00:05:50.442 real 0m2.580s 00:05:50.442 user 0m2.327s 00:05:50.442 sys 0m0.262s 00:05:50.442 11:29:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:50.442 11:29:19 -- common/autotest_common.sh@10 -- # set +x 00:05:50.442 ************************************ 00:05:50.442 END TEST accel_copy_crc32c 00:05:50.442 ************************************ 00:05:50.442 
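The Bandwidth column in each results table above follows directly from the transfer rate: transfers per second times the configured transfer size, reported in MiB/s with integer truncation. A one-line shell check of the copy_crc32c result just printed, using only numbers taken from the table:

    # 415744 transfers/s x 4096 bytes per transfer -> MiB/s
    echo $(( 415744 * 4096 / 1024 / 1024 ))    # prints 1624, matching the table

The same arithmetic covers the 8192-byte -C 2 run that follows (293344 x 8192 / 2^20, roughly 2291 MiB/s), and since only core 0 is active, the per-core row and the Total row of each table must agree.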
11:29:19 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:05:50.442 11:29:19 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:05:50.442 11:29:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:50.701 11:29:19 -- common/autotest_common.sh@10 -- # set +x 00:05:50.701 ************************************ 00:05:50.701 START TEST accel_copy_crc32c_C2 00:05:50.701 ************************************ 00:05:50.701 11:29:19 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:05:50.701 11:29:19 -- accel/accel.sh@16 -- # local accel_opc 00:05:50.701 11:29:19 -- accel/accel.sh@17 -- # local accel_module 00:05:50.701 11:29:19 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:05:50.701 11:29:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:05:50.701 11:29:19 -- accel/accel.sh@12 -- # build_accel_config 00:05:50.701 11:29:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:50.701 11:29:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:50.701 11:29:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:50.701 11:29:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:50.701 11:29:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:50.701 11:29:19 -- accel/accel.sh@41 -- # local IFS=, 00:05:50.701 11:29:19 -- accel/accel.sh@42 -- # jq -r . 00:05:50.701 [2024-07-21 11:29:19.894456] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:50.701 [2024-07-21 11:29:19.894546] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2050396 ] 00:05:50.701 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.701 [2024-07-21 11:29:19.965703] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.701 [2024-07-21 11:29:20.001363] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.081 11:29:21 -- accel/accel.sh@18 -- # out=' 00:05:52.081 SPDK Configuration: 00:05:52.081 Core mask: 0x1 00:05:52.081 00:05:52.081 Accel Perf Configuration: 00:05:52.081 Workload Type: copy_crc32c 00:05:52.081 CRC-32C seed: 0 00:05:52.081 Vector size: 4096 bytes 00:05:52.081 Transfer size: 8192 bytes 00:05:52.081 Vector count 2 00:05:52.081 Module: software 00:05:52.081 Queue depth: 32 00:05:52.081 Allocate depth: 32 00:05:52.081 # threads/core: 1 00:05:52.081 Run time: 1 seconds 00:05:52.081 Verify: Yes 00:05:52.081 00:05:52.081 Running for 1 seconds... 
00:05:52.081 00:05:52.081 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:52.081 ------------------------------------------------------------------------------------ 00:05:52.081 0,0 293344/s 2291 MiB/s 0 0 00:05:52.081 ==================================================================================== 00:05:52.081 Total 293344/s 2291 MiB/s 0 0' 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:05:52.081 11:29:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:05:52.081 11:29:21 -- accel/accel.sh@12 -- # build_accel_config 00:05:52.081 11:29:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:52.081 11:29:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:52.081 11:29:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:52.081 11:29:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:52.081 11:29:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:52.081 11:29:21 -- accel/accel.sh@41 -- # local IFS=, 00:05:52.081 11:29:21 -- accel/accel.sh@42 -- # jq -r . 00:05:52.081 [2024-07-21 11:29:21.185970] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:52.081 [2024-07-21 11:29:21.186055] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2050538 ] 00:05:52.081 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.081 [2024-07-21 11:29:21.255141] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.081 [2024-07-21 11:29:21.289502] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val= 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val= 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val=0x1 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val= 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val= 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val=copy_crc32c 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val=0 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # 
IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val='8192 bytes' 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val= 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val=software 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@23 -- # accel_module=software 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val=32 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val=32 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val=1 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val=Yes 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val= 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:52.081 11:29:21 -- accel/accel.sh@21 -- # val= 00:05:52.081 11:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:52.081 11:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:53.455 11:29:22 -- accel/accel.sh@21 -- # val= 00:05:53.455 11:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:53.455 11:29:22 -- accel/accel.sh@21 -- # val= 00:05:53.455 11:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:53.455 11:29:22 -- accel/accel.sh@21 -- # val= 00:05:53.455 11:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:53.455 11:29:22 -- accel/accel.sh@21 -- # val= 00:05:53.455 11:29:22 -- 
accel/accel.sh@22 -- # case "$var" in 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:53.455 11:29:22 -- accel/accel.sh@21 -- # val= 00:05:53.455 11:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:53.455 11:29:22 -- accel/accel.sh@21 -- # val= 00:05:53.455 11:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:53.455 11:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:53.455 11:29:22 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:53.455 11:29:22 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:05:53.455 11:29:22 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:53.455 00:05:53.455 real 0m2.584s 00:05:53.455 user 0m2.335s 00:05:53.455 sys 0m0.259s 00:05:53.455 11:29:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:53.455 11:29:22 -- common/autotest_common.sh@10 -- # set +x 00:05:53.455 ************************************ 00:05:53.455 END TEST accel_copy_crc32c_C2 00:05:53.455 ************************************ 00:05:53.455 11:29:22 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:05:53.455 11:29:22 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:05:53.455 11:29:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:53.455 11:29:22 -- common/autotest_common.sh@10 -- # set +x 00:05:53.455 ************************************ 00:05:53.455 START TEST accel_dualcast 00:05:53.455 ************************************ 00:05:53.455 11:29:22 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:05:53.455 11:29:22 -- accel/accel.sh@16 -- # local accel_opc 00:05:53.455 11:29:22 -- accel/accel.sh@17 -- # local accel_module 00:05:53.455 11:29:22 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:05:53.455 11:29:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:05:53.455 11:29:22 -- accel/accel.sh@12 -- # build_accel_config 00:05:53.455 11:29:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:53.455 11:29:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:53.455 11:29:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:53.455 11:29:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:53.455 11:29:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:53.455 11:29:22 -- accel/accel.sh@41 -- # local IFS=, 00:05:53.455 11:29:22 -- accel/accel.sh@42 -- # jq -r . 00:05:53.455 [2024-07-21 11:29:22.525843] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:53.455 [2024-07-21 11:29:22.525935] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2050799 ] 00:05:53.455 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.455 [2024-07-21 11:29:22.596204] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.455 [2024-07-21 11:29:22.631213] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.390 11:29:23 -- accel/accel.sh@18 -- # out=' 00:05:54.390 SPDK Configuration: 00:05:54.390 Core mask: 0x1 00:05:54.390 00:05:54.390 Accel Perf Configuration: 00:05:54.390 Workload Type: dualcast 00:05:54.390 Transfer size: 4096 bytes 00:05:54.390 Vector count 1 00:05:54.390 Module: software 00:05:54.390 Queue depth: 32 00:05:54.390 Allocate depth: 32 00:05:54.390 # threads/core: 1 00:05:54.390 Run time: 1 seconds 00:05:54.390 Verify: Yes 00:05:54.390 00:05:54.390 Running for 1 seconds... 00:05:54.390 00:05:54.390 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:54.390 ------------------------------------------------------------------------------------ 00:05:54.390 0,0 661824/s 2585 MiB/s 0 0 00:05:54.390 ==================================================================================== 00:05:54.390 Total 661824/s 2585 MiB/s 0 0' 00:05:54.390 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.390 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.390 11:29:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:05:54.390 11:29:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:05:54.390 11:29:23 -- accel/accel.sh@12 -- # build_accel_config 00:05:54.390 11:29:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:54.390 11:29:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:54.390 11:29:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:54.390 11:29:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:54.390 11:29:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:54.390 11:29:23 -- accel/accel.sh@41 -- # local IFS=, 00:05:54.390 11:29:23 -- accel/accel.sh@42 -- # jq -r . 00:05:54.390 [2024-07-21 11:29:23.811702] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:54.390 [2024-07-21 11:29:23.811792] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2051073 ] 00:05:54.648 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.648 [2024-07-21 11:29:23.880457] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.648 [2024-07-21 11:29:23.914637] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val= 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val= 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val=0x1 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val= 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val= 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val=dualcast 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val= 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val=software 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@23 -- # accel_module=software 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val=32 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val=32 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val=1 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 
-- accel/accel.sh@21 -- # val='1 seconds' 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val=Yes 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val= 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:54.648 11:29:23 -- accel/accel.sh@21 -- # val= 00:05:54.648 11:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # IFS=: 00:05:54.648 11:29:23 -- accel/accel.sh@20 -- # read -r var val 00:05:56.021 11:29:25 -- accel/accel.sh@21 -- # val= 00:05:56.021 11:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # IFS=: 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # read -r var val 00:05:56.021 11:29:25 -- accel/accel.sh@21 -- # val= 00:05:56.021 11:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # IFS=: 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # read -r var val 00:05:56.021 11:29:25 -- accel/accel.sh@21 -- # val= 00:05:56.021 11:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # IFS=: 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # read -r var val 00:05:56.021 11:29:25 -- accel/accel.sh@21 -- # val= 00:05:56.021 11:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # IFS=: 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # read -r var val 00:05:56.021 11:29:25 -- accel/accel.sh@21 -- # val= 00:05:56.021 11:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # IFS=: 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # read -r var val 00:05:56.021 11:29:25 -- accel/accel.sh@21 -- # val= 00:05:56.021 11:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # IFS=: 00:05:56.021 11:29:25 -- accel/accel.sh@20 -- # read -r var val 00:05:56.021 11:29:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:56.021 11:29:25 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:05:56.021 11:29:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:56.021 00:05:56.021 real 0m2.577s 00:05:56.021 user 0m2.328s 00:05:56.021 sys 0m0.256s 00:05:56.021 11:29:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:56.021 11:29:25 -- common/autotest_common.sh@10 -- # set +x 00:05:56.021 ************************************ 00:05:56.021 END TEST accel_dualcast 00:05:56.021 ************************************ 00:05:56.021 11:29:25 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:05:56.021 11:29:25 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:05:56.021 11:29:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:56.021 11:29:25 -- common/autotest_common.sh@10 -- # set +x 00:05:56.021 ************************************ 00:05:56.021 START TEST accel_compare 00:05:56.021 ************************************ 00:05:56.021 11:29:25 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:05:56.021 11:29:25 -- accel/accel.sh@16 -- # local accel_opc 00:05:56.021 11:29:25 
-- accel/accel.sh@17 -- # local accel_module 00:05:56.021 11:29:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:05:56.021 11:29:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:05:56.021 11:29:25 -- accel/accel.sh@12 -- # build_accel_config 00:05:56.021 11:29:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:56.021 11:29:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:56.021 11:29:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:56.021 11:29:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:56.021 11:29:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:56.021 11:29:25 -- accel/accel.sh@41 -- # local IFS=, 00:05:56.021 11:29:25 -- accel/accel.sh@42 -- # jq -r . 00:05:56.021 [2024-07-21 11:29:25.152144] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:56.021 [2024-07-21 11:29:25.152236] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2051354 ] 00:05:56.021 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.021 [2024-07-21 11:29:25.222920] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.021 [2024-07-21 11:29:25.258055] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.397 11:29:26 -- accel/accel.sh@18 -- # out=' 00:05:57.397 SPDK Configuration: 00:05:57.397 Core mask: 0x1 00:05:57.397 00:05:57.397 Accel Perf Configuration: 00:05:57.397 Workload Type: compare 00:05:57.397 Transfer size: 4096 bytes 00:05:57.397 Vector count 1 00:05:57.397 Module: software 00:05:57.397 Queue depth: 32 00:05:57.397 Allocate depth: 32 00:05:57.397 # threads/core: 1 00:05:57.397 Run time: 1 seconds 00:05:57.397 Verify: Yes 00:05:57.397 00:05:57.397 Running for 1 seconds... 00:05:57.397 00:05:57.397 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:57.397 ------------------------------------------------------------------------------------ 00:05:57.397 0,0 819968/s 3203 MiB/s 0 0 00:05:57.397 ==================================================================================== 00:05:57.397 Total 819968/s 3203 MiB/s 0 0' 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:05:57.397 11:29:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:05:57.397 11:29:26 -- accel/accel.sh@12 -- # build_accel_config 00:05:57.397 11:29:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:57.397 11:29:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:57.397 11:29:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:57.397 11:29:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:57.397 11:29:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:57.397 11:29:26 -- accel/accel.sh@41 -- # local IFS=, 00:05:57.397 11:29:26 -- accel/accel.sh@42 -- # jq -r . 00:05:57.397 [2024-07-21 11:29:26.437737] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:57.397 [2024-07-21 11:29:26.437828] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2051620 ] 00:05:57.397 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.397 [2024-07-21 11:29:26.506612] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.397 [2024-07-21 11:29:26.540119] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val= 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val= 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val=0x1 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val= 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val= 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val=compare 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@24 -- # accel_opc=compare 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val= 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val=software 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@23 -- # accel_module=software 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val=32 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val=32 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val=1 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val=Yes 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val= 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:57.397 11:29:26 -- accel/accel.sh@21 -- # val= 00:05:57.397 11:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # IFS=: 00:05:57.397 11:29:26 -- accel/accel.sh@20 -- # read -r var val 00:05:58.334 11:29:27 -- accel/accel.sh@21 -- # val= 00:05:58.334 11:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.334 11:29:27 -- accel/accel.sh@21 -- # val= 00:05:58.334 11:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.334 11:29:27 -- accel/accel.sh@21 -- # val= 00:05:58.334 11:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.334 11:29:27 -- accel/accel.sh@21 -- # val= 00:05:58.334 11:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.334 11:29:27 -- accel/accel.sh@21 -- # val= 00:05:58.334 11:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.334 11:29:27 -- accel/accel.sh@21 -- # val= 00:05:58.334 11:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # IFS=: 00:05:58.334 11:29:27 -- accel/accel.sh@20 -- # read -r var val 00:05:58.334 11:29:27 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:58.334 11:29:27 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:05:58.334 11:29:27 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:58.334 00:05:58.334 real 0m2.576s 00:05:58.334 user 0m2.322s 00:05:58.334 sys 0m0.263s 00:05:58.334 11:29:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:58.334 11:29:27 -- common/autotest_common.sh@10 -- # set +x 00:05:58.334 ************************************ 00:05:58.334 END TEST accel_compare 00:05:58.334 ************************************ 00:05:58.334 11:29:27 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:05:58.334 11:29:27 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:05:58.334 11:29:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:58.334 11:29:27 -- common/autotest_common.sh@10 -- # set +x 00:05:58.334 ************************************ 00:05:58.334 START TEST accel_xor 00:05:58.334 ************************************ 00:05:58.334 11:29:27 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:05:58.334 11:29:27 -- accel/accel.sh@16 -- # local accel_opc 00:05:58.334 11:29:27 -- accel/accel.sh@17 
-- # local accel_module 00:05:58.334 11:29:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:05:58.334 11:29:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:05:58.593 11:29:27 -- accel/accel.sh@12 -- # build_accel_config 00:05:58.593 11:29:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:58.593 11:29:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:58.593 11:29:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:58.593 11:29:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:58.593 11:29:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:58.593 11:29:27 -- accel/accel.sh@41 -- # local IFS=, 00:05:58.593 11:29:27 -- accel/accel.sh@42 -- # jq -r . 00:05:58.593 [2024-07-21 11:29:27.774922] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:58.593 [2024-07-21 11:29:27.775025] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2051902 ] 00:05:58.593 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.593 [2024-07-21 11:29:27.843023] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.593 [2024-07-21 11:29:27.878324] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.971 11:29:29 -- accel/accel.sh@18 -- # out=' 00:05:59.971 SPDK Configuration: 00:05:59.971 Core mask: 0x1 00:05:59.971 00:05:59.971 Accel Perf Configuration: 00:05:59.971 Workload Type: xor 00:05:59.971 Source buffers: 2 00:05:59.971 Transfer size: 4096 bytes 00:05:59.971 Vector count 1 00:05:59.971 Module: software 00:05:59.971 Queue depth: 32 00:05:59.971 Allocate depth: 32 00:05:59.971 # threads/core: 1 00:05:59.971 Run time: 1 seconds 00:05:59.971 Verify: Yes 00:05:59.971 00:05:59.971 Running for 1 seconds... 00:05:59.971 00:05:59.971 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:59.971 ------------------------------------------------------------------------------------ 00:05:59.971 0,0 688352/s 2688 MiB/s 0 0 00:05:59.971 ==================================================================================== 00:05:59.971 Total 688352/s 2688 MiB/s 0 0' 00:05:59.971 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.971 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.971 11:29:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:05:59.971 11:29:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:05:59.971 11:29:29 -- accel/accel.sh@12 -- # build_accel_config 00:05:59.971 11:29:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:59.971 11:29:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:59.971 11:29:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:59.971 11:29:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:59.971 11:29:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:59.971 11:29:29 -- accel/accel.sh@41 -- # local IFS=, 00:05:59.971 11:29:29 -- accel/accel.sh@42 -- # jq -r . 00:05:59.971 [2024-07-21 11:29:29.059039] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
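Note: the whole accel_xor pass above reduces to the accel_perf command the harness logs at accel.sh@12. The sketch below repeats that run by hand; it deliberately drops the '-c /dev/fd/62' argument (the JSON accel config that build_accel_config assembles and feeds in on fd 62), so the software module defaults apply. Treat that omission as an assumption, not something this job does.

  # Sketch: rerun the 2-source-buffer xor workload outside the test harness.
  ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
  "$ACCEL_PERF" -t 1 -w xor -y   # -t: run time in seconds, -w: workload, -y: verify results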
00:05:59.971 [2024-07-21 11:29:29.059122] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2052049 ] 00:05:59.971 EAL: No free 2048 kB hugepages reported on node 1 00:05:59.971 [2024-07-21 11:29:29.127728] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.971 [2024-07-21 11:29:29.161881] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.971 11:29:29 -- accel/accel.sh@21 -- # val= 00:05:59.971 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.971 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.971 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.971 11:29:29 -- accel/accel.sh@21 -- # val= 00:05:59.971 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.971 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val=0x1 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val= 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val= 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val=xor 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@24 -- # accel_opc=xor 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val=2 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val= 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val=software 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@23 -- # accel_module=software 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val=32 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val=32 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- 
accel/accel.sh@21 -- # val=1 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val=Yes 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val= 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:05:59.972 11:29:29 -- accel/accel.sh@21 -- # val= 00:05:59.972 11:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # IFS=: 00:05:59.972 11:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:00.907 11:29:30 -- accel/accel.sh@21 -- # val= 00:06:00.907 11:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:00.907 11:29:30 -- accel/accel.sh@21 -- # val= 00:06:00.907 11:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:00.907 11:29:30 -- accel/accel.sh@21 -- # val= 00:06:00.907 11:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:00.907 11:29:30 -- accel/accel.sh@21 -- # val= 00:06:00.907 11:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:00.907 11:29:30 -- accel/accel.sh@21 -- # val= 00:06:00.907 11:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:00.907 11:29:30 -- accel/accel.sh@21 -- # val= 00:06:00.907 11:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:00.907 11:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:00.907 11:29:30 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:00.907 11:29:30 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:00.907 11:29:30 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:00.907 00:06:00.907 real 0m2.574s 00:06:00.907 user 0m2.337s 00:06:00.907 sys 0m0.246s 00:06:00.907 11:29:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:00.907 11:29:30 -- common/autotest_common.sh@10 -- # set +x 00:06:00.907 ************************************ 00:06:00.907 END TEST accel_xor 00:06:00.907 ************************************ 00:06:01.166 11:29:30 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:01.166 11:29:30 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:01.166 11:29:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.166 11:29:30 -- common/autotest_common.sh@10 -- # set +x 00:06:01.166 ************************************ 00:06:01.166 START TEST accel_xor 
00:06:01.166 ************************************ 00:06:01.166 11:29:30 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:06:01.166 11:29:30 -- accel/accel.sh@16 -- # local accel_opc 00:06:01.166 11:29:30 -- accel/accel.sh@17 -- # local accel_module 00:06:01.166 11:29:30 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:01.166 11:29:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:01.166 11:29:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.166 11:29:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:01.166 11:29:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.166 11:29:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.166 11:29:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:01.166 11:29:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:01.166 11:29:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:01.166 11:29:30 -- accel/accel.sh@42 -- # jq -r . 00:06:01.166 [2024-07-21 11:29:30.398552] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:01.166 [2024-07-21 11:29:30.398640] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2052235 ] 00:06:01.166 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.166 [2024-07-21 11:29:30.469019] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.166 [2024-07-21 11:29:30.505211] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.545 11:29:31 -- accel/accel.sh@18 -- # out=' 00:06:02.545 SPDK Configuration: 00:06:02.545 Core mask: 0x1 00:06:02.545 00:06:02.545 Accel Perf Configuration: 00:06:02.545 Workload Type: xor 00:06:02.545 Source buffers: 3 00:06:02.545 Transfer size: 4096 bytes 00:06:02.545 Vector count 1 00:06:02.545 Module: software 00:06:02.545 Queue depth: 32 00:06:02.545 Allocate depth: 32 00:06:02.545 # threads/core: 1 00:06:02.545 Run time: 1 seconds 00:06:02.545 Verify: Yes 00:06:02.545 00:06:02.545 Running for 1 seconds... 00:06:02.545 00:06:02.545 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:02.545 ------------------------------------------------------------------------------------ 00:06:02.545 0,0 635104/s 2480 MiB/s 0 0 00:06:02.545 ==================================================================================== 00:06:02.545 Total 635104/s 2480 MiB/s 0 0' 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:02.545 11:29:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:02.545 11:29:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:02.545 11:29:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:02.545 11:29:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:02.545 11:29:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:02.545 11:29:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:02.545 11:29:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:02.545 11:29:31 -- accel/accel.sh@41 -- # local IFS=, 00:06:02.545 11:29:31 -- accel/accel.sh@42 -- # jq -r . 00:06:02.545 [2024-07-21 11:29:31.685577] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
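Note: the only change from the previous accel_xor pass is the extra '-x 3', which raises Source buffers from 2 to 3 and costs some throughput (688352/s drops to 635104/s). The Bandwidth column is simply transfers/s times the 4096-byte transfer size; a quick sanity check:

  # 635104 transfers/s * 4096 B per transfer, expressed in MiB/s (truncated):
  awk 'BEGIN { printf "%d MiB/s\n", 635104 * 4096 / (1024 * 1024) }'   # prints 2480 MiB/s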
00:06:02.545 [2024-07-21 11:29:31.685667] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2052487 ] 00:06:02.545 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.545 [2024-07-21 11:29:31.754109] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.545 [2024-07-21 11:29:31.788300] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val= 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val= 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val=0x1 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val= 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val= 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val=xor 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val=3 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val= 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val=software 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@23 -- # accel_module=software 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val=32 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val=32 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- 
accel/accel.sh@21 -- # val=1 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val=Yes 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val= 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:02.545 11:29:31 -- accel/accel.sh@21 -- # val= 00:06:02.545 11:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:02.545 11:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:03.922 11:29:32 -- accel/accel.sh@21 -- # val= 00:06:03.922 11:29:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # IFS=: 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # read -r var val 00:06:03.922 11:29:32 -- accel/accel.sh@21 -- # val= 00:06:03.922 11:29:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # IFS=: 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # read -r var val 00:06:03.922 11:29:32 -- accel/accel.sh@21 -- # val= 00:06:03.922 11:29:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # IFS=: 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # read -r var val 00:06:03.922 11:29:32 -- accel/accel.sh@21 -- # val= 00:06:03.922 11:29:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # IFS=: 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # read -r var val 00:06:03.922 11:29:32 -- accel/accel.sh@21 -- # val= 00:06:03.922 11:29:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # IFS=: 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # read -r var val 00:06:03.922 11:29:32 -- accel/accel.sh@21 -- # val= 00:06:03.922 11:29:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # IFS=: 00:06:03.922 11:29:32 -- accel/accel.sh@20 -- # read -r var val 00:06:03.922 11:29:32 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:03.922 11:29:32 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:03.922 11:29:32 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:03.922 00:06:03.922 real 0m2.577s 00:06:03.922 user 0m2.319s 00:06:03.922 sys 0m0.265s 00:06:03.922 11:29:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:03.922 11:29:32 -- common/autotest_common.sh@10 -- # set +x 00:06:03.922 ************************************ 00:06:03.922 END TEST accel_xor 00:06:03.922 ************************************ 00:06:03.922 11:29:32 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:03.922 11:29:32 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:03.922 11:29:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:03.922 11:29:32 -- common/autotest_common.sh@10 -- # set +x 00:06:03.922 ************************************ 00:06:03.922 START TEST 
accel_dif_verify 00:06:03.922 ************************************ 00:06:03.922 11:29:33 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:06:03.922 11:29:33 -- accel/accel.sh@16 -- # local accel_opc 00:06:03.922 11:29:33 -- accel/accel.sh@17 -- # local accel_module 00:06:03.922 11:29:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:03.922 11:29:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:03.922 11:29:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:03.922 11:29:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:03.922 11:29:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:03.922 11:29:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:03.922 11:29:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:03.922 11:29:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:03.922 11:29:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:03.922 11:29:33 -- accel/accel.sh@42 -- # jq -r . 00:06:03.922 [2024-07-21 11:29:33.022963] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:03.922 [2024-07-21 11:29:33.023045] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2052777 ] 00:06:03.922 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.922 [2024-07-21 11:29:33.091774] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.922 [2024-07-21 11:29:33.126554] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.298 11:29:34 -- accel/accel.sh@18 -- # out=' 00:06:05.298 SPDK Configuration: 00:06:05.298 Core mask: 0x1 00:06:05.298 00:06:05.298 Accel Perf Configuration: 00:06:05.298 Workload Type: dif_verify 00:06:05.298 Vector size: 4096 bytes 00:06:05.298 Transfer size: 4096 bytes 00:06:05.298 Block size: 512 bytes 00:06:05.298 Metadata size: 8 bytes 00:06:05.298 Vector count 1 00:06:05.298 Module: software 00:06:05.298 Queue depth: 32 00:06:05.298 Allocate depth: 32 00:06:05.298 # threads/core: 1 00:06:05.298 Run time: 1 seconds 00:06:05.298 Verify: No 00:06:05.298 00:06:05.298 Running for 1 seconds... 00:06:05.298 00:06:05.298 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:05.298 ------------------------------------------------------------------------------------ 00:06:05.298 0,0 235168/s 918 MiB/s 0 0 00:06:05.298 ==================================================================================== 00:06:05.298 Total 235168/s 918 MiB/s 0 0' 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:05.298 11:29:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:05.298 11:29:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:05.298 11:29:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:05.298 11:29:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.298 11:29:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.298 11:29:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:05.298 11:29:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:05.298 11:29:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:05.298 11:29:34 -- accel/accel.sh@42 -- # jq -r .
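Note: the long runs of 'IFS=:' / 'read -r var val' / 'case "$var" in' xtrace lines around each run come from the loop in accel.sh that walks accel_perf's configuration output; the trace shows it capturing accel_opc and accel_module (accel.sh@23/@24) for the '[[ -n ... ]]' assertions at accel.sh@28. A rough sketch of that loop's shape, reconstructed from the xtrace rather than quoted from the script:

  # Split each 'key: value' line of the captured accel_perf output in $out,
  # remembering the workload and module for the assertions that follow.
  while IFS=: read -r var val; do
      case "$var" in
          *'Workload Type'*) accel_opc=$(echo $val) ;;    # e.g. dif_verify
          *'Module'*)        accel_module=$(echo $val) ;; # e.g. software
      esac
  done <<< "$out"
  [[ -n $accel_opc ]] && [[ -n $accel_module ]]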
00:06:05.298 [2024-07-21 11:29:34.306682] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:05.298 [2024-07-21 11:29:34.306764] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2053043 ] 00:06:05.298 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.298 [2024-07-21 11:29:34.374541] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.298 [2024-07-21 11:29:34.408728] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val= 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val= 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val=0x1 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val= 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val= 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val=dif_verify 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val= 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val=software 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val=32 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val=32 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val=1 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val=No 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val= 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:05.298 11:29:34 -- accel/accel.sh@21 -- # val= 00:06:05.298 11:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:05.298 11:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:06.233 11:29:35 -- accel/accel.sh@21 -- # val= 00:06:06.233 11:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.233 11:29:35 -- accel/accel.sh@21 -- # val= 00:06:06.233 11:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.233 11:29:35 -- accel/accel.sh@21 -- # val= 00:06:06.233 11:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.233 11:29:35 -- accel/accel.sh@21 -- # val= 00:06:06.233 11:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.233 11:29:35 -- accel/accel.sh@21 -- # val= 00:06:06.233 11:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.233 11:29:35 -- accel/accel.sh@21 -- # val= 00:06:06.233 11:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:06.233 11:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:06.233 11:29:35 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:06.233 11:29:35 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:06.233 11:29:35 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:06.233 00:06:06.233 real 0m2.573s 00:06:06.233 user 0m2.315s 00:06:06.233 sys 0m0.269s 00:06:06.233 11:29:35 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:06.233 11:29:35 -- common/autotest_common.sh@10 -- # set +x 00:06:06.233 ************************************ 00:06:06.233 END TEST accel_dif_verify 00:06:06.233 ************************************ 00:06:06.233 11:29:35 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:06.233 11:29:35 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:06.233 11:29:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:06.233 11:29:35 -- common/autotest_common.sh@10 -- # set +x 00:06:06.233 ************************************ 00:06:06.233 START TEST accel_dif_generate 00:06:06.233 ************************************ 00:06:06.233 11:29:35 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:06:06.233 11:29:35 -- accel/accel.sh@16 -- # local accel_opc 00:06:06.233 11:29:35 -- accel/accel.sh@17 -- # local accel_module 00:06:06.233 11:29:35 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:06.233 11:29:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:06.233 11:29:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:06.233 11:29:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:06.233 11:29:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:06.233 11:29:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:06.233 11:29:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:06.233 11:29:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:06.233 11:29:35 -- accel/accel.sh@41 -- # local IFS=, 00:06:06.233 11:29:35 -- accel/accel.sh@42 -- # jq -r . 00:06:06.233 [2024-07-21 11:29:35.644257] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:06.233 [2024-07-21 11:29:35.644349] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2053326 ] 00:06:06.491 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.491 [2024-07-21 11:29:35.714022] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.491 [2024-07-21 11:29:35.749174] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.865 11:29:36 -- accel/accel.sh@18 -- # out=' 00:06:07.865 SPDK Configuration: 00:06:07.865 Core mask: 0x1 00:06:07.865 00:06:07.865 Accel Perf Configuration: 00:06:07.865 Workload Type: dif_generate 00:06:07.865 Vector size: 4096 bytes 00:06:07.865 Transfer size: 4096 bytes 00:06:07.865 Block size: 512 bytes 00:06:07.865 Metadata size: 8 bytes 00:06:07.865 Vector count 1 00:06:07.865 Module: software 00:06:07.865 Queue depth: 32 00:06:07.865 Allocate depth: 32 00:06:07.865 # threads/core: 1 00:06:07.865 Run time: 1 seconds 00:06:07.865 Verify: No 00:06:07.865 00:06:07.865 Running for 1 seconds... 
00:06:07.865 00:06:07.865 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:07.865 ------------------------------------------------------------------------------------ 00:06:07.865 0,0 290144/s 1133 MiB/s 0 0 00:06:07.865 ==================================================================================== 00:06:07.865 Total 290144/s 1133 MiB/s 0 0' 00:06:07.865 11:29:36 -- accel/accel.sh@20 -- # IFS=: 00:06:07.865 11:29:36 -- accel/accel.sh@20 -- # read -r var val 00:06:07.865 11:29:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:07.865 11:29:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:07.865 11:29:36 -- accel/accel.sh@12 -- # build_accel_config 00:06:07.865 11:29:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:07.865 11:29:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:07.865 11:29:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:07.865 11:29:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:07.865 11:29:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:07.865 11:29:36 -- accel/accel.sh@41 -- # local IFS=, 00:06:07.865 11:29:36 -- accel/accel.sh@42 -- # jq -r . 00:06:07.865 [2024-07-21 11:29:36.930617] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:07.865 [2024-07-21 11:29:36.930708] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2053532 ] 00:06:07.865 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.865 [2024-07-21 11:29:36.998860] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.865 [2024-07-21 11:29:37.033255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.865 11:29:37 -- accel/accel.sh@21 -- # val= 00:06:07.865 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.865 11:29:37 -- accel/accel.sh@21 -- # val= 00:06:07.865 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.865 11:29:37 -- accel/accel.sh@21 -- # val=0x1 00:06:07.865 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.865 11:29:37 -- accel/accel.sh@21 -- # val= 00:06:07.865 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.865 11:29:37 -- accel/accel.sh@21 -- # val= 00:06:07.865 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.865 11:29:37 -- accel/accel.sh@21 -- # val=dif_generate 00:06:07.865 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.865 11:29:37 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.865 11:29:37 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:07.865 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # IFS=:
00:06:07.865 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val= 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val=software 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@23 -- # accel_module=software 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val=32 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val=32 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val=1 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val=No 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val= 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:07.866 11:29:37 -- accel/accel.sh@21 -- # val= 00:06:07.866 11:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:07.866 11:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:08.799 11:29:38 -- accel/accel.sh@21 -- # val= 00:06:08.799 11:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:08.799 11:29:38 -- accel/accel.sh@21 -- # val= 00:06:08.799 11:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:08.799 11:29:38 -- accel/accel.sh@21 -- # val= 00:06:08.799 11:29:38 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:08.799 11:29:38 -- accel/accel.sh@21 -- # val= 00:06:08.799 11:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:08.799 11:29:38 -- accel/accel.sh@21 -- # val= 00:06:08.799 11:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:08.799 11:29:38 -- accel/accel.sh@21 -- # val= 00:06:08.799 11:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:08.799 11:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:08.799 11:29:38 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:08.799 11:29:38 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:08.799 11:29:38 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:08.799 00:06:08.799 real 0m2.578s 00:06:08.799 user 0m2.330s 00:06:08.799 sys 0m0.259s 00:06:08.799 11:29:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.799 11:29:38 -- common/autotest_common.sh@10 -- # set +x 00:06:08.799 ************************************ 00:06:08.799 END TEST accel_dif_generate 00:06:08.799 ************************************ 00:06:09.058 11:29:38 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:09.058 11:29:38 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:09.058 11:29:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:09.058 11:29:38 -- common/autotest_common.sh@10 -- # set +x 00:06:09.058 ************************************ 00:06:09.058 START TEST accel_dif_generate_copy 00:06:09.058 ************************************ 00:06:09.058 11:29:38 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:06:09.058 11:29:38 -- accel/accel.sh@16 -- # local accel_opc 00:06:09.058 11:29:38 -- accel/accel.sh@17 -- # local accel_module 00:06:09.058 11:29:38 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:09.058 11:29:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:09.058 11:29:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:09.058 11:29:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:09.058 11:29:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:09.058 11:29:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:09.058 11:29:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:09.058 11:29:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:09.058 11:29:38 -- accel/accel.sh@41 -- # local IFS=, 00:06:09.058 11:29:38 -- accel/accel.sh@42 -- # jq -r . 00:06:09.058 [2024-07-21 11:29:38.272892] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
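Note: dif_generate_copy extends dif_generate by copying the payload into a destination buffer while the protection information is inserted, rather than generating it in place, and the extra copy shows up as lower throughput in the table below. Running the two workloads back to back uses the same command shape (sketch only; $ACCEL_PERF as defined in the earlier note, the harness JSON config again omitted):

  # Compare in-place DIF generation against generate+copy with identical settings.
  for w in dif_generate dif_generate_copy; do
      "$ACCEL_PERF" -t 1 -w "$w"
  done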
00:06:09.058 [2024-07-21 11:29:38.272982] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2053725 ] 00:06:09.058 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.058 [2024-07-21 11:29:38.344423] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.058 [2024-07-21 11:29:38.380168] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.436 11:29:39 -- accel/accel.sh@18 -- # out=' 00:06:10.436 SPDK Configuration: 00:06:10.436 Core mask: 0x1 00:06:10.436 00:06:10.436 Accel Perf Configuration: 00:06:10.436 Workload Type: dif_generate_copy 00:06:10.436 Vector size: 4096 bytes 00:06:10.436 Transfer size: 4096 bytes 00:06:10.436 Vector count 1 00:06:10.436 Module: software 00:06:10.436 Queue depth: 32 00:06:10.436 Allocate depth: 32 00:06:10.436 # threads/core: 1 00:06:10.436 Run time: 1 seconds 00:06:10.436 Verify: No 00:06:10.436 00:06:10.436 Running for 1 seconds... 00:06:10.436 00:06:10.436 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:10.436 ------------------------------------------------------------------------------------ 00:06:10.436 0,0 224000/s 875 MiB/s 0 0 00:06:10.436 ==================================================================================== 00:06:10.436 Total 224000/s 875 MiB/s 0 0' 00:06:10.436 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.436 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.436 11:29:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:10.436 11:29:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:10.436 11:29:39 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.437 11:29:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.437 11:29:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.437 11:29:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.437 11:29:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.437 11:29:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.437 11:29:39 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.437 11:29:39 -- accel/accel.sh@42 -- # jq -r . 00:06:10.437 [2024-07-21 11:29:39.560730] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
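Note: every workload in this stretch prints the same two-row table, and on a single-core run (-c 0x1) the per-core row and the Total row carry identical numbers, so the Total lines are the quickest summary of the whole pass. Pulling them out of a saved console log is a one-liner (the log filename below is hypothetical):

  # Summarize a captured run: each workload name followed by its Total row.
  grep -E 'Workload Type:|Total [0-9]+/s' console.log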
00:06:10.437 [2024-07-21 11:29:39.560824] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2053900 ] 00:06:10.437 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.437 [2024-07-21 11:29:39.632281] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.437 [2024-07-21 11:29:39.667201] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val= 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val= 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val=0x1 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val= 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val= 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val= 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val=software 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@23 -- # accel_module=software 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val=32 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val=32 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r 
var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val=1 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val=No 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val= 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:10.437 11:29:39 -- accel/accel.sh@21 -- # val= 00:06:10.437 11:29:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:10.437 11:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:11.815 11:29:40 -- accel/accel.sh@21 -- # val= 00:06:11.815 11:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:11.815 11:29:40 -- accel/accel.sh@21 -- # val= 00:06:11.815 11:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:11.815 11:29:40 -- accel/accel.sh@21 -- # val= 00:06:11.815 11:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:11.815 11:29:40 -- accel/accel.sh@21 -- # val= 00:06:11.815 11:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:11.815 11:29:40 -- accel/accel.sh@21 -- # val= 00:06:11.815 11:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:11.815 11:29:40 -- accel/accel.sh@21 -- # val= 00:06:11.815 11:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:11.815 11:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:11.815 11:29:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:11.815 11:29:40 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:11.815 11:29:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:11.815 00:06:11.815 real 0m2.584s 00:06:11.815 user 0m2.331s 00:06:11.815 sys 0m0.261s 00:06:11.815 11:29:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.815 11:29:40 -- common/autotest_common.sh@10 -- # set +x 00:06:11.815 ************************************ 00:06:11.815 END TEST accel_dif_generate_copy 00:06:11.815 ************************************ 00:06:11.815 11:29:40 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:11.815 11:29:40 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:11.815 11:29:40 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:11.815 11:29:40 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.815 11:29:40 -- common/autotest_common.sh@10 -- # set +x 00:06:11.815 ************************************ 00:06:11.815 START TEST accel_comp 00:06:11.815 ************************************ 00:06:11.815 11:29:40 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:11.815 11:29:40 -- accel/accel.sh@16 -- # local accel_opc 00:06:11.815 11:29:40 -- accel/accel.sh@17 -- # local accel_module 00:06:11.815 11:29:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:11.815 11:29:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:11.815 11:29:40 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.815 11:29:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.815 11:29:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.815 11:29:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.815 11:29:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.815 11:29:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.815 11:29:40 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.815 11:29:40 -- accel/accel.sh@42 -- # jq -r . 00:06:11.815 [2024-07-21 11:29:40.905866] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:11.815 [2024-07-21 11:29:40.905968] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2054186 ] 00:06:11.815 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.815 [2024-07-21 11:29:40.976245] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.815 [2024-07-21 11:29:41.011533] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.193 11:29:42 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:13.193 00:06:13.193 SPDK Configuration: 00:06:13.193 Core mask: 0x1 00:06:13.193 00:06:13.193 Accel Perf Configuration: 00:06:13.193 Workload Type: compress 00:06:13.193 Transfer size: 4096 bytes 00:06:13.193 Vector count 1 00:06:13.193 Module: software 00:06:13.193 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:13.193 Queue depth: 32 00:06:13.193 Allocate depth: 32 00:06:13.193 # threads/core: 1 00:06:13.193 Run time: 1 seconds 00:06:13.193 Verify: No 00:06:13.193 00:06:13.193 Running for 1 seconds... 
00:06:13.193 00:06:13.193 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:13.193 ------------------------------------------------------------------------------------ 00:06:13.193 0,0 66528/s 259 MiB/s 0 0 00:06:13.193 ==================================================================================== 00:06:13.193 Total 66528/s 259 MiB/s 0 0' 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:13.193 11:29:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:13.193 11:29:42 -- accel/accel.sh@12 -- # build_accel_config 00:06:13.193 11:29:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:13.193 11:29:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:13.193 11:29:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:13.193 11:29:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:13.193 11:29:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:13.193 11:29:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:13.193 11:29:42 -- accel/accel.sh@42 -- # jq -r . 00:06:13.193 [2024-07-21 11:29:42.197165] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:13.193 [2024-07-21 11:29:42.197256] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2054460 ] 00:06:13.193 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.193 [2024-07-21 11:29:42.267365] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.193 [2024-07-21 11:29:42.301856] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val= 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val= 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val= 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val=0x1 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val= 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val= 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val=compress 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193
11:29:42 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val= 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val=software 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@23 -- # accel_module=software 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val=32 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val=32 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val=1 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val=No 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val= 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:13.193 11:29:42 -- accel/accel.sh@21 -- # val= 00:06:13.193 11:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:13.193 11:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:14.129 11:29:43 -- accel/accel.sh@21 -- # val= 00:06:14.129 11:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:14.129 11:29:43 -- accel/accel.sh@21 -- # val= 00:06:14.129 11:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:14.129 11:29:43 -- accel/accel.sh@21 -- # val= 00:06:14.129 11:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # 
IFS=: 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:14.129 11:29:43 -- accel/accel.sh@21 -- # val= 00:06:14.129 11:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:14.129 11:29:43 -- accel/accel.sh@21 -- # val= 00:06:14.129 11:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:14.129 11:29:43 -- accel/accel.sh@21 -- # val= 00:06:14.129 11:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:14.129 11:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:14.129 11:29:43 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:14.129 11:29:43 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:06:14.129 11:29:43 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:14.129 00:06:14.129 real 0m2.585s 00:06:14.129 user 0m2.327s 00:06:14.129 sys 0m0.267s 00:06:14.129 11:29:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.129 11:29:43 -- common/autotest_common.sh@10 -- # set +x 00:06:14.129 ************************************ 00:06:14.129 END TEST accel_comp 00:06:14.129 ************************************ 00:06:14.129 11:29:43 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:14.129 11:29:43 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:14.129 11:29:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:14.129 11:29:43 -- common/autotest_common.sh@10 -- # set +x 00:06:14.129 ************************************ 00:06:14.129 START TEST accel_decomp 00:06:14.129 ************************************ 00:06:14.129 11:29:43 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:14.129 11:29:43 -- accel/accel.sh@16 -- # local accel_opc 00:06:14.129 11:29:43 -- accel/accel.sh@17 -- # local accel_module 00:06:14.129 11:29:43 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:14.129 11:29:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:14.129 11:29:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:14.129 11:29:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:14.129 11:29:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:14.129 11:29:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:14.130 11:29:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:14.130 11:29:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:14.130 11:29:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:14.130 11:29:43 -- accel/accel.sh@42 -- # jq -r . 00:06:14.130 [2024-07-21 11:29:43.540859] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
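A note on the repeated "-- accel/accel.sh@20/21/22" lines above and below: accel.sh runs under "set -x", and what is being traced is a loop that parses accel_perf's "Key: value" report one line at a time. A minimal sketch of that loop, reconstructed from the xtrace alone (the exact patterns in accel.sh may differ):

    # Reconstruction from the xtrace, not a verbatim copy of accel.sh.
    # accel_perf prints lines such as "Workload Type: compress" and
    # "Module: software"; each is split on ':' into var/val, and the
    # fields the test asserts on afterwards are remembered.
    while IFS=: read -r var val; do
        case "$var" in
            *"Workload Type"*) accel_opc=${val# } ;;     # e.g. "compress"
            *Module*)          accel_module=${val# } ;;  # e.g. "software"
        esac
    done <<< "$out"
    [[ -n $accel_module && -n $accel_opc ]]  # the accel.sh@28 checks above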
00:06:14.130 [2024-07-21 11:29:43.540965] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2054741 ] 00:06:14.387 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.387 [2024-07-21 11:29:43.611252] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.387 [2024-07-21 11:29:43.646345] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.761 11:29:44 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:15.761 00:06:15.761 SPDK Configuration: 00:06:15.761 Core mask: 0x1 00:06:15.761 00:06:15.761 Accel Perf Configuration: 00:06:15.761 Workload Type: decompress 00:06:15.761 Transfer size: 4096 bytes 00:06:15.761 Vector count 1 00:06:15.761 Module: software 00:06:15.761 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:15.761 Queue depth: 32 00:06:15.761 Allocate depth: 32 00:06:15.761 # threads/core: 1 00:06:15.761 Run time: 1 seconds 00:06:15.761 Verify: Yes 00:06:15.761 00:06:15.761 Running for 1 seconds... 00:06:15.761 00:06:15.761 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:15.761 ------------------------------------------------------------------------------------ 00:06:15.761 0,0 93024/s 363 MiB/s 0 0 00:06:15.761 ==================================================================================== 00:06:15.761 Total 93024/s 363 MiB/s 0 0' 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:15.761 11:29:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:15.761 11:29:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:15.761 11:29:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:15.761 11:29:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:15.761 11:29:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:15.761 11:29:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:15.761 11:29:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:15.761 11:29:44 -- accel/accel.sh@41 -- # local IFS=, 00:06:15.761 11:29:44 -- accel/accel.sh@42 -- # jq -r . 00:06:15.761 [2024-07-21 11:29:44.831057] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
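The Bandwidth column in these result tables follows directly from the Transfers column: transfers per second times the transfer size, reported in MiB/s. For the decompress run above, 93024/s * 4096 bytes = 363 MiB/s, which is why the per-core row and the Total row agree. A one-line shell check (plain arithmetic, nothing SPDK-specific):

    # recompute a row's MiB/s from its transfers/s and transfer size
    transfers=93024 size=4096
    echo $(( transfers * size / 1024 / 1024 ))   # prints 363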
00:06:15.761 [2024-07-21 11:29:44.831146] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2055009 ] 00:06:15.761 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.761 [2024-07-21 11:29:44.900350] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.761 [2024-07-21 11:29:44.934522] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val= 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val= 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val= 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val=0x1 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val= 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val= 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val=decompress 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val= 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val=software 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@23 -- # accel_module=software 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val=32 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 
11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val=32 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val=1 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.761 11:29:44 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:15.761 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.761 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.762 11:29:44 -- accel/accel.sh@21 -- # val=Yes 00:06:15.762 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.762 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.762 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.762 11:29:44 -- accel/accel.sh@21 -- # val= 00:06:15.762 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.762 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.762 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:15.762 11:29:44 -- accel/accel.sh@21 -- # val= 00:06:15.762 11:29:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.762 11:29:44 -- accel/accel.sh@20 -- # IFS=: 00:06:15.762 11:29:44 -- accel/accel.sh@20 -- # read -r var val 00:06:16.694 11:29:46 -- accel/accel.sh@21 -- # val= 00:06:16.694 11:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:16.694 11:29:46 -- accel/accel.sh@21 -- # val= 00:06:16.694 11:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:16.694 11:29:46 -- accel/accel.sh@21 -- # val= 00:06:16.694 11:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:16.694 11:29:46 -- accel/accel.sh@21 -- # val= 00:06:16.694 11:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:16.694 11:29:46 -- accel/accel.sh@21 -- # val= 00:06:16.694 11:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:16.694 11:29:46 -- accel/accel.sh@21 -- # val= 00:06:16.694 11:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:16.694 11:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:16.694 11:29:46 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:16.694 11:29:46 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:16.694 11:29:46 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:16.694 00:06:16.694 real 0m2.586s 00:06:16.694 user 0m2.335s 00:06:16.694 sys 0m0.260s 00:06:16.694 11:29:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.694 11:29:46 -- common/autotest_common.sh@10 -- # set +x 00:06:16.694 ************************************ 00:06:16.694 END TEST accel_decomp 00:06:16.694 ************************************ 00:06:16.953 11:29:46 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.953 11:29:46 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:16.953 11:29:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:16.953 11:29:46 -- common/autotest_common.sh@10 -- # set +x 00:06:16.953 ************************************ 00:06:16.953 START TEST accel_decmop_full 00:06:16.953 ************************************ 00:06:16.953 11:29:46 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.953 11:29:46 -- accel/accel.sh@16 -- # local accel_opc 00:06:16.953 11:29:46 -- accel/accel.sh@17 -- # local accel_module 00:06:16.953 11:29:46 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.953 11:29:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.953 11:29:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:16.953 11:29:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:16.953 11:29:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.953 11:29:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.953 11:29:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:16.953 11:29:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:16.953 11:29:46 -- accel/accel.sh@41 -- # local IFS=, 00:06:16.953 11:29:46 -- accel/accel.sh@42 -- # jq -r . 00:06:16.953 [2024-07-21 11:29:46.175866] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:16.954 [2024-07-21 11:29:46.175956] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2055221 ] 00:06:16.954 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.954 [2024-07-21 11:29:46.246365] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.954 [2024-07-21 11:29:46.282156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.330 11:29:47 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:18.330 00:06:18.330 SPDK Configuration: 00:06:18.330 Core mask: 0x1 00:06:18.330 00:06:18.330 Accel Perf Configuration: 00:06:18.330 Workload Type: decompress 00:06:18.330 Transfer size: 111250 bytes 00:06:18.330 Vector count 1 00:06:18.330 Module: software 00:06:18.330 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:18.330 Queue depth: 32 00:06:18.330 Allocate depth: 32 00:06:18.330 # threads/core: 1 00:06:18.330 Run time: 1 seconds 00:06:18.330 Verify: Yes 00:06:18.330 00:06:18.330 Running for 1 seconds... 
00:06:18.330 00:06:18.330 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:18.330 ------------------------------------------------------------------------------------ 00:06:18.330 0,0 5888/s 624 MiB/s 0 0 00:06:18.330 ==================================================================================== 00:06:18.330 Total 5888/s 624 MiB/s 0 0' 00:06:18.330 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.330 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.330 11:29:47 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:18.330 11:29:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:18.331 11:29:47 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.331 11:29:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:18.331 11:29:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.331 11:29:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.331 11:29:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:18.331 11:29:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:18.331 11:29:47 -- accel/accel.sh@41 -- # local IFS=, 00:06:18.331 11:29:47 -- accel/accel.sh@42 -- # jq -r . 00:06:18.331 [2024-07-21 11:29:47.471411] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:18.331 [2024-07-21 11:29:47.471510] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2055373 ] 00:06:18.331 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.331 [2024-07-21 11:29:47.539917] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.331 [2024-07-21 11:29:47.574306] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val= 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val= 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val= 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val=0x1 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val= 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val= 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val=decompress 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case
"$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val= 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val=software 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@23 -- # accel_module=software 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val=32 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val=32 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val=1 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val=Yes 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val= 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:18.331 11:29:47 -- accel/accel.sh@21 -- # val= 00:06:18.331 11:29:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # IFS=: 00:06:18.331 11:29:47 -- accel/accel.sh@20 -- # read -r var val 00:06:19.726 11:29:48 -- accel/accel.sh@21 -- # val= 00:06:19.726 11:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:19.726 11:29:48 -- accel/accel.sh@21 -- # val= 00:06:19.726 11:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:19.726 11:29:48 -- accel/accel.sh@21 -- # val= 00:06:19.726 11:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.726 11:29:48 
-- accel/accel.sh@20 -- # IFS=: 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:19.726 11:29:48 -- accel/accel.sh@21 -- # val= 00:06:19.726 11:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:19.726 11:29:48 -- accel/accel.sh@21 -- # val= 00:06:19.726 11:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:19.726 11:29:48 -- accel/accel.sh@21 -- # val= 00:06:19.726 11:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:19.726 11:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:19.726 11:29:48 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:19.726 11:29:48 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:19.726 11:29:48 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:19.726 00:06:19.726 real 0m2.597s 00:06:19.726 user 0m2.339s 00:06:19.726 sys 0m0.265s 00:06:19.726 11:29:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.726 11:29:48 -- common/autotest_common.sh@10 -- # set +x 00:06:19.726 ************************************ 00:06:19.726 END TEST accel_decmop_full 00:06:19.726 ************************************ 00:06:19.726 11:29:48 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:19.726 11:29:48 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:19.726 11:29:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.726 11:29:48 -- common/autotest_common.sh@10 -- # set +x 00:06:19.726 ************************************ 00:06:19.726 START TEST accel_decomp_mcore 00:06:19.726 ************************************ 00:06:19.726 11:29:48 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:19.726 11:29:48 -- accel/accel.sh@16 -- # local accel_opc 00:06:19.726 11:29:48 -- accel/accel.sh@17 -- # local accel_module 00:06:19.726 11:29:48 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:19.726 11:29:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:19.726 11:29:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:19.726 11:29:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:19.726 11:29:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:19.726 11:29:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:19.726 11:29:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:19.726 11:29:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:19.726 11:29:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:19.726 11:29:48 -- accel/accel.sh@42 -- # jq -r . 00:06:19.726 [2024-07-21 11:29:48.821432] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
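Every test in this section ultimately shells out to the accel_perf command shown on the accel.sh@12 trace lines. To rerun one case by hand from the workspace, something like the following should work (a sketch only: the harness's "-c /dev/fd/62" feeds a generated JSON accel config, omitted here so the tool's defaults apply):

    # hand-run of a single-core software decompress pass (paths as in
    # this workspace; flags copied from the accel.sh@15 trace lines)
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y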
00:06:19.726 [2024-07-21 11:29:48.821526] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2055601 ] 00:06:19.726 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.727 [2024-07-21 11:29:48.891432] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:19.727 [2024-07-21 11:29:48.929799] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:19.727 [2024-07-21 11:29:48.929823] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:19.727 [2024-07-21 11:29:48.929907] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:19.727 [2024-07-21 11:29:48.929909] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.699 11:29:50 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:20.699 00:06:20.699 SPDK Configuration: 00:06:20.699 Core mask: 0xf 00:06:20.699 00:06:20.699 Accel Perf Configuration: 00:06:20.699 Workload Type: decompress 00:06:20.699 Transfer size: 4096 bytes 00:06:20.699 Vector count 1 00:06:20.699 Module: software 00:06:20.699 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:20.699 Queue depth: 32 00:06:20.699 Allocate depth: 32 00:06:20.699 # threads/core: 1 00:06:20.699 Run time: 1 seconds 00:06:20.699 Verify: Yes 00:06:20.699 00:06:20.699 Running for 1 seconds... 00:06:20.699 00:06:20.699 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:20.699 ------------------------------------------------------------------------------------ 00:06:20.699 0,0 78144/s 143 MiB/s 0 0 00:06:20.699 3,0 78912/s 145 MiB/s 0 0 00:06:20.699 2,0 78624/s 144 MiB/s 0 0 00:06:20.699 1,0 78784/s 145 MiB/s 0 0 00:06:20.699 ==================================================================================== 00:06:20.699 Total 314464/s 1228 MiB/s 0 0' 00:06:20.699 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.699 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.699 11:29:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:20.699 11:29:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:20.699 11:29:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:20.699 11:29:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:20.699 11:29:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.699 11:29:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.699 11:29:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:20.699 11:29:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:20.699 11:29:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:20.699 11:29:50 -- accel/accel.sh@42 -- # jq -r . 00:06:20.957 [2024-07-21 11:29:50.126474] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:20.957 [2024-07-21 11:29:50.126564] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2055875 ] 00:06:20.957 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.957 [2024-07-21 11:29:50.197380] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:20.957 [2024-07-21 11:29:50.235160] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:20.957 [2024-07-21 11:29:50.235257] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:20.957 [2024-07-21 11:29:50.235342] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:20.957 [2024-07-21 11:29:50.235343] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val= 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val= 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val= 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val=0xf 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val= 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val= 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val=decompress 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val= 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val=software 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@23 -- # accel_module=software 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val=32 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val=32 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val=1 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val=Yes 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val= 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:20.957 11:29:50 -- accel/accel.sh@21 -- # val= 00:06:20.957 11:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:20.957 11:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 
11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.331 11:29:51 -- accel/accel.sh@21 -- # val= 00:06:22.331 11:29:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # IFS=: 00:06:22.331 11:29:51 -- accel/accel.sh@20 -- # read -r var val 00:06:22.332 11:29:51 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:22.332 11:29:51 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:22.332 11:29:51 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:22.332 00:06:22.332 real 0m2.616s 00:06:22.332 user 0m9.006s 00:06:22.332 sys 0m0.276s 00:06:22.332 11:29:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.332 11:29:51 -- common/autotest_common.sh@10 -- # set +x 00:06:22.332 ************************************ 00:06:22.332 END TEST accel_decomp_mcore 00:06:22.332 ************************************ 00:06:22.332 11:29:51 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:22.332 11:29:51 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:22.332 11:29:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:22.332 11:29:51 -- common/autotest_common.sh@10 -- # set +x 00:06:22.332 ************************************ 00:06:22.332 START TEST accel_decomp_full_mcore 00:06:22.332 ************************************ 00:06:22.332 11:29:51 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:22.332 11:29:51 -- accel/accel.sh@16 -- # local accel_opc 00:06:22.332 11:29:51 -- accel/accel.sh@17 -- # local accel_module 00:06:22.332 11:29:51 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:22.332 11:29:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:22.332 11:29:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:22.332 11:29:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.332 11:29:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.332 11:29:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.332 11:29:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.332 11:29:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.332 11:29:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.332 11:29:51 -- accel/accel.sh@42 -- # jq -r . 00:06:22.332 [2024-07-21 11:29:51.486159] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:22.332 [2024-07-21 11:29:51.486252] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2056167 ] 00:06:22.332 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.332 [2024-07-21 11:29:51.555767] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:22.332 [2024-07-21 11:29:51.593916] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:22.332 [2024-07-21 11:29:51.594011] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:22.332 [2024-07-21 11:29:51.594098] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:22.332 [2024-07-21 11:29:51.594101] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.708 11:29:52 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:23.708 00:06:23.708 SPDK Configuration: 00:06:23.708 Core mask: 0xf 00:06:23.708 00:06:23.708 Accel Perf Configuration: 00:06:23.708 Workload Type: decompress 00:06:23.708 Transfer size: 111250 bytes 00:06:23.708 Vector count 1 00:06:23.708 Module: software 00:06:23.708 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:23.708 Queue depth: 32 00:06:23.708 Allocate depth: 32 00:06:23.708 # threads/core: 1 00:06:23.708 Run time: 1 seconds 00:06:23.708 Verify: Yes 00:06:23.708 00:06:23.708 Running for 1 seconds... 00:06:23.708 00:06:23.708 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:23.708 ------------------------------------------------------------------------------------ 00:06:23.708 0,0 5792/s 614 MiB/s 0 0 00:06:23.708 3,0 5824/s 617 MiB/s 0 0 00:06:23.708 2,0 5824/s 617 MiB/s 0 0 00:06:23.708 1,0 5824/s 617 MiB/s 0 0 00:06:23.708 ==================================================================================== 00:06:23.708 Total 23264/s 2468 MiB/s 0 0' 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:23.708 11:29:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:23.708 11:29:52 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.708 11:29:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:23.708 11:29:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.708 11:29:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.708 11:29:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:23.708 11:29:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:23.708 11:29:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:23.708 11:29:52 -- accel/accel.sh@42 -- # jq -r . 00:06:23.708 [2024-07-21 11:29:52.792890] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
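In the "-o 0" full-buffer variants the transfer size is 111250 bytes instead of 4096, so roughly 5800 transfers/s per core already means ~615 MiB/s. The per-core rows should always reproduce the Total row; with the table rows saved to a file, stripped of the log timestamps (results.txt is hypothetical), a cross-check is:

    # sum the per-core transfer rates and recompute the Total row
    awk '/^[0-9]+,[0-9]+/ { sub("/s", "", $2); t += $2 }
         END { printf "%d/s  %d MiB/s\n", t, t * 111250 / 1048576 }' results.txt
    # -> 23264/s  2468 MiB/s for the run above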
00:06:23.708 [2024-07-21 11:29:52.792979] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2056436 ] 00:06:23.708 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.708 [2024-07-21 11:29:52.861571] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:23.708 [2024-07-21 11:29:52.898678] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.708 [2024-07-21 11:29:52.898775] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:23.708 [2024-07-21 11:29:52.898861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:23.708 [2024-07-21 11:29:52.898864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val= 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val= 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val= 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val=0xf 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val= 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val= 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val=decompress 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val= 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val=software 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@23 -- # accel_module=software 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val=32 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val=32 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val=1 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val=Yes 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val= 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:23.708 11:29:52 -- accel/accel.sh@21 -- # val= 00:06:23.708 11:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:23.708 11:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 
11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@21 -- # val= 00:06:25.092 11:29:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # IFS=: 00:06:25.092 11:29:54 -- accel/accel.sh@20 -- # read -r var val 00:06:25.092 11:29:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:25.092 11:29:54 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:25.092 11:29:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:25.092 00:06:25.092 real 0m2.624s 00:06:25.092 user 0m9.057s 00:06:25.092 sys 0m0.278s 00:06:25.092 11:29:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:25.092 11:29:54 -- common/autotest_common.sh@10 -- # set +x 00:06:25.092 ************************************ 00:06:25.092 END TEST accel_decomp_full_mcore 00:06:25.092 ************************************ 00:06:25.092 11:29:54 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:25.092 11:29:54 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:25.092 11:29:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:25.092 11:29:54 -- common/autotest_common.sh@10 -- # set +x 00:06:25.092 ************************************ 00:06:25.092 START TEST accel_decomp_mthread 00:06:25.092 ************************************ 00:06:25.092 11:29:54 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:25.092 11:29:54 -- accel/accel.sh@16 -- # local accel_opc 00:06:25.092 11:29:54 -- accel/accel.sh@17 -- # local accel_module 00:06:25.092 11:29:54 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:25.092 11:29:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:25.092 11:29:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:25.092 11:29:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:25.092 11:29:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:25.092 11:29:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:25.092 11:29:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:25.092 11:29:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:25.092 11:29:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:25.092 11:29:54 -- accel/accel.sh@42 -- # jq -r . 00:06:25.092 [2024-07-21 11:29:54.157075] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:25.092 [2024-07-21 11:29:54.157165] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2056722 ] 00:06:25.092 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.092 [2024-07-21 11:29:54.225626] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.092 [2024-07-21 11:29:54.260916] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.026 11:29:55 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:26.026 00:06:26.026 SPDK Configuration: 00:06:26.026 Core mask: 0x1 00:06:26.026 00:06:26.026 Accel Perf Configuration: 00:06:26.026 Workload Type: decompress 00:06:26.026 Transfer size: 4096 bytes 00:06:26.026 Vector count 1 00:06:26.026 Module: software 00:06:26.026 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:26.026 Queue depth: 32 00:06:26.026 Allocate depth: 32 00:06:26.026 # threads/core: 2 00:06:26.026 Run time: 1 seconds 00:06:26.026 Verify: Yes 00:06:26.026 00:06:26.026 Running for 1 seconds... 00:06:26.026 00:06:26.026 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:26.026 ------------------------------------------------------------------------------------ 00:06:26.026 0,1 47488/s 87 MiB/s 0 0 00:06:26.026 0,0 47360/s 87 MiB/s 0 0 00:06:26.026 ==================================================================================== 00:06:26.026 Total 94848/s 370 MiB/s 0 0' 00:06:26.026 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.026 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.026 11:29:55 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:26.026 11:29:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:26.026 11:29:55 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.026 11:29:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.026 11:29:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.026 11:29:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.026 11:29:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.026 11:29:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.026 11:29:55 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.026 11:29:55 -- accel/accel.sh@42 -- # jq -r . 00:06:26.026 [2024-07-21 11:29:55.448808] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:26.026 [2024-07-21 11:29:55.448896] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2056895 ] 00:06:26.284 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.284 [2024-07-21 11:29:55.518891] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.284 [2024-07-21 11:29:55.553810] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.284 11:29:55 -- accel/accel.sh@21 -- # val= 00:06:26.284 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.284 11:29:55 -- accel/accel.sh@21 -- # val= 00:06:26.284 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.284 11:29:55 -- accel/accel.sh@21 -- # val= 00:06:26.284 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.284 11:29:55 -- accel/accel.sh@21 -- # val=0x1 00:06:26.284 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.284 11:29:55 -- accel/accel.sh@21 -- # val= 00:06:26.284 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.284 11:29:55 -- accel/accel.sh@21 -- # val= 00:06:26.284 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.284 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.284 11:29:55 -- accel/accel.sh@21 -- # val=decompress 00:06:26.284 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.284 11:29:55 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val= 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val=software 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@23 -- # accel_module=software 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val=32 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 
11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val=32 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val=2 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val=Yes 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val= 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:26.285 11:29:55 -- accel/accel.sh@21 -- # val= 00:06:26.285 11:29:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # IFS=: 00:06:26.285 11:29:55 -- accel/accel.sh@20 -- # read -r var val 00:06:27.659 11:29:56 -- accel/accel.sh@21 -- # val= 00:06:27.659 11:29:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # IFS=: 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # read -r var val 00:06:27.659 11:29:56 -- accel/accel.sh@21 -- # val= 00:06:27.659 11:29:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # IFS=: 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # read -r var val 00:06:27.659 11:29:56 -- accel/accel.sh@21 -- # val= 00:06:27.659 11:29:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # IFS=: 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # read -r var val 00:06:27.659 11:29:56 -- accel/accel.sh@21 -- # val= 00:06:27.659 11:29:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # IFS=: 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # read -r var val 00:06:27.659 11:29:56 -- accel/accel.sh@21 -- # val= 00:06:27.659 11:29:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # IFS=: 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # read -r var val 00:06:27.659 11:29:56 -- accel/accel.sh@21 -- # val= 00:06:27.659 11:29:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # IFS=: 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # read -r var val 00:06:27.659 11:29:56 -- accel/accel.sh@21 -- # val= 00:06:27.659 11:29:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # IFS=: 00:06:27.659 11:29:56 -- accel/accel.sh@20 -- # read -r var val 00:06:27.659 11:29:56 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:27.659 11:29:56 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:27.659 11:29:56 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:27.659 00:06:27.659 real 0m2.592s 00:06:27.659 user 0m2.341s 00:06:27.659 sys 0m0.261s 00:06:27.659 11:29:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.659 11:29:56 -- common/autotest_common.sh@10 -- # 
set +x 00:06:27.659 ************************************ 00:06:27.659 END TEST accel_decomp_mthread 00:06:27.659 ************************************ 00:06:27.659 11:29:56 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:27.659 11:29:56 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:27.659 11:29:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:27.659 11:29:56 -- common/autotest_common.sh@10 -- # set +x 00:06:27.659 ************************************ 00:06:27.659 START TEST accel_deomp_full_mthread 00:06:27.659 ************************************ 00:06:27.659 11:29:56 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:27.659 11:29:56 -- accel/accel.sh@16 -- # local accel_opc 00:06:27.659 11:29:56 -- accel/accel.sh@17 -- # local accel_module 00:06:27.659 11:29:56 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:27.659 11:29:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:27.659 11:29:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:27.659 11:29:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:27.659 11:29:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:27.659 11:29:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:27.659 11:29:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:27.659 11:29:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:27.659 11:29:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:27.659 11:29:56 -- accel/accel.sh@42 -- # jq -r . 00:06:27.659 [2024-07-21 11:29:56.799888] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:27.659 [2024-07-21 11:29:56.799973] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2057090 ] 00:06:27.659 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.659 [2024-07-21 11:29:56.870051] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.659 [2024-07-21 11:29:56.906446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.071 11:29:58 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:29.071 00:06:29.071 SPDK Configuration: 00:06:29.071 Core mask: 0x1 00:06:29.071 00:06:29.071 Accel Perf Configuration: 00:06:29.071 Workload Type: decompress 00:06:29.071 Transfer size: 111250 bytes 00:06:29.071 Vector count 1 00:06:29.071 Module: software 00:06:29.071 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:29.071 Queue depth: 32 00:06:29.071 Allocate depth: 32 00:06:29.071 # threads/core: 2 00:06:29.071 Run time: 1 seconds 00:06:29.071 Verify: Yes 00:06:29.071 00:06:29.071 Running for 1 seconds... 
00:06:29.071 00:06:29.071 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:29.071 ------------------------------------------------------------------------------------ 00:06:29.071 0,1 2976/s 122 MiB/s 0 0 00:06:29.071 0,0 2944/s 121 MiB/s 0 0 00:06:29.071 ==================================================================================== 00:06:29.071 Total 5920/s 628 MiB/s 0 0' 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.071 11:29:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:29.071 11:29:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:29.071 11:29:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:29.071 11:29:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.071 11:29:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.071 11:29:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.071 11:29:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.071 11:29:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.071 11:29:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.071 11:29:58 -- accel/accel.sh@42 -- # jq -r . 00:06:29.071 [2024-07-21 11:29:58.111875] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:29.071 [2024-07-21 11:29:58.111955] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2057301 ] 00:06:29.071 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.071 [2024-07-21 11:29:58.180493] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.071 [2024-07-21 11:29:58.214820] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.071 11:29:58 -- accel/accel.sh@21 -- # val= 00:06:29.071 11:29:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.071 11:29:58 -- accel/accel.sh@21 -- # val= 00:06:29.071 11:29:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.071 11:29:58 -- accel/accel.sh@21 -- # val= 00:06:29.071 11:29:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.071 11:29:58 -- accel/accel.sh@21 -- # val=0x1 00:06:29.071 11:29:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.071 11:29:58 -- accel/accel.sh@21 -- # val= 00:06:29.071 11:29:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.071 11:29:58 -- accel/accel.sh@21 -- # val= 00:06:29.071 11:29:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # IFS=: 00:06:29.071 11:29:58 -- accel/accel.sh@20 -- # read -r var val 00:06:29.071 11:29:58 -- accel/accel.sh@21 -- # val=decompress 
00:06:29.071 11:29:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.071 11:29:58 -- accel/accel.sh@24 -- # accel_opc=decompress [accel.sh xtrace: the case "$var" / IFS=: / read -r loop repeats here for val='111250 bytes', val=software (with the accel.sh@23 accel_module=software assignment), val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib, val=32, val=32, val=2, val='1 seconds', val=Yes, plus the interspersed empty val= entries; timestamps 00:06:29.071 (11:29:58) through 00:06:30.008 (11:29:59)] 00:06:30.008 11:29:59 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # IFS=: 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # read -r var val 00:06:30.008 11:29:59 -- accel/accel.sh@21 -- # val= 00:06:30.008 11:29:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # IFS=: 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # read -r var val 00:06:30.008 11:29:59 -- accel/accel.sh@21 -- # val= 00:06:30.008 11:29:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # IFS=: 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # read -r var val 00:06:30.008 11:29:59 -- accel/accel.sh@21 -- # val= 00:06:30.008 11:29:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # IFS=: 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # read -r var val 00:06:30.008 11:29:59 -- accel/accel.sh@21 -- # val= 00:06:30.008 11:29:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # IFS=: 00:06:30.008 11:29:59 -- accel/accel.sh@20 -- # read -r var val 00:06:30.008 11:29:59 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:30.008 11:29:59 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:30.008 11:29:59 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:30.008 00:06:30.008 real 0m2.629s 00:06:30.008 user 0m2.375s 00:06:30.008 sys 0m0.264s 00:06:30.008 11:29:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.008 11:29:59 -- common/autotest_common.sh@10 -- # set +x 00:06:30.008 ************************************ 00:06:30.008 END TEST accel_deomp_full_mthread 00:06:30.008 ************************************ 00:06:30.268 11:29:59 -- accel/accel.sh@116 -- # [[ n == y ]] 00:06:30.268 11:29:59 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:30.268 11:29:59 -- accel/accel.sh@129 -- # build_accel_config 00:06:30.268 11:29:59 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:06:30.268 11:29:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:30.268 11:29:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:30.268 11:29:59 -- common/autotest_common.sh@10 -- # set +x 00:06:30.268 11:29:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:30.268 11:29:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:30.268 11:29:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:30.268 11:29:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:30.268 11:29:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:30.268 11:29:59 -- accel/accel.sh@42 -- # jq -r . 00:06:30.268 ************************************ 00:06:30.268 START TEST accel_dif_functional_tests 00:06:30.268 ************************************ 00:06:30.268 11:29:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:30.268 [2024-07-21 11:29:59.482583] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
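For reference, the full-buffer mthread run that just ended was driven by the accel_perf invocation recorded above; the flag meanings can be read off the configuration dump it printed ("Run time: 1 seconds" from -t 1, "Workload Type: decompress" from -w decompress, "# threads/core: 2" from -T 2, "Verify: Yes" from -y), and with -o 0 the run reports "Transfer size: 111250 bytes" instead of the 4096 bytes of the earlier runs. Copied verbatim from the log record above, runnable from this workspace's SPDK build tree:

# -c /dev/fd/62 expects the accel JSON config on fd 62, which the test harness
# supplies via process substitution; the rest of the flags are as dumped above.
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
    -c /dev/fd/62 -t 1 -w decompress \
    -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2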
00:06:30.268 [2024-07-21 11:29:59.482674] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2057587 ] 00:06:30.268 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.268 [2024-07-21 11:29:59.552407] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:30.268 [2024-07-21 11:29:59.589107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:30.268 [2024-07-21 11:29:59.589204] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:30.268 [2024-07-21 11:29:59.589205] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.268 00:06:30.268 00:06:30.268 CUnit - A unit testing framework for C - Version 2.1-3 00:06:30.268 http://cunit.sourceforge.net/ 00:06:30.268 00:06:30.268 00:06:30.268 Suite: accel_dif 00:06:30.268 Test: verify: DIF generated, GUARD check ...passed 00:06:30.268 Test: verify: DIF generated, APPTAG check ...passed 00:06:30.268 Test: verify: DIF generated, REFTAG check ...passed 00:06:30.268 Test: verify: DIF not generated, GUARD check ...[2024-07-21 11:29:59.650896] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:30.268 [2024-07-21 11:29:59.650945] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:30.268 passed 00:06:30.268 Test: verify: DIF not generated, APPTAG check ...[2024-07-21 11:29:59.650994] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:30.268 [2024-07-21 11:29:59.651013] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:30.268 passed 00:06:30.268 Test: verify: DIF not generated, REFTAG check ...[2024-07-21 11:29:59.651033] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:30.268 [2024-07-21 11:29:59.651050] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:30.268 passed 00:06:30.268 Test: verify: APPTAG correct, APPTAG check ...passed 00:06:30.268 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-21 11:29:59.651092] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:06:30.268 passed 00:06:30.268 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:06:30.268 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:06:30.268 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:06:30.268 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-21 11:29:59.651190] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:06:30.268 passed 00:06:30.268 Test: generate copy: DIF generated, GUARD check ...passed 00:06:30.268 Test: generate copy: DIF generated, APTTAG check ...passed 00:06:30.268 Test: generate copy: DIF generated, REFTAG check ...passed 00:06:30.268 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:06:30.268 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:06:30.268 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:06:30.268 Test: generate copy: iovecs-len validate ...[2024-07-21 11:29:59.651365] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:06:30.268 passed 00:06:30.268 Test: generate copy: buffer alignment validate ...passed 00:06:30.268 00:06:30.268 Run Summary: Type Total Ran Passed Failed Inactive 00:06:30.268 suites 1 1 n/a 0 0 00:06:30.268 tests 20 20 20 0 0 00:06:30.268 asserts 204 204 204 0 n/a 00:06:30.268 00:06:30.268 Elapsed time = 0.000 seconds 00:06:30.528 00:06:30.528 real 0m0.345s 00:06:30.528 user 0m0.529s 00:06:30.528 sys 0m0.154s 00:06:30.528 11:29:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.528 11:29:59 -- common/autotest_common.sh@10 -- # set +x 00:06:30.528 ************************************ 00:06:30.528 END TEST accel_dif_functional_tests 00:06:30.528 ************************************ 00:06:30.528 00:06:30.528 real 0m55.349s 00:06:30.528 user 1m2.834s 00:06:30.528 sys 0m7.213s 00:06:30.528 11:29:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.528 11:29:59 -- common/autotest_common.sh@10 -- # set +x 00:06:30.528 ************************************ 00:06:30.528 END TEST accel 00:06:30.528 ************************************ 00:06:30.528 11:29:59 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:30.528 11:29:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:30.528 11:29:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:30.528 11:29:59 -- common/autotest_common.sh@10 -- # set +x 00:06:30.528 ************************************ 00:06:30.528 START TEST accel_rpc 00:06:30.528 ************************************ 00:06:30.528 11:29:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:30.786 * Looking for test storage... 00:06:30.786 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2057788 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@15 -- # waitforlisten 2057788 00:06:30.786 11:30:00 -- common/autotest_common.sh@819 -- # '[' -z 2057788 ']' 00:06:30.786 11:30:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.786 11:30:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:30.786 11:30:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.786 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.786 11:30:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:30.786 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:30.786 [2024-07-21 11:30:00.026264] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
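The START TEST / END TEST banners and the real/user/sys lines that frame every test in this log come from the run_test wrapper in autotest_common.sh; a simplified sketch of its shape, reconstructed from the banners above rather than quoted from the helper itself:

# Assumed shape of the run_test wrapper (sketch, not the verbatim SPDK helper).
run_test() {
    local test_name=$1; shift
    echo "************************************"
    echo "START TEST $test_name"
    echo "************************************"
    time "$@"                    # produces the real/user/sys lines seen above
    local rc=$?
    echo "************************************"
    echo "END TEST $test_name"
    echo "************************************"
    return $rc
}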
00:06:30.786 [2024-07-21 11:30:00.026334] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2057788 ] 00:06:30.786 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.786 [2024-07-21 11:30:00.095393] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.786 [2024-07-21 11:30:00.132317] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:30.786 [2024-07-21 11:30:00.132437] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.786 11:30:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:30.786 11:30:00 -- common/autotest_common.sh@852 -- # return 0 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:06:30.786 11:30:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:30.786 11:30:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:30.786 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:30.786 ************************************ 00:06:30.786 START TEST accel_assign_opcode 00:06:30.786 ************************************ 00:06:30.786 11:30:00 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:06:30.786 11:30:00 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:06:30.786 11:30:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:30.786 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:30.787 [2024-07-21 11:30:00.188931] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:06:30.787 11:30:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:30.787 11:30:00 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:06:30.787 11:30:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:30.787 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:30.787 [2024-07-21 11:30:00.196939] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:06:30.787 11:30:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:30.787 11:30:00 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:06:30.787 11:30:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:30.787 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.045 11:30:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:31.045 11:30:00 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:06:31.045 11:30:00 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:06:31.045 11:30:00 -- accel/accel_rpc.sh@42 -- # grep software 00:06:31.045 11:30:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:31.045 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.045 11:30:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:31.045 software 00:06:31.045 00:06:31.045 real 0m0.202s 00:06:31.045 user 0m0.029s 00:06:31.045 sys 0m0.008s 00:06:31.045 11:30:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.045 11:30:00 -- common/autotest_common.sh@10 -- # set +x 
00:06:31.045 ************************************ 00:06:31.045 END TEST accel_assign_opcode 00:06:31.045 ************************************ 00:06:31.045 11:30:00 -- accel/accel_rpc.sh@55 -- # killprocess 2057788 00:06:31.045 11:30:00 -- common/autotest_common.sh@926 -- # '[' -z 2057788 ']' 00:06:31.045 11:30:00 -- common/autotest_common.sh@930 -- # kill -0 2057788 00:06:31.045 11:30:00 -- common/autotest_common.sh@931 -- # uname 00:06:31.045 11:30:00 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:31.045 11:30:00 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2057788 00:06:31.305 11:30:00 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:31.305 11:30:00 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:31.305 11:30:00 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2057788' 00:06:31.305 killing process with pid 2057788 00:06:31.305 11:30:00 -- common/autotest_common.sh@945 -- # kill 2057788 00:06:31.305 11:30:00 -- common/autotest_common.sh@950 -- # wait 2057788 00:06:31.565 00:06:31.565 real 0m0.871s 00:06:31.565 user 0m0.765s 00:06:31.565 sys 0m0.421s 00:06:31.565 11:30:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.565 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.565 ************************************ 00:06:31.565 END TEST accel_rpc 00:06:31.565 ************************************ 00:06:31.565 11:30:00 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:31.565 11:30:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:31.565 11:30:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.565 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.565 ************************************ 00:06:31.565 START TEST app_cmdline 00:06:31.565 ************************************ 00:06:31.565 11:30:00 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:31.565 * Looking for test storage... 00:06:31.565 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:31.565 11:30:00 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:06:31.565 11:30:00 -- app/cmdline.sh@17 -- # spdk_tgt_pid=2058046 00:06:31.565 11:30:00 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:06:31.565 11:30:00 -- app/cmdline.sh@18 -- # waitforlisten 2058046 00:06:31.565 11:30:00 -- common/autotest_common.sh@819 -- # '[' -z 2058046 ']' 00:06:31.565 11:30:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:31.565 11:30:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:31.565 11:30:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:31.565 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:31.565 11:30:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:31.565 11:30:00 -- common/autotest_common.sh@10 -- # set +x 00:06:31.565 [2024-07-21 11:30:00.930928] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:31.565 [2024-07-21 11:30:00.931009] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2058046 ] 00:06:31.565 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.824 [2024-07-21 11:30:01.000845] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.824 [2024-07-21 11:30:01.038360] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:31.824 [2024-07-21 11:30:01.038522] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.391 11:30:01 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:32.391 11:30:01 -- common/autotest_common.sh@852 -- # return 0 00:06:32.391 11:30:01 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:06:32.649 { 00:06:32.649 "version": "SPDK v24.01.1-pre git sha1 4b94202c6", 00:06:32.649 "fields": { 00:06:32.649 "major": 24, 00:06:32.649 "minor": 1, 00:06:32.649 "patch": 1, 00:06:32.649 "suffix": "-pre", 00:06:32.649 "commit": "4b94202c6" 00:06:32.649 } 00:06:32.649 } 00:06:32.649 11:30:01 -- app/cmdline.sh@22 -- # expected_methods=() 00:06:32.649 11:30:01 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:06:32.649 11:30:01 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:06:32.649 11:30:01 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:06:32.649 11:30:01 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:06:32.649 11:30:01 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:32.649 11:30:01 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:06:32.649 11:30:01 -- common/autotest_common.sh@10 -- # set +x 00:06:32.649 11:30:01 -- app/cmdline.sh@26 -- # sort 00:06:32.649 11:30:01 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:32.649 11:30:01 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:06:32.649 11:30:01 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:06:32.649 11:30:01 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:32.649 11:30:01 -- common/autotest_common.sh@640 -- # local es=0 00:06:32.649 11:30:01 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:32.649 11:30:01 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:32.649 11:30:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:32.649 11:30:01 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:32.649 11:30:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:32.649 11:30:01 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:32.649 11:30:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:32.649 11:30:01 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:32.649 11:30:01 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:06:32.649 11:30:01 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:32.906 request: 00:06:32.906 { 00:06:32.906 "method": "env_dpdk_get_mem_stats", 00:06:32.906 "req_id": 1 00:06:32.906 } 00:06:32.907 Got JSON-RPC error response 00:06:32.907 response: 00:06:32.907 { 00:06:32.907 "code": -32601, 00:06:32.907 "message": "Method not found" 00:06:32.907 } 00:06:32.907 11:30:02 -- common/autotest_common.sh@643 -- # es=1 00:06:32.907 11:30:02 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:32.907 11:30:02 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:32.907 11:30:02 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:32.907 11:30:02 -- app/cmdline.sh@1 -- # killprocess 2058046 00:06:32.907 11:30:02 -- common/autotest_common.sh@926 -- # '[' -z 2058046 ']' 00:06:32.907 11:30:02 -- common/autotest_common.sh@930 -- # kill -0 2058046 00:06:32.907 11:30:02 -- common/autotest_common.sh@931 -- # uname 00:06:32.907 11:30:02 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:32.907 11:30:02 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2058046 00:06:32.907 11:30:02 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:32.907 11:30:02 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:32.907 11:30:02 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2058046' 00:06:32.907 killing process with pid 2058046 00:06:32.907 11:30:02 -- common/autotest_common.sh@945 -- # kill 2058046 00:06:32.907 11:30:02 -- common/autotest_common.sh@950 -- # wait 2058046 00:06:33.165 00:06:33.165 real 0m1.645s 00:06:33.165 user 0m1.903s 00:06:33.165 sys 0m0.485s 00:06:33.165 11:30:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.165 11:30:02 -- common/autotest_common.sh@10 -- # set +x 00:06:33.165 ************************************ 00:06:33.165 END TEST app_cmdline 00:06:33.165 ************************************ 00:06:33.165 11:30:02 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:33.165 11:30:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:33.165 11:30:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:33.165 11:30:02 -- common/autotest_common.sh@10 -- # set +x 00:06:33.165 ************************************ 00:06:33.165 START TEST version 00:06:33.165 ************************************ 00:06:33.165 11:30:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:33.165 * Looking for test storage... 
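The app_cmdline failure path that just ran is worth spelling out: spdk_tgt was launched with --rpcs-allowed spdk_get_version,rpc_get_methods (see its launch line above), so the two allowed methods succeed while anything else is rejected with JSON-RPC error -32601, exactly as the request/response pair above shows. A minimal way to reproduce it against such a target (rpc.py path taken relative to the spdk checkout):

# Allowed by the --rpcs-allowed filter: returns the version object seen above.
./scripts/rpc.py spdk_get_version
# Outside the allowlist: fails with {"code": -32601, "message": "Method not found"}.
./scripts/rpc.py env_dpdk_get_mem_stats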
00:06:33.165 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:33.165 11:30:02 -- app/version.sh@17 -- # get_header_version major 00:06:33.165 11:30:02 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:33.165 11:30:02 -- app/version.sh@14 -- # cut -f2 00:06:33.165 11:30:02 -- app/version.sh@14 -- # tr -d '"' 00:06:33.165 11:30:02 -- app/version.sh@17 -- # major=24 00:06:33.423 11:30:02 -- app/version.sh@18 -- # get_header_version minor 00:06:33.423 11:30:02 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:33.423 11:30:02 -- app/version.sh@14 -- # cut -f2 00:06:33.423 11:30:02 -- app/version.sh@14 -- # tr -d '"' 00:06:33.423 11:30:02 -- app/version.sh@18 -- # minor=1 00:06:33.423 11:30:02 -- app/version.sh@19 -- # get_header_version patch 00:06:33.423 11:30:02 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:33.423 11:30:02 -- app/version.sh@14 -- # cut -f2 00:06:33.423 11:30:02 -- app/version.sh@14 -- # tr -d '"' 00:06:33.423 11:30:02 -- app/version.sh@19 -- # patch=1 00:06:33.423 11:30:02 -- app/version.sh@20 -- # get_header_version suffix 00:06:33.423 11:30:02 -- app/version.sh@14 -- # cut -f2 00:06:33.423 11:30:02 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:33.423 11:30:02 -- app/version.sh@14 -- # tr -d '"' 00:06:33.423 11:30:02 -- app/version.sh@20 -- # suffix=-pre 00:06:33.423 11:30:02 -- app/version.sh@22 -- # version=24.1 00:06:33.423 11:30:02 -- app/version.sh@25 -- # (( patch != 0 )) 00:06:33.423 11:30:02 -- app/version.sh@25 -- # version=24.1.1 00:06:33.423 11:30:02 -- app/version.sh@28 -- # version=24.1.1rc0 00:06:33.423 11:30:02 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:33.423 11:30:02 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:06:33.423 11:30:02 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:06:33.423 11:30:02 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:06:33.423 00:06:33.423 real 0m0.147s 00:06:33.423 user 0m0.064s 00:06:33.423 sys 0m0.127s 00:06:33.423 11:30:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.423 11:30:02 -- common/autotest_common.sh@10 -- # set +x 00:06:33.423 ************************************ 00:06:33.423 END TEST version 00:06:33.423 ************************************ 00:06:33.423 11:30:02 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@204 -- # uname -s 00:06:33.423 11:30:02 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:06:33.423 11:30:02 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:06:33.423 11:30:02 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:06:33.423 11:30:02 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@268 -- # timing_exit lib 
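The version check that just completed reduces to a small amount of string assembly; a minimal sketch using the values version.sh extracted above (major=24, minor=1, patch=1, suffix=-pre), mirroring rather than quoting app/version.sh:

#!/usr/bin/env bash
# Field values as cut from include/spdk/version.h in the run above.
major=24; minor=1; patch=1; suffix=-pre
version="$major.$minor"
(( patch != 0 )) && version="$major.$minor.$patch"
[[ $suffix == -pre ]] && version="${version}rc0"
echo "$version"    # 24.1.1rc0, matching py_version from the Python package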
00:06:33.423 11:30:02 -- common/autotest_common.sh@718 -- # xtrace_disable 00:06:33.423 11:30:02 -- common/autotest_common.sh@10 -- # set +x 00:06:33.423 11:30:02 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:06:33.423 11:30:02 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:06:33.423 11:30:02 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:06:33.423 11:30:02 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:06:33.423 11:30:02 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:33.423 11:30:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:33.423 11:30:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:33.423 11:30:02 -- common/autotest_common.sh@10 -- # set +x 00:06:33.423 ************************************ 00:06:33.423 START TEST llvm_fuzz 00:06:33.423 ************************************ 00:06:33.423 11:30:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:33.423 * Looking for test storage... 
00:06:33.683 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:06:33.683 11:30:02 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:06:33.683 11:30:02 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:06:33.683 11:30:02 -- common/autotest_common.sh@538 -- # fuzzers=() 00:06:33.683 11:30:02 -- common/autotest_common.sh@538 -- # local fuzzers 00:06:33.683 11:30:02 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:06:33.683 11:30:02 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:06:33.683 11:30:02 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:06:33.683 11:30:02 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:06:33.683 11:30:02 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:06:33.683 11:30:02 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:06:33.683 11:30:02 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:06:33.683 11:30:02 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:33.683 11:30:02 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:33.683 11:30:02 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:33.683 11:30:02 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:33.683 11:30:02 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:06:33.683 11:30:02 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:06:33.683 11:30:02 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:33.683 11:30:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:33.683 11:30:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:33.683 11:30:02 -- common/autotest_common.sh@10 -- # set +x 00:06:33.683 ************************************ 00:06:33.683 START TEST nvmf_fuzz 00:06:33.683 ************************************ 00:06:33.683 11:30:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:33.683 * Looking for test storage... 
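The llvm.sh loop above walks the fuzzer list (which expands to "common.sh llvm-gcov.sh nvmf vfio") and dispatches only the real targets through run_test; a sketch of that dispatch, where the vfio branch and the $rootdir variable are assumed by analogy with the nvmf line recorded above:

for fuzzer in "${fuzzers[@]}"; do
    case "$fuzzer" in
        nvmf) run_test nvmf_fuzz "$rootdir/test/fuzz/llvm/nvmf/run.sh" ;;
        vfio) run_test vfio_fuzz "$rootdir/test/fuzz/llvm/vfio/run.sh" ;;  # assumed branch
        *) ;;  # helper scripts like common.sh and llvm-gcov.sh fall through
    esac
done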
00:06:33.683 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:33.683 11:30:02 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:06:33.683 11:30:02 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:06:33.683 11:30:02 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:06:33.683 11:30:02 -- common/autotest_common.sh@34 -- # set -e 00:06:33.683 11:30:02 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:06:33.683 11:30:02 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:06:33.683 11:30:02 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:06:33.683 11:30:02 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:06:33.683 11:30:02 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:06:33.683 11:30:02 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:06:33.683 11:30:02 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:06:33.683 11:30:02 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:06:33.683 11:30:02 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:06:33.683 11:30:02 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:06:33.683 11:30:02 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:06:33.683 11:30:02 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:06:33.683 11:30:02 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:06:33.683 11:30:02 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:06:33.683 11:30:02 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:06:33.683 11:30:02 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:06:33.683 11:30:02 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:06:33.683 11:30:02 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:06:33.683 11:30:02 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:06:33.683 11:30:02 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:06:33.683 11:30:02 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:06:33.683 11:30:02 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:06:33.683 11:30:02 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:33.683 11:30:02 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:06:33.683 11:30:02 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:06:33.683 11:30:02 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:06:33.683 11:30:02 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:06:33.683 11:30:02 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:06:33.683 11:30:02 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:06:33.683 11:30:02 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:06:33.683 11:30:02 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:06:33.683 11:30:02 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:06:33.683 11:30:02 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:06:33.683 11:30:02 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:06:33.683 11:30:02 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:06:33.683 11:30:02 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:06:33.683 11:30:02 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:06:33.683 11:30:02 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:06:33.683 11:30:02 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:06:33.683 11:30:02 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:06:33.683 11:30:02 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:06:33.683 11:30:02 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:06:33.683 11:30:02 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:06:33.683 11:30:02 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:06:33.683 11:30:02 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:06:33.683 11:30:02 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:06:33.683 11:30:02 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:06:33.683 11:30:02 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:06:33.683 11:30:02 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:06:33.683 11:30:02 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:06:33.683 11:30:02 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:06:33.683 11:30:02 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:06:33.683 11:30:02 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:06:33.683 11:30:02 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:06:33.683 11:30:02 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:06:33.683 11:30:02 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:06:33.683 11:30:02 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:06:33.683 11:30:02 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:06:33.683 11:30:02 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:06:33.683 11:30:02 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:06:33.683 11:30:02 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:06:33.683 11:30:02 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:06:33.683 11:30:02 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:06:33.683 11:30:02 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:06:33.683 11:30:02 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:06:33.683 11:30:02 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:06:33.683 11:30:02 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:06:33.683 11:30:02 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:06:33.683 11:30:02 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:06:33.683 11:30:02 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:06:33.683 11:30:02 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:06:33.683 11:30:02 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:06:33.683 11:30:02 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:06:33.683 11:30:02 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:06:33.683 11:30:02 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:06:33.683 11:30:02 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:06:33.683 11:30:02 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:06:33.683 11:30:02 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:06:33.683 11:30:02 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:06:33.683 11:30:02 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:06:33.683 11:30:02 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:06:33.683 11:30:02 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:06:33.683 
11:30:02 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:06:33.683 11:30:02 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:33.683 11:30:02 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:33.683 11:30:02 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:33.683 11:30:03 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:33.683 11:30:03 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:33.683 11:30:03 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:33.683 11:30:03 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:33.683 11:30:03 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:33.683 11:30:03 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:06:33.683 11:30:03 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:06:33.683 11:30:03 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:06:33.683 11:30:03 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:06:33.683 11:30:03 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:06:33.683 11:30:03 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:06:33.683 11:30:03 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:06:33.683 11:30:03 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:06:33.683 #define SPDK_CONFIG_H 00:06:33.683 #define SPDK_CONFIG_APPS 1 00:06:33.683 #define SPDK_CONFIG_ARCH native 00:06:33.683 #undef SPDK_CONFIG_ASAN 00:06:33.683 #undef SPDK_CONFIG_AVAHI 00:06:33.683 #undef SPDK_CONFIG_CET 00:06:33.683 #define SPDK_CONFIG_COVERAGE 1 00:06:33.683 #define SPDK_CONFIG_CROSS_PREFIX 00:06:33.683 #undef SPDK_CONFIG_CRYPTO 00:06:33.683 #undef SPDK_CONFIG_CRYPTO_MLX5 00:06:33.683 #undef SPDK_CONFIG_CUSTOMOCF 00:06:33.683 #undef SPDK_CONFIG_DAOS 00:06:33.683 #define SPDK_CONFIG_DAOS_DIR 00:06:33.683 #define SPDK_CONFIG_DEBUG 1 00:06:33.683 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:06:33.683 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:06:33.683 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:06:33.683 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:06:33.683 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:06:33.683 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:33.683 #define SPDK_CONFIG_EXAMPLES 1 00:06:33.683 #undef SPDK_CONFIG_FC 00:06:33.683 #define SPDK_CONFIG_FC_PATH 00:06:33.683 #define SPDK_CONFIG_FIO_PLUGIN 1 00:06:33.683 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:06:33.683 #undef SPDK_CONFIG_FUSE 00:06:33.683 #define SPDK_CONFIG_FUZZER 1 00:06:33.683 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:06:33.683 #undef SPDK_CONFIG_GOLANG 00:06:33.683 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:06:33.683 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:06:33.683 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:06:33.683 #undef SPDK_CONFIG_HAVE_LIBBSD 00:06:33.683 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:06:33.683 #define SPDK_CONFIG_IDXD 1 00:06:33.683 #define SPDK_CONFIG_IDXD_KERNEL 1 00:06:33.683 #undef SPDK_CONFIG_IPSEC_MB 00:06:33.683 #define SPDK_CONFIG_IPSEC_MB_DIR 00:06:33.683 #define SPDK_CONFIG_ISAL 1 00:06:33.683 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:06:33.683 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:06:33.683 #define SPDK_CONFIG_LIBDIR 00:06:33.683 #undef SPDK_CONFIG_LTO 00:06:33.683 #define SPDK_CONFIG_MAX_LCORES 00:06:33.683 #define SPDK_CONFIG_NVME_CUSE 1 00:06:33.683 #undef SPDK_CONFIG_OCF 00:06:33.683 #define SPDK_CONFIG_OCF_PATH 00:06:33.683 #define SPDK_CONFIG_OPENSSL_PATH 00:06:33.683 #undef SPDK_CONFIG_PGO_CAPTURE 00:06:33.683 #undef SPDK_CONFIG_PGO_USE 00:06:33.683 #define SPDK_CONFIG_PREFIX /usr/local 00:06:33.683 #undef SPDK_CONFIG_RAID5F 00:06:33.683 #undef SPDK_CONFIG_RBD 00:06:33.683 #define SPDK_CONFIG_RDMA 1 00:06:33.683 #define SPDK_CONFIG_RDMA_PROV verbs 00:06:33.683 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:06:33.683 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:06:33.683 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:06:33.683 #undef SPDK_CONFIG_SHARED 00:06:33.683 #undef SPDK_CONFIG_SMA 00:06:33.683 #define SPDK_CONFIG_TESTS 1 00:06:33.683 #undef SPDK_CONFIG_TSAN 00:06:33.683 #define SPDK_CONFIG_UBLK 1 00:06:33.683 #define SPDK_CONFIG_UBSAN 1 00:06:33.683 #undef SPDK_CONFIG_UNIT_TESTS 00:06:33.683 #undef SPDK_CONFIG_URING 00:06:33.683 #define SPDK_CONFIG_URING_PATH 00:06:33.683 #undef SPDK_CONFIG_URING_ZNS 00:06:33.683 #undef SPDK_CONFIG_USDT 00:06:33.683 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:06:33.683 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:06:33.683 #define SPDK_CONFIG_VFIO_USER 1 00:06:33.683 #define SPDK_CONFIG_VFIO_USER_DIR 00:06:33.683 #define SPDK_CONFIG_VHOST 1 00:06:33.683 #define SPDK_CONFIG_VIRTIO 1 00:06:33.683 #undef SPDK_CONFIG_VTUNE 00:06:33.683 #define SPDK_CONFIG_VTUNE_DIR 00:06:33.683 #define SPDK_CONFIG_WERROR 1 00:06:33.683 #define SPDK_CONFIG_WPDK_DIR 00:06:33.683 #undef SPDK_CONFIG_XNVME 00:06:33.683 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:06:33.683 11:30:03 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:06:33.683 11:30:03 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:33.683 11:30:03 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:33.683 11:30:03 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:33.684 11:30:03 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:33.684 11:30:03 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:33.684 11:30:03 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:33.684 11:30:03 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:33.684 11:30:03 -- paths/export.sh@5 -- # export PATH 00:06:33.684 11:30:03 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:33.684 11:30:03 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:33.684 11:30:03 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:33.684 11:30:03 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:33.684 11:30:03 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:33.684 11:30:03 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:06:33.684 11:30:03 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:33.684 11:30:03 -- pm/common@16 -- # TEST_TAG=N/A 00:06:33.684 11:30:03 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:06:33.684 11:30:03 -- common/autotest_common.sh@52 -- # : 1 00:06:33.684 11:30:03 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:06:33.684 11:30:03 -- common/autotest_common.sh@56 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:06:33.684 11:30:03 -- common/autotest_common.sh@58 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:06:33.684 11:30:03 -- common/autotest_common.sh@60 -- # : 1 00:06:33.684 11:30:03 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:06:33.684 11:30:03 -- common/autotest_common.sh@62 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:06:33.684 11:30:03 -- common/autotest_common.sh@64 -- # : 00:06:33.684 11:30:03 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:06:33.684 11:30:03 -- common/autotest_common.sh@66 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:06:33.684 11:30:03 
-- common/autotest_common.sh@68 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:06:33.684 11:30:03 -- common/autotest_common.sh@70 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:06:33.684 11:30:03 -- common/autotest_common.sh@72 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:06:33.684 11:30:03 -- common/autotest_common.sh@74 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:06:33.684 11:30:03 -- common/autotest_common.sh@76 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:06:33.684 11:30:03 -- common/autotest_common.sh@78 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:06:33.684 11:30:03 -- common/autotest_common.sh@80 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:06:33.684 11:30:03 -- common/autotest_common.sh@82 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:06:33.684 11:30:03 -- common/autotest_common.sh@84 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:06:33.684 11:30:03 -- common/autotest_common.sh@86 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:06:33.684 11:30:03 -- common/autotest_common.sh@88 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:06:33.684 11:30:03 -- common/autotest_common.sh@90 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:06:33.684 11:30:03 -- common/autotest_common.sh@92 -- # : 1 00:06:33.684 11:30:03 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:06:33.684 11:30:03 -- common/autotest_common.sh@94 -- # : 1 00:06:33.684 11:30:03 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:06:33.684 11:30:03 -- common/autotest_common.sh@96 -- # : rdma 00:06:33.684 11:30:03 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:06:33.684 11:30:03 -- common/autotest_common.sh@98 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:06:33.684 11:30:03 -- common/autotest_common.sh@100 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:06:33.684 11:30:03 -- common/autotest_common.sh@102 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:06:33.684 11:30:03 -- common/autotest_common.sh@104 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:06:33.684 11:30:03 -- common/autotest_common.sh@106 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:06:33.684 11:30:03 -- common/autotest_common.sh@108 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:06:33.684 11:30:03 -- common/autotest_common.sh@110 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:06:33.684 11:30:03 -- common/autotest_common.sh@112 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:06:33.684 11:30:03 -- common/autotest_common.sh@114 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:06:33.684 
11:30:03 -- common/autotest_common.sh@116 -- # : 1 00:06:33.684 11:30:03 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:06:33.684 11:30:03 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:06:33.684 11:30:03 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:06:33.684 11:30:03 -- common/autotest_common.sh@120 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:06:33.684 11:30:03 -- common/autotest_common.sh@122 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:06:33.684 11:30:03 -- common/autotest_common.sh@124 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:06:33.684 11:30:03 -- common/autotest_common.sh@126 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:06:33.684 11:30:03 -- common/autotest_common.sh@128 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:06:33.684 11:30:03 -- common/autotest_common.sh@130 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:06:33.684 11:30:03 -- common/autotest_common.sh@132 -- # : v23.11 00:06:33.684 11:30:03 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:06:33.684 11:30:03 -- common/autotest_common.sh@134 -- # : true 00:06:33.684 11:30:03 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:06:33.684 11:30:03 -- common/autotest_common.sh@136 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:06:33.684 11:30:03 -- common/autotest_common.sh@138 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:06:33.684 11:30:03 -- common/autotest_common.sh@140 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:06:33.684 11:30:03 -- common/autotest_common.sh@142 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:06:33.684 11:30:03 -- common/autotest_common.sh@144 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:06:33.684 11:30:03 -- common/autotest_common.sh@146 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:06:33.684 11:30:03 -- common/autotest_common.sh@148 -- # : 00:06:33.684 11:30:03 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:06:33.684 11:30:03 -- common/autotest_common.sh@150 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:06:33.684 11:30:03 -- common/autotest_common.sh@152 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:06:33.684 11:30:03 -- common/autotest_common.sh@154 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:06:33.684 11:30:03 -- common/autotest_common.sh@156 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:06:33.684 11:30:03 -- common/autotest_common.sh@158 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:06:33.684 11:30:03 -- common/autotest_common.sh@160 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:06:33.684 11:30:03 -- common/autotest_common.sh@163 -- # : 00:06:33.684 11:30:03 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:06:33.684 11:30:03 -- common/autotest_common.sh@165 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:06:33.684 11:30:03 -- common/autotest_common.sh@167 -- # : 0 00:06:33.684 11:30:03 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:06:33.684 11:30:03 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:33.684 11:30:03 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:33.684 11:30:03 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:06:33.684 11:30:03 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:06:33.684 11:30:03 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:33.684 11:30:03 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:33.684 11:30:03 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:33.684 11:30:03 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:33.684 11:30:03 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:06:33.684 11:30:03 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:06:33.684 11:30:03 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:33.684 11:30:03 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:33.684 11:30:03 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:06:33.684 11:30:03 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:06:33.684 11:30:03 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:33.684 11:30:03 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:33.684 11:30:03 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:33.684 11:30:03 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:33.684 11:30:03 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:06:33.684 11:30:03 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:06:33.684 11:30:03 -- common/autotest_common.sh@196 -- # cat 00:06:33.684 11:30:03 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:06:33.684 11:30:03 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:33.684 11:30:03 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:33.684 11:30:03 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:33.684 11:30:03 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:33.684 11:30:03 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:06:33.684 11:30:03 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:06:33.684 11:30:03 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:33.684 11:30:03 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:33.684 11:30:03 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:33.684 11:30:03 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:33.684 11:30:03 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:33.684 11:30:03 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:33.684 11:30:03 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:33.684 11:30:03 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:33.684 11:30:03 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:33.684 11:30:03 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:33.684 11:30:03 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:33.684 11:30:03 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:33.684 11:30:03 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:06:33.684 11:30:03 -- common/autotest_common.sh@249 -- # export valgrind= 00:06:33.684 11:30:03 -- common/autotest_common.sh@249 -- # valgrind= 00:06:33.684 11:30:03 -- common/autotest_common.sh@255 -- # uname -s 00:06:33.684 11:30:03 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:06:33.684 11:30:03 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:06:33.684 11:30:03 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:06:33.684 11:30:03 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:06:33.684 11:30:03 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:06:33.684 11:30:03 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:06:33.684 11:30:03 -- common/autotest_common.sh@265 -- # MAKE=make 00:06:33.684 11:30:03 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:06:33.684 11:30:03 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:06:33.684 11:30:03 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:06:33.684 11:30:03 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:06:33.684 11:30:03 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:06:33.684 11:30:03 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:06:33.684 11:30:03 -- common/autotest_common.sh@309 -- # [[ -z 2058592 ]] 00:06:33.684 11:30:03 -- common/autotest_common.sh@309 -- # kill -0 2058592 00:06:33.684 11:30:03 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:06:33.684 11:30:03 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:06:33.684 11:30:03 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:06:33.684 11:30:03 -- common/autotest_common.sh@322 -- # local mount target_dir 00:06:33.684 11:30:03 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:06:33.684 11:30:03 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:06:33.684 11:30:03 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:06:33.684 11:30:03 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:06:33.684 11:30:03 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.wUGfvz 00:06:33.684 11:30:03 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:06:33.684 11:30:03 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:06:33.684 11:30:03 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:06:33.684 11:30:03 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.wUGfvz/tests/nvmf /tmp/spdk.wUGfvz 00:06:33.684 11:30:03 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:06:33.685 11:30:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:33.685 11:30:03 -- common/autotest_common.sh@318 -- # df -T 00:06:33.685 11:30:03 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:06:33.685 11:30:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:06:33.685 11:30:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:06:33.685 11:30:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:06:33.685 11:30:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:06:33.685 11:30:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:06:33.685 11:30:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:06:33.685 11:30:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:06:33.685 11:30:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:06:33.685 11:30:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=47421149184 00:06:33.685 11:30:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:06:33.685 11:30:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=14321168384 00:06:33.685 11:30:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:06:33.685 11:30:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:06:33.685 11:30:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:06:33.685 11:30:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:06:33.685 11:30:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:06:33.685 11:30:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:06:33.943 11:30:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342489088 00:06:33.943 11:30:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:06:33.943 11:30:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=5976064 00:06:33.943 11:30:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:33.943 11:30:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:06:33.943 11:30:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:06:33.943 11:30:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868680704 00:06:33.943 11:30:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:06:33.943 11:30:03 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=2478080 00:06:33.943 11:30:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:33.943 11:30:03 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:06:33.943 11:30:03 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:06:33.943 11:30:03 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:06:33.943 11:30:03 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:06:33.943 11:30:03 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:06:33.943 11:30:03 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:06:33.943 11:30:03 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:06:33.943 * Looking for test storage... 00:06:33.943 11:30:03 -- common/autotest_common.sh@359 -- # local target_space new_size 00:06:33.943 11:30:03 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:06:33.943 11:30:03 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:33.943 11:30:03 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:06:33.943 11:30:03 -- common/autotest_common.sh@363 -- # mount=/ 00:06:33.943 11:30:03 -- common/autotest_common.sh@365 -- # target_space=47421149184 00:06:33.943 11:30:03 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:06:33.943 11:30:03 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:06:33.943 11:30:03 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:06:33.943 11:30:03 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:06:33.943 11:30:03 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:06:33.943 11:30:03 -- common/autotest_common.sh@372 -- # new_size=16535760896 00:06:33.943 11:30:03 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:06:33.943 11:30:03 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:33.943 11:30:03 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:33.943 11:30:03 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:33.943 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:33.943 11:30:03 -- common/autotest_common.sh@380 -- # return 0 00:06:33.943 11:30:03 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:06:33.943 11:30:03 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:06:33.943 11:30:03 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:06:33.943 11:30:03 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:06:33.943 11:30:03 -- common/autotest_common.sh@1672 -- # true 00:06:33.943 11:30:03 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:06:33.943 11:30:03 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:06:33.943 11:30:03 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:06:33.943 11:30:03 -- common/autotest_common.sh@27 -- # exec 00:06:33.943 11:30:03 -- common/autotest_common.sh@29 -- # exec 00:06:33.943 11:30:03 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:06:33.943 11:30:03 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:06:33.943 11:30:03 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:06:33.943 11:30:03 -- common/autotest_common.sh@18 -- # set -x 00:06:33.943 11:30:03 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:06:33.943 11:30:03 -- ../common.sh@8 -- # pids=() 00:06:33.943 11:30:03 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:33.943 11:30:03 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:33.943 11:30:03 -- nvmf/run.sh@56 -- # fuzz_num=25 00:06:33.943 11:30:03 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:06:33.943 11:30:03 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:06:33.943 11:30:03 -- nvmf/run.sh@61 -- # mem_size=512 00:06:33.943 11:30:03 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:06:33.943 11:30:03 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:06:33.943 11:30:03 -- ../common.sh@69 -- # local fuzz_num=25 00:06:33.943 11:30:03 -- ../common.sh@70 -- # local time=1 00:06:33.943 11:30:03 -- ../common.sh@72 -- # (( i = 0 )) 00:06:33.943 11:30:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:33.943 11:30:03 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:06:33.943 11:30:03 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:06:33.943 11:30:03 -- nvmf/run.sh@24 -- # local timen=1 00:06:33.943 11:30:03 -- nvmf/run.sh@25 -- # local core=0x1 00:06:33.943 11:30:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:33.943 11:30:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:06:33.943 11:30:03 -- nvmf/run.sh@29 -- # printf %02d 0 00:06:33.943 11:30:03 -- nvmf/run.sh@29 -- # port=4400 00:06:33.943 11:30:03 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:33.943 11:30:03 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:06:33.943 11:30:03 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:33.943 11:30:03 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:06:33.943 [2024-07-21 11:30:03.182754] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
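The invocation traced above shows the per-fuzzer setup in nvmf/run.sh: the fuzzer index is zero-padded and appended to "44" to pick a dedicated TCP port, the stock fuzz_json.conf is rewritten so its trsvcid matches that port, and llvm_nvme_fuzz is launched against the resulting transport ID. A minimal bash sketch of that derivation — the helper name derive_fuzzer_args and the SPDK_DIR variable are illustrative stand-ins, not names from the log — could look like:

  derive_fuzzer_args() {
    local fuzzer_type=$1
    local port="44$(printf %02d "$fuzzer_type")"   # fuzzer 0 -> 4400, 1 -> 4401, ...
    # Rewrite the default trsvcid 4420 so each fuzzer gets its own TCP listener.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_${fuzzer_type}.conf"
    echo "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  }

Calling derive_fuzzer_args 0 would reproduce the trid string passed to the fuzzer via -F in the command line above.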
00:06:33.943 [2024-07-21 11:30:03.182844] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2058753 ] 00:06:33.943 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.201 [2024-07-21 11:30:03.437904] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.202 [2024-07-21 11:30:03.467585] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:34.202 [2024-07-21 11:30:03.467737] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.202 [2024-07-21 11:30:03.519592] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:34.202 [2024-07-21 11:30:03.535957] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:06:34.202 INFO: Running with entropic power schedule (0xFF, 100). 00:06:34.202 INFO: Seed: 1280821871 00:06:34.202 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:34.202 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:34.202 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:34.202 INFO: A corpus is not provided, starting from an empty corpus 00:06:34.202 #2 INITED exec/s: 0 rss: 59Mb 00:06:34.202 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:34.202 This may also happen if the target rejected all inputs we tried so far 00:06:34.202 [2024-07-21 11:30:03.612672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:34.202 [2024-07-21 11:30:03.612714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.718 NEW_FUNC[1/670]: 0x49e700 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:06:34.718 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:34.718 #15 NEW cov: 11472 ft: 11473 corp: 2/68b lim: 320 exec/s: 0 rss: 65Mb L: 67/67 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:06:34.718 [2024-07-21 11:30:03.953018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:34.718 [2024-07-21 11:30:03.953068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.718 #16 NEW cov: 11585 ft: 12116 corp: 3/135b lim: 320 exec/s: 0 rss: 65Mb L: 67/67 MS: 1 ShuffleBytes- 00:06:34.718 [2024-07-21 11:30:04.013086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:34.718 [2024-07-21 11:30:04.013118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.718 #22 NEW cov: 11591 ft: 12306 corp: 4/202b lim: 320 exec/s: 0 rss: 65Mb L: 67/67 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\011"- 00:06:34.718 [2024-07-21 11:30:04.063482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:34.718 [2024-07-21 11:30:04.063518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.718 [2024-07-21 11:30:04.063670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:00000000 cdw11:00000000 00:06:34.718 [2024-07-21 11:30:04.063688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:34.718 #23 NEW cov: 11699 ft: 12786 corp: 5/348b lim: 320 exec/s: 0 rss: 65Mb L: 146/146 MS: 1 InsertRepeatedBytes- 00:06:34.718 [2024-07-21 11:30:04.123679] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:34.718 [2024-07-21 11:30:04.123711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.718 [2024-07-21 11:30:04.123851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:00000000 cdw11:00000000 00:06:34.718 [2024-07-21 11:30:04.123869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:34.975 #24 NEW cov: 11699 ft: 12870 corp: 6/494b lim: 320 exec/s: 0 rss: 66Mb L: 146/146 MS: 1 ShuffleBytes- 00:06:34.975 [2024-07-21 11:30:04.173575] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:34.975 [2024-07-21 11:30:04.173607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.975 #25 NEW cov: 11699 ft: 13000 corp: 7/561b lim: 320 exec/s: 0 rss: 66Mb L: 67/146 MS: 1 ChangeBinInt- 00:06:34.975 [2024-07-21 11:30:04.223980] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:34.975 [2024-07-21 11:30:04.224011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.975 [2024-07-21 11:30:04.224160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:000009ff cdw11:00000000 00:06:34.975 [2024-07-21 11:30:04.224180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:34.975 #26 NEW cov: 11699 ft: 13053 corp: 8/715b lim: 320 exec/s: 0 rss: 66Mb L: 154/154 MS: 1 CMP- DE: "\377.\217e\370\241=\276"- 00:06:34.975 [2024-07-21 11:30:04.274129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:34.975 [2024-07-21 11:30:04.274160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.976 [2024-07-21 11:30:04.274292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:00000000 cdw11:00000000 00:06:34.976 [2024-07-21 11:30:04.274309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:34.976 #27 NEW cov: 11699 ft: 13123 corp: 9/869b 
lim: 320 exec/s: 0 rss: 66Mb L: 154/154 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\011"- 00:06:34.976 [2024-07-21 11:30:04.324118] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:34.976 [2024-07-21 11:30:04.324151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:34.976 #28 NEW cov: 11699 ft: 13141 corp: 10/936b lim: 320 exec/s: 0 rss: 66Mb L: 67/154 MS: 1 ChangeBit- 00:06:34.976 [2024-07-21 11:30:04.374312] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:34.976 [2024-07-21 11:30:04.374346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.234 #29 NEW cov: 11699 ft: 13155 corp: 11/1003b lim: 320 exec/s: 0 rss: 66Mb L: 67/154 MS: 1 ChangeBit- 00:06:35.234 [2024-07-21 11:30:04.424419] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:35.234 [2024-07-21 11:30:04.424455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.234 #30 NEW cov: 11699 ft: 13162 corp: 12/1103b lim: 320 exec/s: 0 rss: 66Mb L: 100/154 MS: 1 CrossOver- 00:06:35.234 [2024-07-21 11:30:04.474873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.234 [2024-07-21 11:30:04.474905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.234 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:35.234 #31 NEW cov: 11727 ft: 13286 corp: 13/1237b lim: 320 exec/s: 0 rss: 67Mb L: 134/154 MS: 1 CrossOver- 00:06:35.234 [2024-07-21 11:30:04.534721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:35.234 [2024-07-21 11:30:04.534751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.234 #32 NEW cov: 11727 ft: 13304 corp: 14/1304b lim: 320 exec/s: 0 rss: 67Mb L: 67/154 MS: 1 CrossOver- 00:06:35.234 [2024-07-21 11:30:04.585135] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:35.234 [2024-07-21 11:30:04.585168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.234 [2024-07-21 11:30:04.585316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (be) qid:0 cid:5 nsid:40404040 cdw10:000009ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x40 00:06:35.234 [2024-07-21 11:30:04.585335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.234 #33 NEW cov: 11727 ft: 13427 corp: 15/1458b lim: 320 exec/s: 33 rss: 67Mb L: 154/154 MS: 1 PersAutoDict- DE: "\377.\217e\370\241=\276"- 00:06:35.234 [2024-07-21 11:30:04.645329] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:35.234 [2024-07-21 11:30:04.645364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.234 [2024-07-21 11:30:04.645479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:ffffffff cdw11:00000009 00:06:35.234 [2024-07-21 11:30:04.645503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.492 #39 NEW cov: 11727 ft: 13460 corp: 16/1615b lim: 320 exec/s: 39 rss: 67Mb L: 157/157 MS: 1 InsertRepeatedBytes- 00:06:35.492 [2024-07-21 11:30:04.695431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:35.492 [2024-07-21 11:30:04.695468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.492 [2024-07-21 11:30:04.695611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:ffffffff cdw11:00000009 00:06:35.492 [2024-07-21 11:30:04.695631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.492 #40 NEW cov: 11727 ft: 13487 corp: 17/1773b lim: 320 exec/s: 40 rss: 67Mb L: 158/158 MS: 1 InsertByte- 00:06:35.492 [2024-07-21 11:30:04.755588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:35.492 [2024-07-21 11:30:04.755625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.492 [2024-07-21 11:30:04.755770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:00000000 cdw11:00000000 00:06:35.492 [2024-07-21 11:30:04.755788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.492 #41 NEW cov: 11727 ft: 13509 corp: 18/1919b lim: 320 exec/s: 41 rss: 67Mb L: 146/158 MS: 1 CrossOver- 00:06:35.492 [2024-07-21 11:30:04.805910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:35.492 [2024-07-21 11:30:04.805944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.492 [2024-07-21 11:30:04.806082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:ffffffff cdw11:000009ff 00:06:35.492 [2024-07-21 11:30:04.806101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.492 #42 NEW cov: 11727 ft: 13530 corp: 19/2078b lim: 320 exec/s: 42 rss: 67Mb L: 159/159 MS: 1 InsertByte- 00:06:35.492 [2024-07-21 11:30:04.865792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:35.492 [2024-07-21 11:30:04.865825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.492 #43 NEW cov: 11727 ft: 13557 corp: 20/2145b lim: 320 
exec/s: 43 rss: 67Mb L: 67/159 MS: 1 ChangeBinInt- 00:06:35.492 [2024-07-21 11:30:04.916108] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1000001 00:06:35.492 [2024-07-21 11:30:04.916141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.750 #44 NEW cov: 11727 ft: 13588 corp: 21/2212b lim: 320 exec/s: 44 rss: 67Mb L: 67/159 MS: 1 ChangeBinInt- 00:06:35.750 [2024-07-21 11:30:04.966349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:35.750 [2024-07-21 11:30:04.966383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.750 [2024-07-21 11:30:04.966524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (be) qid:0 cid:5 nsid:40404040 cdw10:000009ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x40 00:06:35.750 [2024-07-21 11:30:04.966543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.750 #45 NEW cov: 11727 ft: 13605 corp: 22/2366b lim: 320 exec/s: 45 rss: 67Mb L: 154/159 MS: 1 ShuffleBytes- 00:06:35.750 [2024-07-21 11:30:05.026268] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9ffff 00:06:35.750 [2024-07-21 11:30:05.026298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.750 #46 NEW cov: 11727 ft: 13648 corp: 23/2441b lim: 320 exec/s: 46 rss: 67Mb L: 75/159 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\011"- 00:06:35.750 [2024-07-21 11:30:05.076753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:35.750 [2024-07-21 11:30:05.076784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.750 [2024-07-21 11:30:05.076919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:ffffff00 cdw11:ffffffff 00:06:35.750 [2024-07-21 11:30:05.076938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.750 #47 NEW cov: 11727 ft: 13654 corp: 24/2603b lim: 320 exec/s: 47 rss: 67Mb L: 162/162 MS: 1 CMP- DE: "\000\000\000\037"- 00:06:35.750 [2024-07-21 11:30:05.126772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:35.750 [2024-07-21 11:30:05.126804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.750 [2024-07-21 11:30:05.126945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:ffffffff cdw11:000009ff 00:06:35.750 [2024-07-21 11:30:05.126963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.750 #48 NEW cov: 11727 ft: 13663 corp: 25/2762b lim: 320 exec/s: 48 rss: 67Mb L: 159/162 MS: 1 InsertByte- 
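Each "#N NEW" line in this stretch is the libFuzzer-style status report emitted when an input increases coverage: cov counts covered code blocks, ft distinct features, corp the corpus size (entries/bytes), lim the current input-length limit, exec/s the execution rate, rss resident memory, L the new input's length versus the largest seen, and MS the mutation sequence that produced it. A quick way to summarize coverage growth from a captured log, assuming it has been saved as run.log, is:

  grep -o 'NEW cov: [0-9]*' run.log | awk '{print $3}' \
    | awk 'NR==1 {first=$1} {last=$1} END {printf "cov %d -> %d (+%d)\n", first, last, last - first}'

For the run above this would report growth from 11472 to 11727 covered blocks across the 62 executions.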
00:06:36.009 [2024-07-21 11:30:05.176751] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x200 00:06:36.009 [2024-07-21 11:30:05.176781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.009 #54 NEW cov: 11727 ft: 13674 corp: 26/2829b lim: 320 exec/s: 54 rss: 67Mb L: 67/162 MS: 1 ChangeBit- 00:06:36.009 [2024-07-21 11:30:05.227146] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:42424242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x420000000009ffff 00:06:36.009 [2024-07-21 11:30:05.227177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.009 [2024-07-21 11:30:05.227332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (42) qid:0 cid:5 nsid:42424242 cdw10:42424242 cdw11:42424242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4242424242424242 00:06:36.009 [2024-07-21 11:30:05.227349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.009 #55 NEW cov: 11727 ft: 13706 corp: 27/3009b lim: 320 exec/s: 55 rss: 67Mb L: 180/180 MS: 1 InsertRepeatedBytes- 00:06:36.009 [2024-07-21 11:30:05.287164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00400000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:36.009 [2024-07-21 11:30:05.287196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.009 #56 NEW cov: 11727 ft: 13720 corp: 28/3076b lim: 320 exec/s: 56 rss: 67Mb L: 67/180 MS: 1 ChangeBit- 00:06:36.009 [2024-07-21 11:30:05.337531] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:36.009 [2024-07-21 11:30:05.337562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.009 [2024-07-21 11:30:05.337703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:00000000 cdw11:00000000 00:06:36.009 [2024-07-21 11:30:05.337719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.009 #57 NEW cov: 11727 ft: 13735 corp: 29/3222b lim: 320 exec/s: 57 rss: 67Mb L: 146/180 MS: 1 ShuffleBytes- 00:06:36.009 [2024-07-21 11:30:05.387658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:36.009 [2024-07-21 11:30:05.387691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.009 [2024-07-21 11:30:05.387828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:ffffffff cdw11:000009ff 00:06:36.009 [2024-07-21 11:30:05.387844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.009 #58 NEW cov: 11727 ft: 13741 corp: 30/3381b lim: 320 exec/s: 58 rss: 68Mb L: 159/180 MS: 1 ChangeByte- 00:06:36.276 [2024-07-21 11:30:05.437910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:36.276 [2024-07-21 11:30:05.437941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.276 [2024-07-21 11:30:05.438086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (be) qid:0 cid:5 nsid:40404040 cdw10:000009ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x40 00:06:36.276 [2024-07-21 11:30:05.438105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.276 #59 NEW cov: 11727 ft: 13756 corp: 31/3535b lim: 320 exec/s: 59 rss: 68Mb L: 154/180 MS: 1 ShuffleBytes- 00:06:36.276 [2024-07-21 11:30:05.498085] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:36.276 [2024-07-21 11:30:05.498117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.276 [2024-07-21 11:30:05.498261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (be) qid:0 cid:5 nsid:40404040 cdw10:000009ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x40 00:06:36.276 [2024-07-21 11:30:05.498278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.276 #60 NEW cov: 11727 ft: 13775 corp: 32/3689b lim: 320 exec/s: 60 rss: 68Mb L: 154/180 MS: 1 ShuffleBytes- 00:06:36.276 [2024-07-21 11:30:05.548283] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4040404040404040 00:06:36.276 [2024-07-21 11:30:05.548315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.276 [2024-07-21 11:30:05.548428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (40) qid:0 cid:5 nsid:40404040 cdw10:00000000 cdw11:00000000 00:06:36.276 [2024-07-21 11:30:05.548448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.276 #61 NEW cov: 11727 ft: 13789 corp: 33/3835b lim: 320 exec/s: 61 rss: 68Mb L: 146/180 MS: 1 CopyPart- 00:06:36.276 [2024-07-21 11:30:05.598128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00020000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x200 00:06:36.276 [2024-07-21 11:30:05.598158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.276 #62 NEW cov: 11727 ft: 13793 corp: 34/3902b lim: 320 exec/s: 31 rss: 68Mb L: 67/180 MS: 1 ChangeBit- 00:06:36.276 #62 DONE cov: 11727 ft: 13793 corp: 34/3902b lim: 320 exec/s: 31 rss: 68Mb 00:06:36.276 ###### Recommended dictionary. ###### 00:06:36.276 "\377\377\377\377\377\377\377\011" # Uses: 2 00:06:36.276 "\377.\217e\370\241=\276" # Uses: 3 00:06:36.276 "\000\000\000\037" # Uses: 1 00:06:36.276 ###### End of recommended dictionary. 
######
00:06:36.276 Done 62 runs in 2 second(s)
00:06:36.535 11:30:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf
00:06:36.535 11:30:05 -- ../common.sh@72 -- # (( i++ ))
00:06:36.535 11:30:05 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:06:36.535 11:30:05 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
00:06:36.535 11:30:05 -- nvmf/run.sh@23 -- # local fuzzer_type=1
00:06:36.535 11:30:05 -- nvmf/run.sh@24 -- # local timen=1
00:06:36.535 11:30:05 -- nvmf/run.sh@25 -- # local core=0x1
00:06:36.535 11:30:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:06:36.535 11:30:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf
00:06:36.535 11:30:05 -- nvmf/run.sh@29 -- # printf %02d 1
00:06:36.535 11:30:05 -- nvmf/run.sh@29 -- # port=4401
00:06:36.535 11:30:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:06:36.535 11:30:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401'
00:06:36.535 11:30:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:06:36.535 11:30:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock
00:06:36.794 [2024-07-21 11:30:05.786391] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:06:36.794 [2024-07-21 11:30:05.786486] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2059541 ]
00:06:36.794 EAL: No free 2048 kB hugepages reported on node 1
00:06:36.794 [2024-07-21 11:30:06.031516] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:36.794 [2024-07-21 11:30:06.058893] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:06:36.794 [2024-07-21 11:30:06.059026] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:36.794 [2024-07-21 11:30:06.110909] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:06:36.794 [2024-07-21 11:30:06.127273] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 ***
00:06:36.794 INFO: Running with entropic power schedule (0xFF, 100).
00:06:36.794 INFO: Seed: 3872811727
00:06:36.794 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9),
00:06:36.794 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280),
00:06:36.794 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:06:36.794 INFO: A corpus is not provided, starting from an empty corpus
00:06:36.794 #2 INITED exec/s: 0 rss: 60Mb
00:06:36.794 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:06:36.794 This may also happen if the target rejected all inputs we tried so far 00:06:36.794 [2024-07-21 11:30:06.203744] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:36.794 [2024-07-21 11:30:06.204014] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:36.794 [2024-07-21 11:30:06.204264] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:36.794 [2024-07-21 11:30:06.204739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:36.794 [2024-07-21 11:30:06.204782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.794 [2024-07-21 11:30:06.204862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:36.794 [2024-07-21 11:30:06.204880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.794 [2024-07-21 11:30:06.204953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:36.794 [2024-07-21 11:30:06.204975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.312 NEW_FUNC[1/671]: 0x49f000 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:06:37.312 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:37.312 #8 NEW cov: 11553 ft: 11554 corp: 2/21b lim: 30 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:06:37.312 [2024-07-21 11:30:06.534071] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.312 [2024-07-21 11:30:06.534260] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.312 [2024-07-21 11:30:06.534407] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.312 [2024-07-21 11:30:06.534777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:249f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.534847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.312 [2024-07-21 11:30:06.534980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.535006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.312 [2024-07-21 11:30:06.535136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.535160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.312 #11 NEW cov: 11666 ft: 12192 
corp: 3/44b lim: 30 exec/s: 0 rss: 68Mb L: 23/23 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:06:37.312 [2024-07-21 11:30:06.573807] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.312 [2024-07-21 11:30:06.574160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.574190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.312 #12 NEW cov: 11672 ft: 12940 corp: 4/53b lim: 30 exec/s: 0 rss: 68Mb L: 9/23 MS: 1 CrossOver- 00:06:37.312 [2024-07-21 11:30:06.614160] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff7f 00:06:37.312 [2024-07-21 11:30:06.614297] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.312 [2024-07-21 11:30:06.614448] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.312 [2024-07-21 11:30:06.614807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.614835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.312 [2024-07-21 11:30:06.614955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.614972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.312 [2024-07-21 11:30:06.615088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.615104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.312 #13 NEW cov: 11757 ft: 13178 corp: 5/73b lim: 30 exec/s: 0 rss: 68Mb L: 20/23 MS: 1 ChangeBit- 00:06:37.312 [2024-07-21 11:30:06.654015] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.312 [2024-07-21 11:30:06.654357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.654385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.312 #14 NEW cov: 11757 ft: 13238 corp: 6/82b lim: 30 exec/s: 0 rss: 68Mb L: 9/23 MS: 1 ChangeBinInt- 00:06:37.312 [2024-07-21 11:30:06.694217] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.312 [2024-07-21 11:30:06.694561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f8367 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.694594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.312 #15 NEW cov: 11757 ft: 13278 corp: 7/91b lim: 30 exec/s: 0 rss: 69Mb L: 9/23 MS: 1 ShuffleBytes- 00:06:37.312 [2024-07-21 11:30:06.734345] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.312 [2024-07-21 11:30:06.734678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f8367 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.312 [2024-07-21 11:30:06.734706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.572 #16 NEW cov: 11757 ft: 13343 corp: 8/101b lim: 30 exec/s: 0 rss: 69Mb L: 10/23 MS: 1 InsertByte- 00:06:37.572 [2024-07-21 11:30:06.774765] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.572 [2024-07-21 11:30:06.774929] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.572 [2024-07-21 11:30:06.775065] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.572 [2024-07-21 11:30:06.775201] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.572 [2024-07-21 11:30:06.775342] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f23 00:06:37.572 [2024-07-21 11:30:06.775685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:249f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.775714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.775826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.775843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.775957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.775975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.776088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.776106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.776216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.776233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:37.572 #17 NEW cov: 11757 ft: 13878 corp: 9/131b lim: 30 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 CopyPart- 00:06:37.572 [2024-07-21 11:30:06.824719] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.572 [2024-07-21 11:30:06.824891] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:06:37.572 [2024-07-21 11:30:06.825056] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.572 [2024-07-21 
11:30:06.825384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.825413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.825536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.825560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.825669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.825686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.572 #18 NEW cov: 11757 ft: 13946 corp: 10/151b lim: 30 exec/s: 0 rss: 69Mb L: 20/30 MS: 1 CopyPart- 00:06:37.572 [2024-07-21 11:30:06.864922] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.572 [2024-07-21 11:30:06.865066] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786432) > buf size (4096) 00:06:37.572 [2024-07-21 11:30:06.865204] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:06:37.572 [2024-07-21 11:30:06.865353] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.572 [2024-07-21 11:30:06.865681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.865711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.865824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.865844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.865963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.865983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.572 [2024-07-21 11:30:06.866092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.866108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:37.572 #19 NEW cov: 11780 ft: 13997 corp: 11/179b lim: 30 exec/s: 0 rss: 69Mb L: 28/30 MS: 1 CMP- DE: "\006\000\000\000\000\000\000\000"- 00:06:37.572 [2024-07-21 11:30:06.904808] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.572 [2024-07-21 11:30:06.905116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.572 [2024-07-21 11:30:06.905146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.573 #20 NEW cov: 11780 ft: 14003 corp: 12/190b lim: 30 exec/s: 0 rss: 69Mb L: 11/30 MS: 1 CrossOver- 00:06:37.573 [2024-07-21 11:30:06.945188] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:06:37.573 [2024-07-21 11:30:06.945490] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:06:37.573 [2024-07-21 11:30:06.945635] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.573 [2024-07-21 11:30:06.945965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff0006 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.573 [2024-07-21 11:30:06.945992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.573 [2024-07-21 11:30:06.946111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.573 [2024-07-21 11:30:06.946127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.573 [2024-07-21 11:30:06.946243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.573 [2024-07-21 11:30:06.946258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.573 [2024-07-21 11:30:06.946376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.573 [2024-07-21 11:30:06.946396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:37.573 #21 NEW cov: 11797 ft: 14101 corp: 13/218b lim: 30 exec/s: 0 rss: 69Mb L: 28/30 MS: 1 CopyPart- 00:06:37.573 [2024-07-21 11:30:06.985115] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x9f9f 00:06:37.573 [2024-07-21 11:30:06.985447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f0062 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.573 [2024-07-21 11:30:06.985477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.832 #22 NEW cov: 11797 ft: 14148 corp: 14/227b lim: 30 exec/s: 0 rss: 69Mb L: 9/30 MS: 1 ChangeBinInt- 00:06:37.832 [2024-07-21 11:30:07.025353] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff7f 00:06:37.832 [2024-07-21 11:30:07.025536] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.833 [2024-07-21 11:30:07.025683] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:06:37.833 [2024-07-21 11:30:07.026002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 
11:30:07.026032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.026148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.026165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.026283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.026301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.833 #23 NEW cov: 11797 ft: 14171 corp: 15/250b lim: 30 exec/s: 0 rss: 69Mb L: 23/30 MS: 1 InsertRepeatedBytes- 00:06:37.833 [2024-07-21 11:30:07.065584] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:37.833 [2024-07-21 11:30:07.065742] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:37.833 [2024-07-21 11:30:07.065890] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:37.833 [2024-07-21 11:30:07.066044] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:37.833 [2024-07-21 11:30:07.066361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.066390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.066503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f819f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.066519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.066628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.066647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.066765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.066783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:37.833 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:37.833 #24 NEW cov: 11820 ft: 14215 corp: 16/274b lim: 30 exec/s: 0 rss: 70Mb L: 24/30 MS: 1 InsertRepeatedBytes- 00:06:37.833 [2024-07-21 11:30:07.105712] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (535168) > buf size (4096) 00:06:37.833 [2024-07-21 11:30:07.105883] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x67 00:06:37.833 [2024-07-21 11:30:07.106033] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200009f9f 00:06:37.833 [2024-07-21 11:30:07.106366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f029f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.106393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.106512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.106530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.106644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9f9f029f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.106662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.833 #25 NEW cov: 11820 ft: 14329 corp: 17/293b lim: 30 exec/s: 0 rss: 70Mb L: 19/30 MS: 1 PersAutoDict- DE: "\006\000\000\000\000\000\000\000"- 00:06:37.833 [2024-07-21 11:30:07.145683] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff7f 00:06:37.833 [2024-07-21 11:30:07.145849] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff2a 00:06:37.833 [2024-07-21 11:30:07.146008] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.833 [2024-07-21 11:30:07.146343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.146370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.146483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.146500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.146625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.146643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.833 #26 NEW cov: 11820 ft: 14371 corp: 18/313b lim: 30 exec/s: 0 rss: 70Mb L: 20/30 MS: 1 ChangeByte- 00:06:37.833 [2024-07-21 11:30:07.185832] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (535104) > buf size (4096) 00:06:37.833 [2024-07-21 11:30:07.185994] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x67 00:06:37.833 [2024-07-21 11:30:07.186142] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200009f9f 00:06:37.833 [2024-07-21 11:30:07.186471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8f029f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 
[2024-07-21 11:30:07.186499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.186617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.186634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.186753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9f9f029f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.186772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.833 #27 NEW cov: 11820 ft: 14384 corp: 19/332b lim: 30 exec/s: 27 rss: 70Mb L: 19/30 MS: 1 ChangeBit- 00:06:37.833 [2024-07-21 11:30:07.226001] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:06:37.833 [2024-07-21 11:30:07.226295] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:06:37.833 [2024-07-21 11:30:07.226455] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:37.833 [2024-07-21 11:30:07.226776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff0006 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.226804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.226922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.226941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.227064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.227085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:37.833 [2024-07-21 11:30:07.227200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.833 [2024-07-21 11:30:07.227218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:37.833 #28 NEW cov: 11820 ft: 14390 corp: 20/360b lim: 30 exec/s: 28 rss: 70Mb L: 28/30 MS: 1 ChangeByte- 00:06:38.093 [2024-07-21 11:30:07.265994] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.093 [2024-07-21 11:30:07.266352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f8367 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.266380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.093 #29 NEW cov: 11820 ft: 14417 corp: 21/368b lim: 30 exec/s: 29 rss: 70Mb 
L: 8/30 MS: 1 CrossOver- 00:06:38.093 [2024-07-21 11:30:07.306119] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (6148) > buf size (4096) 00:06:38.093 [2024-07-21 11:30:07.306281] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:06:38.093 [2024-07-21 11:30:07.306626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:06000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.306655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.093 [2024-07-21 11:30:07.306783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.306801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.093 #30 NEW cov: 11820 ft: 14673 corp: 22/385b lim: 30 exec/s: 30 rss: 70Mb L: 17/30 MS: 1 PersAutoDict- DE: "\006\000\000\000\000\000\000\000"- 00:06:38.093 [2024-07-21 11:30:07.346224] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000679f 00:06:38.093 [2024-07-21 11:30:07.346611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:9f9f83e2 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.346640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.093 #31 NEW cov: 11820 ft: 14685 corp: 23/395b lim: 30 exec/s: 31 rss: 70Mb L: 10/30 MS: 1 ShuffleBytes- 00:06:38.093 [2024-07-21 11:30:07.386556] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000699f 00:06:38.093 [2024-07-21 11:30:07.386729] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.093 [2024-07-21 11:30:07.386869] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.093 [2024-07-21 11:30:07.387012] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.093 [2024-07-21 11:30:07.387352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.387381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.093 [2024-07-21 11:30:07.387491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f819f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.387510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.093 [2024-07-21 11:30:07.387622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.387641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.093 [2024-07-21 11:30:07.387769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 
cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.387788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.093 #32 NEW cov: 11820 ft: 14720 corp: 24/419b lim: 30 exec/s: 32 rss: 70Mb L: 24/30 MS: 1 ChangeByte- 00:06:38.093 [2024-07-21 11:30:07.436836] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.093 [2024-07-21 11:30:07.436993] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.093 [2024-07-21 11:30:07.437161] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.093 [2024-07-21 11:30:07.437299] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.093 [2024-07-21 11:30:07.437464] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.093 [2024-07-21 11:30:07.437806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.093 [2024-07-21 11:30:07.437834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.093 [2024-07-21 11:30:07.437947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f819f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.094 [2024-07-21 11:30:07.437967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.094 [2024-07-21 11:30:07.438079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.094 [2024-07-21 11:30:07.438095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.094 [2024-07-21 11:30:07.438206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.094 [2024-07-21 11:30:07.438227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.094 [2024-07-21 11:30:07.438351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.094 [2024-07-21 11:30:07.438370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:38.094 #33 NEW cov: 11820 ft: 14723 corp: 25/449b lim: 30 exec/s: 33 rss: 70Mb L: 30/30 MS: 1 CopyPart- 00:06:38.094 [2024-07-21 11:30:07.476801] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f0a 00:06:38.094 [2024-07-21 11:30:07.476976] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f62 00:06:38.094 [2024-07-21 11:30:07.477126] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.094 [2024-07-21 11:30:07.477491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.094 [2024-07-21 
11:30:07.477522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.094 [2024-07-21 11:30:07.477641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.094 [2024-07-21 11:30:07.477658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.094 [2024-07-21 11:30:07.477784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:609f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.094 [2024-07-21 11:30:07.477802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.094 #34 NEW cov: 11820 ft: 14758 corp: 26/467b lim: 30 exec/s: 34 rss: 70Mb L: 18/30 MS: 1 CrossOver- 00:06:38.353 [2024-07-21 11:30:07.517062] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:06:38.353 [2024-07-21 11:30:07.517354] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (830468) > buf size (4096) 00:06:38.353 [2024-07-21 11:30:07.517810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff0006 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.353 [2024-07-21 11:30:07.517840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.353 [2024-07-21 11:30:07.517959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.353 [2024-07-21 11:30:07.517977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.353 [2024-07-21 11:30:07.518099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2b00830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.353 [2024-07-21 11:30:07.518120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.353 [2024-07-21 11:30:07.518245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.353 [2024-07-21 11:30:07.518266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.353 #35 NEW cov: 11820 ft: 14874 corp: 27/495b lim: 30 exec/s: 35 rss: 70Mb L: 28/30 MS: 1 CopyPart- 00:06:38.353 [2024-07-21 11:30:07.566989] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.353 [2024-07-21 11:30:07.567138] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.353 [2024-07-21 11:30:07.567458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.353 [2024-07-21 11:30:07.567487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.353 [2024-07-21 11:30:07.567612] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.353 [2024-07-21 11:30:07.567630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.353 #36 NEW cov: 11820 ft: 14880 corp: 28/512b lim: 30 exec/s: 36 rss: 70Mb L: 17/30 MS: 1 CopyPart- 00:06:38.353 [2024-07-21 11:30:07.607249] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:06:38.353 [2024-07-21 11:30:07.607563] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1 00:06:38.353 [2024-07-21 11:30:07.608032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff0006 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.353 [2024-07-21 11:30:07.608061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.353 [2024-07-21 11:30:07.608179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.608198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.608318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.608338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.608449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.608468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.354 #37 NEW cov: 11820 ft: 14941 corp: 29/540b lim: 30 exec/s: 37 rss: 70Mb L: 28/30 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:06:38.354 [2024-07-21 11:30:07.647375] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff7f 00:06:38.354 [2024-07-21 11:30:07.647535] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.354 [2024-07-21 11:30:07.647688] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.354 [2024-07-21 11:30:07.648028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.648059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.648180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.648204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.648314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 
nsid:0 cdw10:fffc83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.648334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.354 #38 NEW cov: 11820 ft: 14966 corp: 30/560b lim: 30 exec/s: 38 rss: 70Mb L: 20/30 MS: 1 ChangeBinInt- 00:06:38.354 [2024-07-21 11:30:07.687339] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff7f 00:06:38.354 [2024-07-21 11:30:07.687522] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.354 [2024-07-21 11:30:07.687832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.687864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.687980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.687998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.354 #39 NEW cov: 11820 ft: 14976 corp: 31/576b lim: 30 exec/s: 39 rss: 70Mb L: 16/30 MS: 1 EraseBytes- 00:06:38.354 [2024-07-21 11:30:07.737755] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.354 [2024-07-21 11:30:07.737904] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.354 [2024-07-21 11:30:07.738053] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.354 [2024-07-21 11:30:07.738201] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.354 [2024-07-21 11:30:07.738352] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.354 [2024-07-21 11:30:07.738695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.738726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.738844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.738864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.738979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.738999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.739118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.739137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:06:38.354 [2024-07-21 11:30:07.739252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.354 [2024-07-21 11:30:07.739268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:38.354 #40 NEW cov: 11820 ft: 14989 corp: 32/606b lim: 30 exec/s: 40 rss: 70Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:06:38.614 [2024-07-21 11:30:07.777588] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10268) > buf size (4096) 00:06:38.614 [2024-07-21 11:30:07.777900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a060000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.777931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.614 #41 NEW cov: 11820 ft: 14990 corp: 33/615b lim: 30 exec/s: 41 rss: 70Mb L: 9/30 MS: 1 PersAutoDict- DE: "\006\000\000\000\000\000\000\000"- 00:06:38.614 [2024-07-21 11:30:07.817668] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10268) > buf size (4096) 00:06:38.614 [2024-07-21 11:30:07.818029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a060000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.818058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.614 #42 NEW cov: 11820 ft: 15003 corp: 34/624b lim: 30 exec/s: 42 rss: 70Mb L: 9/30 MS: 1 PersAutoDict- DE: "\006\000\000\000\000\000\000\000"- 00:06:38.614 [2024-07-21 11:30:07.857942] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.614 [2024-07-21 11:30:07.858106] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.614 [2024-07-21 11:30:07.858256] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.614 [2024-07-21 11:30:07.858583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:249f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.858612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.858731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.858747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.858856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.858874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.614 #43 NEW cov: 11820 ft: 15050 corp: 35/647b lim: 30 exec/s: 43 rss: 70Mb L: 23/30 MS: 1 ChangeByte- 00:06:38.614 [2024-07-21 11:30:07.898198] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (797696) > buf size (4096) 
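Every ctrlr.c *ERROR* line in this run appears to be one of two rejections on the Get Log Page path: an offset that is not dword-aligned (all offsets printed so far, e.g. 0x67, 0xa, 0x30000679f, end in a value that is not a multiple of 4) or a computed transfer length larger than the 4096-byte response buffer, as in "len (797696) > buf size (4096)" just above. A schematic of those two guards under that reading of the messages; the function name and exact checks are illustrative, not SPDK's code:

#include <stdbool.h>
#include <stdint.h>

#define RSP_BUF_SIZE 4096u  /* matches "buf size (4096)" in the messages */

/* Return false for the two conditions the fuzzer keeps tripping. */
static bool log_page_request_ok(uint64_t offset, uint64_t len)
{
    if ((offset & 3) != 0) {
        return false;       /* -> "Invalid log page offset 0x..." */
    }
    if (len > RSP_BUF_SIZE) {
        return false;       /* -> "Get log page: len (...) > buf size (4096)" */
    }
    return true;
}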
00:06:38.614 [2024-07-21 11:30:07.898365] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (152148) > buf size (4096) 00:06:38.614 [2024-07-21 11:30:07.898532] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.614 [2024-07-21 11:30:07.898682] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.614 [2024-07-21 11:30:07.899016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.899048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.899173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:94940094 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.899192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.899307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9494837f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.899324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.899447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff832a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.899465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.614 #44 NEW cov: 11820 ft: 15088 corp: 36/676b lim: 30 exec/s: 44 rss: 70Mb L: 29/30 MS: 1 InsertRepeatedBytes- 00:06:38.614 [2024-07-21 11:30:07.948419] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.614 [2024-07-21 11:30:07.948595] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.614 [2024-07-21 11:30:07.948736] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.614 [2024-07-21 11:30:07.948881] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.614 [2024-07-21 11:30:07.949034] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.614 [2024-07-21 11:30:07.949352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.949380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.949495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.949513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.949626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.949645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.949763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ff4a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.949780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.949892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.949909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:38.614 #45 NEW cov: 11820 ft: 15154 corp: 37/706b lim: 30 exec/s: 45 rss: 70Mb L: 30/30 MS: 1 ChangeBit- 00:06:38.614 [2024-07-21 11:30:07.998514] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.614 [2024-07-21 11:30:07.998666] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.614 [2024-07-21 11:30:07.998811] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.614 [2024-07-21 11:30:07.998955] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001111 00:06:38.614 [2024-07-21 11:30:07.999273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.999303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.999419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f819f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.999437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.999555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.999574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.614 [2024-07-21 11:30:07.999696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:11118111 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.614 [2024-07-21 11:30:07.999712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.614 #46 NEW cov: 11820 ft: 15166 corp: 38/730b lim: 30 exec/s: 46 rss: 70Mb L: 24/30 MS: 1 ShuffleBytes- 00:06:38.875 [2024-07-21 11:30:08.038365] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10268) > buf size (4096) 00:06:38.875 [2024-07-21 11:30:08.038693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a060000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.875 [2024-07-21 11:30:08.038722] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.875 #47 NEW cov: 11820 ft: 15179 corp: 39/739b lim: 30 exec/s: 47 rss: 70Mb L: 9/30 MS: 1 ChangeBit- 00:06:38.875 [2024-07-21 11:30:08.078491] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x9f9f 00:06:38.875 [2024-07-21 11:30:08.078827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9f0062 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.875 [2024-07-21 11:30:08.078858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.875 #48 NEW cov: 11820 ft: 15195 corp: 40/748b lim: 30 exec/s: 48 rss: 70Mb L: 9/30 MS: 1 ChangeBit- 00:06:38.875 [2024-07-21 11:30:08.118799] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.875 [2024-07-21 11:30:08.118948] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.875 [2024-07-21 11:30:08.119098] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009f9f 00:06:38.875 [2024-07-21 11:30:08.119413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:249f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.875 [2024-07-21 11:30:08.119445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.875 [2024-07-21 11:30:08.119568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.875 [2024-07-21 11:30:08.119585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.875 [2024-07-21 11:30:08.119709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9f9f839f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.875 [2024-07-21 11:30:08.119726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.875 #49 NEW cov: 11820 ft: 15232 corp: 41/771b lim: 30 exec/s: 49 rss: 70Mb L: 23/30 MS: 1 CopyPart- 00:06:38.875 [2024-07-21 11:30:08.158898] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff7f 00:06:38.875 [2024-07-21 11:30:08.159081] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.875 [2024-07-21 11:30:08.159227] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:38.875 [2024-07-21 11:30:08.159562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.875 [2024-07-21 11:30:08.159590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.875 [2024-07-21 11:30:08.159711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.875 [2024-07-21 11:30:08.159729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.875 
[2024-07-21 11:30:08.159846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:fffc83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.875 [2024-07-21 11:30:08.159865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.875 #50 NEW cov: 11820 ft: 15235 corp: 42/791b lim: 30 exec/s: 25 rss: 70Mb L: 20/30 MS: 1 ChangeBit- 00:06:38.875 #50 DONE cov: 11820 ft: 15235 corp: 42/791b lim: 30 exec/s: 25 rss: 70Mb 00:06:38.875 ###### Recommended dictionary. ###### 00:06:38.875 "\006\000\000\000\000\000\000\000" # Uses: 4 00:06:38.875 "\001\000\000\000\000\000\000\000" # Uses: 0 00:06:38.875 ###### End of recommended dictionary. ###### 00:06:38.875 Done 50 runs in 2 second(s) 00:06:38.875 11:30:08 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:06:39.134 11:30:08 -- ../common.sh@72 -- # (( i++ )) 00:06:39.134 11:30:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:39.134 11:30:08 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:06:39.134 11:30:08 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:06:39.134 11:30:08 -- nvmf/run.sh@24 -- # local timen=1 00:06:39.134 11:30:08 -- nvmf/run.sh@25 -- # local core=0x1 00:06:39.134 11:30:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:39.134 11:30:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:06:39.134 11:30:08 -- nvmf/run.sh@29 -- # printf %02d 2 00:06:39.134 11:30:08 -- nvmf/run.sh@29 -- # port=4402 00:06:39.134 11:30:08 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:39.134 11:30:08 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:06:39.134 11:30:08 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:39.134 11:30:08 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:06:39.134 [2024-07-21 11:30:08.345261] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:39.134 [2024-07-21 11:30:08.345348] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2060088 ] 00:06:39.134 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.394 [2024-07-21 11:30:08.604125] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.394 [2024-07-21 11:30:08.634103] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:39.394 [2024-07-21 11:30:08.634248] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.394 [2024-07-21 11:30:08.686311] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:39.394 [2024-07-21 11:30:08.702669] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:06:39.394 INFO: Running with entropic power schedule (0xFF, 100). 00:06:39.394 INFO: Seed: 2153843715 00:06:39.394 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:39.394 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:39.394 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:39.394 INFO: A corpus is not provided, starting from an empty corpus 00:06:39.394 #2 INITED exec/s: 0 rss: 61Mb 00:06:39.394 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:39.394 This may also happen if the target rejected all inputs we tried so far 00:06:39.394 [2024-07-21 11:30:08.780299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.394 [2024-07-21 11:30:08.780337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.394 [2024-07-21 11:30:08.780408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.394 [2024-07-21 11:30:08.780423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.394 [2024-07-21 11:30:08.780504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.394 [2024-07-21 11:30:08.780518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.394 [2024-07-21 11:30:08.780589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.394 [2024-07-21 11:30:08.780604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:39.962 NEW_FUNC[1/670]: 0x4a1a20 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:06:39.962 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:39.962 #13 NEW cov: 11511 ft: 11512 corp: 2/34b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 
00:06:39.962 [2024-07-21 11:30:09.110356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.110396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.110518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.110536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.110666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.110689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.110812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.110830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:39.962 #14 NEW cov: 11624 ft: 12075 corp: 3/67b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBinInt- 00:06:39.962 [2024-07-21 11:30:09.160255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.160287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.160409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.160427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.160550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.160567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.160688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.160706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:39.962 #15 NEW cov: 11630 ft: 12419 corp: 4/100b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ShuffleBytes- 00:06:39.962 [2024-07-21 11:30:09.210170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.210198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 
11:30:09.210327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.210343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.210461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.210479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.962 #17 NEW cov: 11715 ft: 13208 corp: 5/124b lim: 35 exec/s: 0 rss: 68Mb L: 24/33 MS: 2 ChangeBit-CrossOver- 00:06:39.962 [2024-07-21 11:30:09.250469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.250496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.250615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c00ce0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.250632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.250761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.250780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.250859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.250878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:39.962 #18 NEW cov: 11715 ft: 13366 corp: 6/158b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 InsertByte- 00:06:39.962 [2024-07-21 11:30:09.290406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.290436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.290568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:f3000cf6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.290585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.290708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.290726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.962 #19 NEW cov: 11715 ft: 13500 corp: 
7/182b lim: 35 exec/s: 0 rss: 69Mb L: 24/34 MS: 1 ChangeBinInt- 00:06:39.962 [2024-07-21 11:30:09.340732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.340761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.340883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.340901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.341012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c00210c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.341030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.962 [2024-07-21 11:30:09.341157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.962 [2024-07-21 11:30:09.341173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:39.962 #20 NEW cov: 11715 ft: 13541 corp: 8/215b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 ChangeByte- 00:06:40.222 [2024-07-21 11:30:09.390941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3008c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.390970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.391099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.391119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.391238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.391256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.391372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.391392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.222 #21 NEW cov: 11715 ft: 13562 corp: 9/248b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 ChangeBit- 00:06:40.222 [2024-07-21 11:30:09.431091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.431120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.431236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.431251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.431368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c002100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.431384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.431511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.431530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.222 #22 NEW cov: 11715 ft: 13664 corp: 10/282b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertByte- 00:06:40.222 [2024-07-21 11:30:09.481209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.481236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.481360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.481379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.481502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.481520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.481656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.481673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.222 #23 NEW cov: 11715 ft: 13757 corp: 11/315b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 ChangeBinInt- 00:06:40.222 [2024-07-21 11:30:09.521145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.521174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.521298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c0026 cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.521315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:06:40.222 [2024-07-21 11:30:09.521436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.521457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.222 #24 NEW cov: 11715 ft: 13761 corp: 12/340b lim: 35 exec/s: 0 rss: 69Mb L: 25/34 MS: 1 InsertByte- 00:06:40.222 [2024-07-21 11:30:09.561407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.561436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.561565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.561583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.561706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.561723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.561843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.561865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.222 #25 NEW cov: 11715 ft: 13774 corp: 13/373b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 ShuffleBytes- 00:06:40.222 [2024-07-21 11:30:09.601555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.601581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.601704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.601721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.601837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.601856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.222 [2024-07-21 11:30:09.601976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.601995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.222 #26 NEW 
cov: 11715 ft: 13798 corp: 14/407b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:06:40.222 [2024-07-21 11:30:09.641661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.222 [2024-07-21 11:30:09.641689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.223 [2024-07-21 11:30:09.641810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.223 [2024-07-21 11:30:09.641827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.223 [2024-07-21 11:30:09.641952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.223 [2024-07-21 11:30:09.641969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.223 [2024-07-21 11:30:09.642095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.223 [2024-07-21 11:30:09.642111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.499 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:40.499 #27 NEW cov: 11738 ft: 13841 corp: 15/440b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 CopyPart- 00:06:40.499 [2024-07-21 11:30:09.691733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.691759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.691881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.691897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.692018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.692036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.692153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.692171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.499 #28 NEW cov: 11738 ft: 13875 corp: 16/473b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 ChangeBit- 00:06:40.499 [2024-07-21 11:30:09.732004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.732031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.732148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.732165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.732281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.732298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.732418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.732434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.499 #29 NEW cov: 11738 ft: 13955 corp: 17/507b lim: 35 exec/s: 29 rss: 69Mb L: 34/34 MS: 1 InsertByte- 00:06:40.499 [2024-07-21 11:30:09.772118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f426000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.772146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.772268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.772288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.772409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.772427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.772557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.772573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.499 #30 NEW cov: 11738 ft: 14018 corp: 18/541b lim: 35 exec/s: 30 rss: 69Mb L: 34/34 MS: 1 ChangeByte- 00:06:40.499 [2024-07-21 11:30:09.812306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.812335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.812415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:06:40.499 [2024-07-21 11:30:09.812434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.812563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:2b00210c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.812578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.812699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.812716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.499 #31 NEW cov: 11738 ft: 14031 corp: 19/574b lim: 35 exec/s: 31 rss: 69Mb L: 33/34 MS: 1 ChangeByte- 00:06:40.499 [2024-07-21 11:30:09.852305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3008c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.852334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.852450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.852467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.852572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.852588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.852708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.852725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.499 #32 NEW cov: 11738 ft: 14044 corp: 20/607b lim: 35 exec/s: 32 rss: 70Mb L: 33/34 MS: 1 ShuffleBytes- 00:06:40.499 [2024-07-21 11:30:09.892515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.892545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.892667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.892686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.892776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 
11:30:09.892792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.499 [2024-07-21 11:30:09.892913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.499 [2024-07-21 11:30:09.892929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.499 #33 NEW cov: 11738 ft: 14097 corp: 21/640b lim: 35 exec/s: 33 rss: 70Mb L: 33/34 MS: 1 CrossOver- 00:06:40.759 [2024-07-21 11:30:09.932638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.932669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:09.932790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.932809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:09.932875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:2b00210c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.932892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:09.933005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.933022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.759 #34 NEW cov: 11738 ft: 14114 corp: 22/673b lim: 35 exec/s: 34 rss: 70Mb L: 33/34 MS: 1 ShuffleBytes- 00:06:40.759 [2024-07-21 11:30:09.972960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.972989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:09.973108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.973126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:09.973239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0000213f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.973254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:09.973374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.973390] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:09.973513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:09.973529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:40.759 #35 NEW cov: 11738 ft: 14183 corp: 23/708b lim: 35 exec/s: 35 rss: 70Mb L: 35/35 MS: 1 InsertByte- 00:06:40.759 [2024-07-21 11:30:10.012709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.012738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.012854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:21000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.012873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.012988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c002b cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.013009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.013128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.013146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.759 #36 NEW cov: 11738 ft: 14197 corp: 24/736b lim: 35 exec/s: 36 rss: 70Mb L: 28/35 MS: 1 EraseBytes- 00:06:40.759 [2024-07-21 11:30:10.053031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.053058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.053190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.053208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.053324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.053342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.053462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.053486] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.759 #37 NEW cov: 11738 ft: 14200 corp: 25/769b lim: 35 exec/s: 37 rss: 70Mb L: 33/35 MS: 1 CrossOver- 00:06:40.759 [2024-07-21 11:30:10.093074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.093101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.093242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.093260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.093382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c00210c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.093399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.093528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000f0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.093545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:40.759 #38 NEW cov: 11738 ft: 14204 corp: 26/802b lim: 35 exec/s: 38 rss: 70Mb L: 33/35 MS: 1 ChangeBinInt- 00:06:40.759 [2024-07-21 11:30:10.132970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3001a cdw11:0c00f3f3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.132999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.133117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:f3000cf6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.133134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.133260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.133277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.759 #39 NEW cov: 11747 ft: 14260 corp: 27/826b lim: 35 exec/s: 39 rss: 70Mb L: 24/35 MS: 1 ChangeBinInt- 00:06:40.759 [2024-07-21 11:30:10.173354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.173382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.173497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.759 [2024-07-21 11:30:10.173515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.759 [2024-07-21 11:30:10.173625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:29000caf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.760 [2024-07-21 11:30:10.173642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:40.760 [2024-07-21 11:30:10.173758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:698f00b5 cdw11:0c002f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.760 [2024-07-21 11:30:10.173776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.018 #40 NEW cov: 11747 ft: 14274 corp: 28/859b lim: 35 exec/s: 40 rss: 70Mb L: 33/35 MS: 1 CMP- DE: "\257)\327\265i\217/\000"- 00:06:41.018 [2024-07-21 11:30:10.213636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.018 [2024-07-21 11:30:10.213662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.018 [2024-07-21 11:30:10.213779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.018 [2024-07-21 11:30:10.213797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.018 [2024-07-21 11:30:10.213922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.018 [2024-07-21 11:30:10.213939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.018 [2024-07-21 11:30:10.214063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.018 [2024-07-21 11:30:10.214079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.018 [2024-07-21 11:30:10.214196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.018 [2024-07-21 11:30:10.214213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:41.018 #41 NEW cov: 11747 ft: 14280 corp: 29/894b lim: 35 exec/s: 41 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:06:41.018 [2024-07-21 11:30:10.253368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.018 [2024-07-21 11:30:10.253395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.018 [2024-07-21 11:30:10.253514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c0026 cdw11:0c000c0c SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.018 [2024-07-21 11:30:10.253532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.018 [2024-07-21 11:30:10.253652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:d700af29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.018 [2024-07-21 11:30:10.253669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.019 #42 NEW cov: 11747 ft: 14306 corp: 30/919b lim: 35 exec/s: 42 rss: 70Mb L: 25/35 MS: 1 PersAutoDict- DE: "\257)\327\265i\217/\000"- 00:06:41.019 [2024-07-21 11:30:10.293509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.293536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.293668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0cf3000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.293683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.293809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.293824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.019 #43 NEW cov: 11747 ft: 14356 corp: 31/943b lim: 35 exec/s: 43 rss: 70Mb L: 24/35 MS: 1 ShuffleBytes- 00:06:41.019 [2024-07-21 11:30:10.333812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.333840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.333966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.333982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.334100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.334115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.334234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.334250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.019 #44 NEW cov: 11747 ft: 14360 corp: 32/977b lim: 35 exec/s: 44 rss: 70Mb L: 34/35 MS: 1 CrossOver- 00:06:41.019 [2024-07-21 11:30:10.373555] 
ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:41.019 [2024-07-21 11:30:10.374039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.374068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.374191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000026 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.374210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.374328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c0000 cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.374349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.374466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.374485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.019 #45 NEW cov: 11756 ft: 14426 corp: 33/1009b lim: 35 exec/s: 45 rss: 70Mb L: 32/35 MS: 1 InsertRepeatedBytes- 00:06:41.019 [2024-07-21 11:30:10.413748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.413777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.413900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0cf3000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.413917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.019 [2024-07-21 11:30:10.414041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.019 [2024-07-21 11:30:10.414058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.019 #46 NEW cov: 11756 ft: 14435 corp: 34/1033b lim: 35 exec/s: 46 rss: 70Mb L: 24/35 MS: 1 ShuffleBytes- 00:06:41.278 [2024-07-21 11:30:10.454109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.454135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.454266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c002c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.454286] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.454402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.454434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.454552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.454567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.278 #47 NEW cov: 11756 ft: 14462 corp: 35/1067b lim: 35 exec/s: 47 rss: 70Mb L: 34/35 MS: 1 ChangeBit- 00:06:41.278 [2024-07-21 11:30:10.504109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3001a cdw11:f300f30d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.504137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.504259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f3f300f3 cdw11:f300f301 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.504278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.504403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.504420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.278 #48 NEW cov: 11756 ft: 14470 corp: 36/1091b lim: 35 exec/s: 48 rss: 70Mb L: 24/35 MS: 1 ChangeBinInt- 00:06:41.278 [2024-07-21 11:30:10.553857] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:41.278 [2024-07-21 11:30:10.554205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.554235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.554363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0cf3000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.554382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.554512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.554533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.278 #49 NEW cov: 11756 ft: 14489 corp: 37/1115b lim: 35 exec/s: 49 rss: 70Mb L: 24/35 MS: 1 ChangeBinInt- 00:06:41.278 [2024-07-21 
11:30:10.594545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3008c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.594575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.594698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.594717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.594844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.594863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.594983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.595001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.278 #50 NEW cov: 11756 ft: 14501 corp: 38/1148b lim: 35 exec/s: 50 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:06:41.278 [2024-07-21 11:30:10.644480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0c0c001a cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.644509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.644625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0cf3000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.644643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.644754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c4c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.644773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.278 #51 NEW cov: 11756 ft: 14502 corp: 39/1172b lim: 35 exec/s: 51 rss: 70Mb L: 24/35 MS: 1 ChangeBit- 00:06:41.278 [2024-07-21 11:30:10.684832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.684861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.684986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.685005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.685127] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0a000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.685145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.278 [2024-07-21 11:30:10.685268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.278 [2024-07-21 11:30:10.685286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.538 #52 NEW cov: 11756 ft: 14506 corp: 40/1205b lim: 35 exec/s: 52 rss: 70Mb L: 33/35 MS: 1 ChangeBinInt- 00:06:41.538 [2024-07-21 11:30:10.724916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.538 [2024-07-21 11:30:10.724945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.538 [2024-07-21 11:30:10.725076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c0004 cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.538 [2024-07-21 11:30:10.725095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.538 [2024-07-21 11:30:10.725220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.538 [2024-07-21 11:30:10.725238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.538 [2024-07-21 11:30:10.725372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.538 [2024-07-21 11:30:10.725390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.538 #53 NEW cov: 11756 ft: 14519 corp: 41/1239b lim: 35 exec/s: 53 rss: 70Mb L: 34/35 MS: 1 ChangeBit- 00:06:41.538 [2024-07-21 11:30:10.765062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f4f3000c cdw11:0c00f3f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.538 [2024-07-21 11:30:10.765090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.538 [2024-07-21 11:30:10.765207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.538 [2024-07-21 11:30:10.765226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.538 [2024-07-21 11:30:10.765345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0c0c000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.538 [2024-07-21 11:30:10.765366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.538 [2024-07-21 11:30:10.765483] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0c08000c cdw11:0c000c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.538 [2024-07-21 11:30:10.765501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.538 #54 NEW cov: 11756 ft: 14579 corp: 42/1272b lim: 35 exec/s: 27 rss: 70Mb L: 33/35 MS: 1 ChangeBit- 00:06:41.538 #54 DONE cov: 11756 ft: 14579 corp: 42/1272b lim: 35 exec/s: 27 rss: 70Mb 00:06:41.538 ###### Recommended dictionary. ###### 00:06:41.538 "\257)\327\265i\217/\000" # Uses: 1 00:06:41.538 ###### End of recommended dictionary. ###### 00:06:41.538 Done 54 runs in 2 second(s) 00:06:41.538 11:30:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:06:41.538 11:30:10 -- ../common.sh@72 -- # (( i++ )) 00:06:41.538 11:30:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:41.538 11:30:10 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:06:41.538 11:30:10 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:06:41.538 11:30:10 -- nvmf/run.sh@24 -- # local timen=1 00:06:41.538 11:30:10 -- nvmf/run.sh@25 -- # local core=0x1 00:06:41.538 11:30:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:41.538 11:30:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:06:41.538 11:30:10 -- nvmf/run.sh@29 -- # printf %02d 3 00:06:41.538 11:30:10 -- nvmf/run.sh@29 -- # port=4403 00:06:41.538 11:30:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:41.538 11:30:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:06:41.538 11:30:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:41.538 11:30:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:06:41.538 [2024-07-21 11:30:10.944799] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:41.538 [2024-07-21 11:30:10.944864] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2060592 ] 00:06:41.797 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.797 [2024-07-21 11:30:11.194179] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.057 [2024-07-21 11:30:11.224029] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:42.057 [2024-07-21 11:30:11.224162] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.057 [2024-07-21 11:30:11.275946] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:42.057 [2024-07-21 11:30:11.292272] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:06:42.057 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:42.057 INFO: Seed: 447870483 00:06:42.057 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:42.057 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:42.057 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:42.057 INFO: A corpus is not provided, starting from an empty corpus 00:06:42.057 #2 INITED exec/s: 0 rss: 60Mb 00:06:42.057 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:42.057 This may also happen if the target rejected all inputs we tried so far 00:06:42.316 NEW_FUNC[1/659]: 0x4a36f0 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:06:42.316 NEW_FUNC[2/659]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:42.316 #14 NEW cov: 11426 ft: 11427 corp: 2/15b lim: 20 exec/s: 0 rss: 67Mb L: 14/14 MS: 2 ChangeBit-InsertRepeatedBytes- 00:06:42.316 #15 NEW cov: 11539 ft: 12013 corp: 3/30b lim: 20 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 InsertByte- 00:06:42.316 #16 NEW cov: 11562 ft: 12450 corp: 4/46b lim: 20 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 InsertByte- 00:06:42.574 #17 NEW cov: 11647 ft: 12719 corp: 5/62b lim: 20 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 ChangeByte- 00:06:42.574 #18 NEW cov: 11648 ft: 13068 corp: 6/70b lim: 20 exec/s: 0 rss: 68Mb L: 8/16 MS: 1 InsertRepeatedBytes- 00:06:42.574 #19 NEW cov: 11648 ft: 13148 corp: 7/82b lim: 20 exec/s: 0 rss: 68Mb L: 12/16 MS: 1 InsertRepeatedBytes- 00:06:42.574 #20 NEW cov: 11648 ft: 13248 corp: 8/99b lim: 20 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 InsertByte- 00:06:42.574 #21 NEW cov: 11648 ft: 13552 corp: 9/106b lim: 20 exec/s: 0 rss: 68Mb L: 7/17 MS: 1 CrossOver- 00:06:42.574 #22 NEW cov: 11648 ft: 13573 corp: 10/124b lim: 20 exec/s: 0 rss: 69Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:06:42.833 #23 NEW cov: 11648 ft: 13647 corp: 11/140b lim: 20 exec/s: 0 rss: 69Mb L: 16/18 MS: 1 CopyPart- 00:06:42.833 #24 NEW cov: 11648 ft: 13672 corp: 12/153b lim: 20 exec/s: 0 rss: 70Mb L: 13/18 MS: 1 CrossOver- 00:06:42.833 #25 NEW cov: 11648 ft: 13722 corp: 13/171b lim: 20 exec/s: 0 rss: 70Mb L: 18/18 MS: 1 CopyPart- 00:06:42.833 #26 NEW cov: 11648 ft: 13727 corp: 14/188b lim: 20 exec/s: 0 rss: 70Mb L: 17/18 MS: 1 CrossOver- 00:06:42.833 #27 NEW cov: 11648 ft: 13797 corp: 15/204b lim: 20 exec/s: 0 rss: 70Mb L: 16/18 MS: 1 ChangeBinInt- 00:06:42.833 #28 NEW cov: 11648 ft: 13841 corp: 16/224b lim: 20 exec/s: 0 rss: 70Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:06:42.833 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:42.833 #32 NEW cov: 11671 ft: 13878 corp: 17/239b lim: 20 exec/s: 0 rss: 70Mb L: 15/20 MS: 4 ChangeBit-CopyPart-EraseBytes-InsertRepeatedBytes- 00:06:43.091 #33 NEW cov: 11671 ft: 13942 corp: 18/256b lim: 20 exec/s: 0 rss: 70Mb L: 17/20 MS: 1 ChangeBit- 00:06:43.091 #34 NEW cov: 11671 ft: 13969 corp: 19/274b lim: 20 exec/s: 34 rss: 70Mb L: 18/20 MS: 1 ChangeBinInt- 00:06:43.091 #35 NEW cov: 11671 ft: 13982 corp: 20/278b lim: 20 exec/s: 35 rss: 70Mb L: 4/20 MS: 1 EraseBytes- 00:06:43.091 #36 NEW cov: 11671 ft: 14011 corp: 21/294b lim: 20 exec/s: 36 rss: 70Mb L: 16/20 MS: 1 InsertByte- 00:06:43.091 [2024-07-21 11:30:12.420839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 
00:06:43.091 [2024-07-21 11:30:12.420880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.091 NEW_FUNC[1/17]: 0x115bea0 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:06:43.091 NEW_FUNC[2/17]: 0x115ca20 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:06:43.091 #37 NEW cov: 11913 ft: 14312 corp: 22/314b lim: 20 exec/s: 37 rss: 70Mb L: 20/20 MS: 1 CMP- DE: "\177\000\000\000"- 00:06:43.091 #38 NEW cov: 11913 ft: 14314 corp: 23/329b lim: 20 exec/s: 38 rss: 70Mb L: 15/20 MS: 1 InsertByte- 00:06:43.349 NEW_FUNC[1/2]: 0x12b30a0 in nvmf_transport_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:773 00:06:43.349 NEW_FUNC[2/2]: 0x12d3a80 in nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3491 00:06:43.349 #39 NEW cov: 11969 ft: 14400 corp: 24/346b lim: 20 exec/s: 39 rss: 70Mb L: 17/20 MS: 1 CMP- DE: "\002\000\000\000"- 00:06:43.349 [2024-07-21 11:30:12.541282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:43.349 [2024-07-21 11:30:12.541314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.349 #40 NEW cov: 11969 ft: 14456 corp: 25/366b lim: 20 exec/s: 40 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:06:43.349 #41 NEW cov: 11969 ft: 14558 corp: 26/386b lim: 20 exec/s: 41 rss: 70Mb L: 20/20 MS: 1 ChangeByte- 00:06:43.349 #42 NEW cov: 11969 ft: 14564 corp: 27/402b lim: 20 exec/s: 42 rss: 70Mb L: 16/20 MS: 1 CrossOver- 00:06:43.349 #43 NEW cov: 11969 ft: 14608 corp: 28/418b lim: 20 exec/s: 43 rss: 70Mb L: 16/20 MS: 1 ShuffleBytes- 00:06:43.349 #44 NEW cov: 11969 ft: 14646 corp: 29/427b lim: 20 exec/s: 44 rss: 70Mb L: 9/20 MS: 1 InsertRepeatedBytes- 00:06:43.349 #45 NEW cov: 11969 ft: 14652 corp: 30/445b lim: 20 exec/s: 45 rss: 70Mb L: 18/20 MS: 1 InsertByte- 00:06:43.607 #46 NEW cov: 11969 ft: 14661 corp: 31/463b lim: 20 exec/s: 46 rss: 70Mb L: 18/20 MS: 1 InsertByte- 00:06:43.607 #47 NEW cov: 11969 ft: 14669 corp: 32/473b lim: 20 exec/s: 47 rss: 70Mb L: 10/20 MS: 1 EraseBytes- 00:06:43.607 #48 NEW cov: 11969 ft: 14720 corp: 33/481b lim: 20 exec/s: 48 rss: 70Mb L: 8/20 MS: 1 PersAutoDict- DE: "\177\000\000\000"- 00:06:43.607 #49 NEW cov: 11969 ft: 14729 corp: 34/496b lim: 20 exec/s: 49 rss: 70Mb L: 15/20 MS: 1 ShuffleBytes- 00:06:43.607 [2024-07-21 11:30:12.952463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:43.607 [2024-07-21 11:30:12.952507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.607 #50 NEW cov: 11969 ft: 14740 corp: 35/516b lim: 20 exec/s: 50 rss: 70Mb L: 20/20 MS: 1 ChangeByte- 00:06:43.607 #51 NEW cov: 11969 ft: 14784 corp: 36/532b lim: 20 exec/s: 51 rss: 70Mb L: 16/20 MS: 1 ChangeByte- 00:06:43.864 #52 NEW cov: 11969 ft: 14799 corp: 37/550b lim: 20 exec/s: 52 rss: 71Mb L: 18/20 MS: 1 ChangeByte- 00:06:43.864 #53 NEW cov: 11969 ft: 14814 corp: 38/570b lim: 20 exec/s: 53 rss: 71Mb L: 20/20 MS: 1 CopyPart- 00:06:43.864 #54 NEW cov: 11969 ft: 14841 corp: 39/588b lim: 20 exec/s: 54 rss: 71Mb L: 18/20 MS: 1 CopyPart- 00:06:43.864 #55 NEW cov: 11969 ft: 
14908 corp: 40/602b lim: 20 exec/s: 55 rss: 71Mb L: 14/20 MS: 1 EraseBytes- 00:06:43.864 [2024-07-21 11:30:13.193054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:43.864 [2024-07-21 11:30:13.193080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:43.864 #56 NEW cov: 11969 ft: 14917 corp: 41/620b lim: 20 exec/s: 56 rss: 71Mb L: 18/20 MS: 1 EraseBytes- 00:06:43.864 #57 NEW cov: 11969 ft: 14927 corp: 42/638b lim: 20 exec/s: 57 rss: 71Mb L: 18/20 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:06:44.123 #58 NEW cov: 11969 ft: 14972 corp: 43/652b lim: 20 exec/s: 58 rss: 71Mb L: 14/20 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:06:44.123 #59 NEW cov: 11969 ft: 14974 corp: 44/670b lim: 20 exec/s: 29 rss: 71Mb L: 18/20 MS: 1 CMP- DE: "\001\000\000\000"- 00:06:44.123 #59 DONE cov: 11969 ft: 14974 corp: 44/670b lim: 20 exec/s: 29 rss: 71Mb 00:06:44.123 ###### Recommended dictionary. ###### 00:06:44.123 "\177\000\000\000" # Uses: 1 00:06:44.123 "\002\000\000\000" # Uses: 2 00:06:44.123 "\001\000\000\000" # Uses: 0 00:06:44.123 ###### End of recommended dictionary. ###### 00:06:44.123 Done 59 runs in 2 second(s) 00:06:44.123 11:30:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:06:44.123 11:30:13 -- ../common.sh@72 -- # (( i++ )) 00:06:44.123 11:30:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:44.123 11:30:13 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:06:44.123 11:30:13 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:06:44.123 11:30:13 -- nvmf/run.sh@24 -- # local timen=1 00:06:44.123 11:30:13 -- nvmf/run.sh@25 -- # local core=0x1 00:06:44.123 11:30:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:44.123 11:30:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:06:44.123 11:30:13 -- nvmf/run.sh@29 -- # printf %02d 4 00:06:44.123 11:30:13 -- nvmf/run.sh@29 -- # port=4404 00:06:44.123 11:30:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:44.123 11:30:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:06:44.123 11:30:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:44.123 11:30:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:06:44.123 [2024-07-21 11:30:13.496765] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:44.123 [2024-07-21 11:30:13.496844] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2060922 ] 00:06:44.123 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.444 [2024-07-21 11:30:13.755194] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.444 [2024-07-21 11:30:13.784149] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:44.444 [2024-07-21 11:30:13.784291] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.444 [2024-07-21 11:30:13.836081] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:44.444 [2024-07-21 11:30:13.852417] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:06:44.444 INFO: Running with entropic power schedule (0xFF, 100). 00:06:44.444 INFO: Seed: 3007865593 00:06:44.702 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:44.702 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:44.702 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:44.702 INFO: A corpus is not provided, starting from an empty corpus 00:06:44.702 #2 INITED exec/s: 0 rss: 60Mb 00:06:44.702 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:44.702 This may also happen if the target rejected all inputs we tried so far 00:06:44.702 [2024-07-21 11:30:13.929374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:44.702 [2024-07-21 11:30:13.929413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.960 NEW_FUNC[1/671]: 0x4a47e0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:06:44.960 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:44.961 #16 NEW cov: 11532 ft: 11533 corp: 2/8b lim: 35 exec/s: 0 rss: 68Mb L: 7/7 MS: 4 CopyPart-ChangeByte-CMP-CrossOver- DE: "4\000\000\000"- 00:06:44.961 [2024-07-21 11:30:14.259333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff390000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:44.961 [2024-07-21 11:30:14.259375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.961 #17 NEW cov: 11645 ft: 12112 corp: 3/15b lim: 35 exec/s: 0 rss: 68Mb L: 7/7 MS: 1 ChangeByte- 00:06:44.961 [2024-07-21 11:30:14.299363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:44.961 [2024-07-21 11:30:14.299392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.961 #28 NEW cov: 11651 ft: 12464 corp: 4/26b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 PersAutoDict- DE: "4\000\000\000"- 00:06:44.961 [2024-07-21 11:30:14.339500] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0a0000 cdw11:0a340000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:44.961 [2024-07-21 11:30:14.339528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.961 #29 NEW cov: 11736 ft: 12763 corp: 5/35b lim: 35 exec/s: 0 rss: 68Mb L: 9/11 MS: 1 EraseBytes- 00:06:44.961 [2024-07-21 11:30:14.379847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:44.961 [2024-07-21 11:30:14.379877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.961 [2024-07-21 11:30:14.379995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000a34 cdw11:0a4b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:44.961 [2024-07-21 11:30:14.380012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.219 #39 NEW cov: 11736 ft: 13533 corp: 6/49b lim: 35 exec/s: 0 rss: 68Mb L: 14/14 MS: 5 InsertByte-ShuffleBytes-PersAutoDict-EraseBytes-CrossOver- DE: "4\000\000\000"- 00:06:45.219 [2024-07-21 11:30:14.419737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:d9390000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.219 [2024-07-21 11:30:14.419764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.219 #40 NEW cov: 11736 ft: 13664 corp: 7/56b lim: 35 exec/s: 0 rss: 68Mb L: 7/14 MS: 1 ChangeByte- 00:06:45.219 [2024-07-21 11:30:14.459848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.219 [2024-07-21 11:30:14.459874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.219 #41 NEW cov: 11736 ft: 13738 corp: 8/67b lim: 35 exec/s: 0 rss: 69Mb L: 11/14 MS: 1 ChangeByte- 00:06:45.219 [2024-07-21 11:30:14.499931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000003f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.219 [2024-07-21 11:30:14.499959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.219 #46 NEW cov: 11736 ft: 13794 corp: 9/80b lim: 35 exec/s: 0 rss: 69Mb L: 13/14 MS: 5 ChangeByte-ChangeByte-InsertRepeatedBytes-CrossOver-InsertRepeatedBytes- 00:06:45.219 [2024-07-21 11:30:14.540113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00340000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.219 [2024-07-21 11:30:14.540139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.219 #47 NEW cov: 11736 ft: 13807 corp: 10/90b lim: 35 exec/s: 0 rss: 69Mb L: 10/14 MS: 1 InsertRepeatedBytes- 00:06:45.219 [2024-07-21 11:30:14.580237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff340000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.219 
[2024-07-21 11:30:14.580263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.219 #48 NEW cov: 11736 ft: 13871 corp: 11/99b lim: 35 exec/s: 0 rss: 69Mb L: 9/14 MS: 1 EraseBytes- 00:06:45.219 [2024-07-21 11:30:14.620380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.219 [2024-07-21 11:30:14.620408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.219 #49 NEW cov: 11736 ft: 13924 corp: 12/106b lim: 35 exec/s: 0 rss: 69Mb L: 7/14 MS: 1 CrossOver- 00:06:45.477 [2024-07-21 11:30:14.660509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff003400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.477 [2024-07-21 11:30:14.660536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.478 #50 NEW cov: 11736 ft: 14006 corp: 13/115b lim: 35 exec/s: 0 rss: 69Mb L: 9/14 MS: 1 ShuffleBytes- 00:06:45.478 [2024-07-21 11:30:14.700803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c1c13400 cdw11:c1c10003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.700833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.478 [2024-07-21 11:30:14.700950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c1c1c1c1 cdw11:c1c10003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.700967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.478 #51 NEW cov: 11736 ft: 14024 corp: 14/135b lim: 35 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:06:45.478 [2024-07-21 11:30:14.740950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.740975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.478 [2024-07-21 11:30:14.741087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:34000a0a cdw11:000a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.741106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.478 #52 NEW cov: 11736 ft: 14053 corp: 15/150b lim: 35 exec/s: 0 rss: 69Mb L: 15/20 MS: 1 InsertByte- 00:06:45.478 [2024-07-21 11:30:14.781072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff340000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.781101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.478 [2024-07-21 11:30:14.781216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:34003100 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.781233] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.478 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:45.478 #53 NEW cov: 11759 ft: 14109 corp: 16/166b lim: 35 exec/s: 0 rss: 69Mb L: 16/20 MS: 1 CrossOver- 00:06:45.478 [2024-07-21 11:30:14.820966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00203400 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.820992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.478 #54 NEW cov: 11759 ft: 14136 corp: 17/173b lim: 35 exec/s: 0 rss: 69Mb L: 7/20 MS: 1 ChangeBit- 00:06:45.478 [2024-07-21 11:30:14.861048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff003400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.861075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.478 #55 NEW cov: 11759 ft: 14154 corp: 18/182b lim: 35 exec/s: 0 rss: 69Mb L: 9/20 MS: 1 ChangeASCIIInt- 00:06:45.478 [2024-07-21 11:30:14.901225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff00c4ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.478 [2024-07-21 11:30:14.901252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.736 #56 NEW cov: 11759 ft: 14227 corp: 19/191b lim: 35 exec/s: 56 rss: 69Mb L: 9/20 MS: 1 ChangeBinInt- 00:06:45.736 [2024-07-21 11:30:14.941488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00343400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.736 [2024-07-21 11:30:14.941519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.736 [2024-07-21 11:30:14.941633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff0a0000 cdw11:0a340000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.736 [2024-07-21 11:30:14.941651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.736 #57 NEW cov: 11759 ft: 14243 corp: 20/209b lim: 35 exec/s: 57 rss: 69Mb L: 18/20 MS: 1 PersAutoDict- DE: "4\000\000\000"- 00:06:45.736 [2024-07-21 11:30:14.981352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:f4ffc4ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.736 [2024-07-21 11:30:14.981378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.736 #58 NEW cov: 11759 ft: 14254 corp: 21/219b lim: 35 exec/s: 58 rss: 69Mb L: 10/20 MS: 1 InsertByte- 00:06:45.736 [2024-07-21 11:30:15.021462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a00ff cdw11:34000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.736 [2024-07-21 11:30:15.021489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.736 #59 NEW 
cov: 11759 ft: 14265 corp: 22/228b lim: 35 exec/s: 59 rss: 70Mb L: 9/20 MS: 1 CopyPart- 00:06:45.736 [2024-07-21 11:30:15.061886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:fff434c4 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.736 [2024-07-21 11:30:15.061917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.736 [2024-07-21 11:30:15.062031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000034 cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.736 [2024-07-21 11:30:15.062048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.736 #60 NEW cov: 11759 ft: 14285 corp: 23/245b lim: 35 exec/s: 60 rss: 70Mb L: 17/20 MS: 1 CrossOver- 00:06:45.736 [2024-07-21 11:30:15.101784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00343400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.736 [2024-07-21 11:30:15.101811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.736 #61 NEW cov: 11759 ft: 14301 corp: 24/252b lim: 35 exec/s: 61 rss: 70Mb L: 7/20 MS: 1 PersAutoDict- DE: "4\000\000\000"- 00:06:45.736 [2024-07-21 11:30:15.141930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00340000 cdw11:b3000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.736 [2024-07-21 11:30:15.141958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.994 #62 NEW cov: 11759 ft: 14316 corp: 25/263b lim: 35 exec/s: 62 rss: 70Mb L: 11/20 MS: 1 InsertByte- 00:06:45.994 [2024-07-21 11:30:15.182049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.994 [2024-07-21 11:30:15.182076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.994 #64 NEW cov: 11759 ft: 14337 corp: 26/274b lim: 35 exec/s: 64 rss: 70Mb L: 11/20 MS: 2 EraseBytes-InsertRepeatedBytes- 00:06:45.994 [2024-07-21 11:30:15.222424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c1c1343a cdw11:c1c10003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.994 [2024-07-21 11:30:15.222458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.994 [2024-07-21 11:30:15.222578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c1c1c1c1 cdw11:c1c10003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.994 [2024-07-21 11:30:15.222596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.994 #65 NEW cov: 11759 ft: 14345 corp: 27/294b lim: 35 exec/s: 65 rss: 70Mb L: 20/20 MS: 1 ChangeByte- 00:06:45.994 [2024-07-21 11:30:15.272313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff003400 cdw11:d9390000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.994 [2024-07-21 11:30:15.272341] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.994 #66 NEW cov: 11759 ft: 14378 corp: 28/301b lim: 35 exec/s: 66 rss: 70Mb L: 7/20 MS: 1 ChangeByte- 00:06:45.994 [2024-07-21 11:30:15.312409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff003410 cdw11:d9390000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.994 [2024-07-21 11:30:15.312437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.994 #72 NEW cov: 11759 ft: 14389 corp: 29/308b lim: 35 exec/s: 72 rss: 70Mb L: 7/20 MS: 1 ChangeBit- 00:06:45.994 [2024-07-21 11:30:15.352565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00340000 cdw11:b3000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.994 [2024-07-21 11:30:15.352593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.994 #73 NEW cov: 11759 ft: 14418 corp: 30/320b lim: 35 exec/s: 73 rss: 70Mb L: 12/20 MS: 1 InsertByte- 00:06:45.994 [2024-07-21 11:30:15.392737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff340000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.994 [2024-07-21 11:30:15.392764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.994 #74 NEW cov: 11759 ft: 14433 corp: 31/331b lim: 35 exec/s: 74 rss: 70Mb L: 11/20 MS: 1 CrossOver- 00:06:46.250 [2024-07-21 11:30:15.432827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff003400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.250 [2024-07-21 11:30:15.432854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.250 #75 NEW cov: 11759 ft: 14436 corp: 32/340b lim: 35 exec/s: 75 rss: 70Mb L: 9/20 MS: 1 ChangeASCIIInt- 00:06:46.250 [2024-07-21 11:30:15.472906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.250 [2024-07-21 11:30:15.472931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.250 #76 NEW cov: 11759 ft: 14450 corp: 33/351b lim: 35 exec/s: 76 rss: 70Mb L: 11/20 MS: 1 ChangeByte- 00:06:46.251 [2024-07-21 11:30:15.512982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0600c4ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.251 [2024-07-21 11:30:15.513009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.251 #77 NEW cov: 11759 ft: 14476 corp: 34/360b lim: 35 exec/s: 77 rss: 70Mb L: 9/20 MS: 1 ChangeBinInt- 00:06:46.251 [2024-07-21 11:30:15.553410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c1c13400 cdw11:c1c10002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.251 [2024-07-21 11:30:15.553436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.251 [2024-07-21 11:30:15.553550] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c1c1c1c1 cdw11:c1c10003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.251 [2024-07-21 11:30:15.553565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.251 #78 NEW cov: 11759 ft: 14485 corp: 35/380b lim: 35 exec/s: 78 rss: 70Mb L: 20/20 MS: 1 ChangeBit- 00:06:46.251 [2024-07-21 11:30:15.593208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:ff380000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.251 [2024-07-21 11:30:15.593240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.251 #79 NEW cov: 11759 ft: 14487 corp: 36/389b lim: 35 exec/s: 79 rss: 70Mb L: 9/20 MS: 1 ChangeASCIIInt- 00:06:46.251 [2024-07-21 11:30:15.623304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.251 [2024-07-21 11:30:15.623331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.251 #80 NEW cov: 11759 ft: 14498 corp: 37/401b lim: 35 exec/s: 80 rss: 70Mb L: 12/20 MS: 1 InsertByte- 00:06:46.251 [2024-07-21 11:30:15.663393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff34fc cdw11:fe340000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.251 [2024-07-21 11:30:15.663421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.509 #81 NEW cov: 11759 ft: 14520 corp: 38/410b lim: 35 exec/s: 81 rss: 70Mb L: 9/20 MS: 1 ChangeBinInt- 00:06:46.509 [2024-07-21 11:30:15.693504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:40003400 cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.509 [2024-07-21 11:30:15.693530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.509 #82 NEW cov: 11759 ft: 14563 corp: 39/417b lim: 35 exec/s: 82 rss: 70Mb L: 7/20 MS: 1 ChangeBit- 00:06:46.509 [2024-07-21 11:30:15.733660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:38ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.509 [2024-07-21 11:30:15.733686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.509 #83 NEW cov: 11759 ft: 14595 corp: 40/426b lim: 35 exec/s: 83 rss: 70Mb L: 9/20 MS: 1 ShuffleBytes- 00:06:46.509 [2024-07-21 11:30:15.773781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000034ac cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.509 [2024-07-21 11:30:15.773808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.509 #84 NEW cov: 11759 ft: 14631 corp: 41/436b lim: 35 exec/s: 84 rss: 70Mb L: 10/20 MS: 1 InsertByte- 00:06:46.509 [2024-07-21 11:30:15.803885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00003400 cdw11:34000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.509 [2024-07-21 
11:30:15.803910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.509 #85 NEW cov: 11759 ft: 14635 corp: 42/447b lim: 35 exec/s: 85 rss: 70Mb L: 11/20 MS: 1 PersAutoDict- DE: "4\000\000\000"- 00:06:46.509 [2024-07-21 11:30:15.844290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000003f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.509 [2024-07-21 11:30:15.844316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.509 [2024-07-21 11:30:15.844424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.509 [2024-07-21 11:30:15.844440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.509 [2024-07-21 11:30:15.884395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000003f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.509 [2024-07-21 11:30:15.884425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.509 [2024-07-21 11:30:15.884555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000034 cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.509 [2024-07-21 11:30:15.884571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.509 #87 NEW cov: 11759 ft: 14641 corp: 43/463b lim: 35 exec/s: 43 rss: 70Mb L: 16/20 MS: 2 InsertRepeatedBytes-CrossOver- 00:06:46.509 #87 DONE cov: 11759 ft: 14641 corp: 43/463b lim: 35 exec/s: 43 rss: 70Mb 00:06:46.509 ###### Recommended dictionary. ###### 00:06:46.509 "4\000\000\000" # Uses: 6 00:06:46.509 ###### End of recommended dictionary. 
###### 00:06:46.509 Done 87 runs in 2 second(s) 00:06:46.768 11:30:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:06:46.768 11:30:16 -- ../common.sh@72 -- # (( i++ )) 00:06:46.768 11:30:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:46.768 11:30:16 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:06:46.768 11:30:16 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:06:46.768 11:30:16 -- nvmf/run.sh@24 -- # local timen=1 00:06:46.768 11:30:16 -- nvmf/run.sh@25 -- # local core=0x1 00:06:46.768 11:30:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:46.768 11:30:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:06:46.768 11:30:16 -- nvmf/run.sh@29 -- # printf %02d 5 00:06:46.768 11:30:16 -- nvmf/run.sh@29 -- # port=4405 00:06:46.768 11:30:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:46.768 11:30:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:06:46.768 11:30:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:46.768 11:30:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:06:46.768 [2024-07-21 11:30:16.063583] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:46.768 [2024-07-21 11:30:16.063648] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2061467 ] 00:06:46.768 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.026 [2024-07-21 11:30:16.313932] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.026 [2024-07-21 11:30:16.340904] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:47.026 [2024-07-21 11:30:16.341032] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.026 [2024-07-21 11:30:16.392596] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:47.026 [2024-07-21 11:30:16.408930] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:06:47.026 INFO: Running with entropic power schedule (0xFF, 100). 00:06:47.026 INFO: Seed: 1267918301 00:06:47.026 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:47.026 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:47.026 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:47.026 INFO: A corpus is not provided, starting from an empty corpus 00:06:47.026 #2 INITED exec/s: 0 rss: 60Mb 00:06:47.026 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:06:47.026 This may also happen if the target rejected all inputs we tried so far 00:06:47.284 [2024-07-21 11:30:16.480453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.284 [2024-07-21 11:30:16.480490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.284 [2024-07-21 11:30:16.480569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.284 [2024-07-21 11:30:16.480588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.284 [2024-07-21 11:30:16.480659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.284 [2024-07-21 11:30:16.480675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.542 NEW_FUNC[1/671]: 0x4a6970 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:06:47.542 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:47.542 #3 NEW cov: 11538 ft: 11539 corp: 2/28b lim: 45 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:06:47.542 [2024-07-21 11:30:16.820685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.820735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.542 [2024-07-21 11:30:16.820877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.820913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.542 [2024-07-21 11:30:16.821037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.821056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.542 #4 NEW cov: 11656 ft: 12260 corp: 3/55b lim: 45 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 CrossOver- 00:06:47.542 [2024-07-21 11:30:16.870713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.870744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.542 [2024-07-21 11:30:16.870867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.870885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.542 [2024-07-21 11:30:16.871004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.871022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.542 #5 NEW cov: 11662 ft: 12621 corp: 4/85b lim: 45 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:06:47.542 [2024-07-21 11:30:16.910769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.910796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.542 [2024-07-21 11:30:16.910921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77770a77 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.910938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.542 [2024-07-21 11:30:16.911059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.911079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.542 #6 NEW cov: 11747 ft: 12896 corp: 5/113b lim: 45 exec/s: 0 rss: 68Mb L: 28/30 MS: 1 InsertByte- 00:06:47.542 [2024-07-21 11:30:16.960617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.542 [2024-07-21 11:30:16.960645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.543 [2024-07-21 11:30:16.960759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.543 [2024-07-21 11:30:16.960776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.800 #7 NEW cov: 11747 ft: 13247 corp: 6/136b lim: 45 exec/s: 0 rss: 68Mb L: 23/30 MS: 1 EraseBytes- 00:06:47.800 [2024-07-21 11:30:17.001291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.001320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.800 [2024-07-21 11:30:17.001450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.001468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.800 [2024-07-21 11:30:17.001573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.001589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.800 [2024-07-21 11:30:17.001707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.001724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:47.800 #8 NEW cov: 11747 ft: 13634 corp: 7/175b lim: 45 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:06:47.800 [2024-07-21 11:30:17.050825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.050853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.800 [2024-07-21 11:30:17.050980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.050999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.800 #9 NEW cov: 11747 ft: 13728 corp: 8/198b lim: 45 exec/s: 0 rss: 69Mb L: 23/39 MS: 1 ShuffleBytes- 00:06:47.800 [2024-07-21 11:30:17.101589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.101615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.800 [2024-07-21 11:30:17.101727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77770a77 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.101742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.800 [2024-07-21 11:30:17.101871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.800 [2024-07-21 11:30:17.101886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.801 [2024-07-21 11:30:17.101985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.801 [2024-07-21 11:30:17.102001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:47.801 #10 NEW cov: 11747 ft: 13760 corp: 9/241b lim: 45 exec/s: 0 rss: 69Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:06:47.801 [2024-07-21 11:30:17.150938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.801 [2024-07-21 11:30:17.150966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.801 #14 NEW cov: 11747 ft: 14519 corp: 10/257b lim: 45 
exec/s: 0 rss: 69Mb L: 16/43 MS: 4 ChangeBit-CrossOver-EraseBytes-InsertRepeatedBytes- 00:06:47.801 [2024-07-21 11:30:17.191534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.801 [2024-07-21 11:30:17.191561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.801 [2024-07-21 11:30:17.191677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.801 [2024-07-21 11:30:17.191694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.801 [2024-07-21 11:30:17.191817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:777777a2 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.801 [2024-07-21 11:30:17.191834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.801 #15 NEW cov: 11747 ft: 14587 corp: 11/284b lim: 45 exec/s: 0 rss: 69Mb L: 27/43 MS: 1 ChangeByte- 00:06:48.058 [2024-07-21 11:30:17.231715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77890004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.058 [2024-07-21 11:30:17.231741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.058 [2024-07-21 11:30:17.231872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.058 [2024-07-21 11:30:17.231888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.058 [2024-07-21 11:30:17.232001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.058 [2024-07-21 11:30:17.232016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.058 #21 NEW cov: 11747 ft: 14600 corp: 12/314b lim: 45 exec/s: 0 rss: 69Mb L: 30/43 MS: 1 ChangeBinInt- 00:06:48.058 [2024-07-21 11:30:17.271775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.058 [2024-07-21 11:30:17.271801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.058 [2024-07-21 11:30:17.271926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77770a77 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.058 [2024-07-21 11:30:17.271948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.058 [2024-07-21 11:30:17.272066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.058 [2024-07-21 11:30:17.272084] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.059 #22 NEW cov: 11747 ft: 14605 corp: 13/342b lim: 45 exec/s: 0 rss: 69Mb L: 28/43 MS: 1 ShuffleBytes- 00:06:48.059 [2024-07-21 11:30:17.311918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7777c077 cdw11:77890004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.311944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.059 [2024-07-21 11:30:17.312058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.312075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.059 [2024-07-21 11:30:17.312193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.312210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.059 #23 NEW cov: 11747 ft: 14627 corp: 14/372b lim: 45 exec/s: 0 rss: 69Mb L: 30/43 MS: 1 ChangeByte- 00:06:48.059 [2024-07-21 11:30:17.352013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.352040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.059 [2024-07-21 11:30:17.352160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.352187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.059 [2024-07-21 11:30:17.352310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777773 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.352328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.059 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:48.059 #24 NEW cov: 11770 ft: 14674 corp: 15/399b lim: 45 exec/s: 0 rss: 69Mb L: 27/43 MS: 1 ChangeBit- 00:06:48.059 [2024-07-21 11:30:17.392185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.392213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.059 [2024-07-21 11:30:17.392326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.392343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.059 [2024-07-21 
11:30:17.392455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:0a770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.392472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.059 #25 NEW cov: 11770 ft: 14691 corp: 16/427b lim: 45 exec/s: 0 rss: 69Mb L: 28/43 MS: 1 CopyPart- 00:06:48.059 [2024-07-21 11:30:17.442126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.442153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.059 [2024-07-21 11:30:17.442279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.059 [2024-07-21 11:30:17.442295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.059 #26 NEW cov: 11770 ft: 14741 corp: 17/448b lim: 45 exec/s: 26 rss: 69Mb L: 21/43 MS: 1 EraseBytes- 00:06:48.316 [2024-07-21 11:30:17.492010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.492039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.316 #27 NEW cov: 11770 ft: 14782 corp: 18/460b lim: 45 exec/s: 27 rss: 69Mb L: 12/43 MS: 1 EraseBytes- 00:06:48.316 [2024-07-21 11:30:17.542160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.542187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.316 #28 NEW cov: 11770 ft: 14797 corp: 19/471b lim: 45 exec/s: 28 rss: 70Mb L: 11/43 MS: 1 EraseBytes- 00:06:48.316 [2024-07-21 11:30:17.582786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.582812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.316 [2024-07-21 11:30:17.582936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777700 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.582953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.316 [2024-07-21 11:30:17.583070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.583087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.316 #29 NEW cov: 11770 ft: 14824 corp: 20/502b lim: 45 exec/s: 29 rss: 70Mb L: 31/43 MS: 1 InsertByte- 00:06:48.316 [2024-07-21 11:30:17.622970] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7777c077 cdw11:77890004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.622998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.316 [2024-07-21 11:30:17.623113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.623131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.316 [2024-07-21 11:30:17.623248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.623263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.316 #30 NEW cov: 11770 ft: 14836 corp: 21/532b lim: 45 exec/s: 30 rss: 70Mb L: 30/43 MS: 1 CopyPart- 00:06:48.316 [2024-07-21 11:30:17.662481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.662511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.316 #31 NEW cov: 11770 ft: 14949 corp: 22/549b lim: 45 exec/s: 31 rss: 70Mb L: 17/43 MS: 1 EraseBytes- 00:06:48.316 [2024-07-21 11:30:17.713204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.713235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.316 [2024-07-21 11:30:17.713356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.316 [2024-07-21 11:30:17.713373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.317 [2024-07-21 11:30:17.713486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.317 [2024-07-21 11:30:17.713504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.317 #32 NEW cov: 11770 ft: 14958 corp: 23/576b lim: 45 exec/s: 32 rss: 70Mb L: 27/43 MS: 1 CrossOver- 00:06:48.575 [2024-07-21 11:30:17.753344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7777c077 cdw11:77890004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.753371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.753494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.753511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.753624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.753639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.575 #33 NEW cov: 11770 ft: 14975 corp: 24/606b lim: 45 exec/s: 33 rss: 70Mb L: 30/43 MS: 1 ShuffleBytes- 00:06:48.575 [2024-07-21 11:30:17.793507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7777c077 cdw11:77890004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.793534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.793657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.793674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.793795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.793813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.575 #34 NEW cov: 11770 ft: 14994 corp: 25/636b lim: 45 exec/s: 34 rss: 70Mb L: 30/43 MS: 1 ChangeBit- 00:06:48.575 [2024-07-21 11:30:17.833661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.833688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.833817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777700 cdw11:f7770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.833835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.833972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.833990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.575 #40 NEW cov: 11770 ft: 15074 corp: 26/667b lim: 45 exec/s: 40 rss: 70Mb L: 31/43 MS: 1 ChangeBit- 00:06:48.575 [2024-07-21 11:30:17.883754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.883782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.883920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777773 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:06:48.575 [2024-07-21 11:30:17.883940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.884076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.884093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.575 #41 NEW cov: 11770 ft: 15077 corp: 27/694b lim: 45 exec/s: 41 rss: 70Mb L: 27/43 MS: 1 CrossOver- 00:06:48.575 [2024-07-21 11:30:17.924107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.924135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.924258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77770a77 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.924276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.924400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.924418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.924531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.924548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.575 #42 NEW cov: 11770 ft: 15109 corp: 28/737b lim: 45 exec/s: 42 rss: 70Mb L: 43/43 MS: 1 ChangeByte- 00:06:48.575 [2024-07-21 11:30:17.974242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.974269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.974395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.974413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.974516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.575 [2024-07-21 11:30:17.974534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.575 [2024-07-21 11:30:17.974649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:53535353 cdw11:53530003 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:06:48.575 [2024-07-21 11:30:17.974667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.575 #43 NEW cov: 11770 ft: 15111 corp: 29/777b lim: 45 exec/s: 43 rss: 70Mb L: 40/43 MS: 1 InsertRepeatedBytes- 00:06:48.833 [2024-07-21 11:30:18.013842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.013869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.013989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777700 cdw11:77be0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.014008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.833 #44 NEW cov: 11770 ft: 15121 corp: 30/796b lim: 45 exec/s: 44 rss: 70Mb L: 19/43 MS: 1 EraseBytes- 00:06:48.833 [2024-07-21 11:30:18.053872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.053899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.054014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.054029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.833 #45 NEW cov: 11770 ft: 15122 corp: 31/819b lim: 45 exec/s: 45 rss: 70Mb L: 23/43 MS: 1 EraseBytes- 00:06:48.833 [2024-07-21 11:30:18.094258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7777c077 cdw11:77890004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.094287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.094400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77230003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.094417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.094536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.094554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.833 #46 NEW cov: 11770 ft: 15137 corp: 32/849b lim: 45 exec/s: 46 rss: 70Mb L: 30/43 MS: 1 ChangeByte- 00:06:48.833 [2024-07-21 11:30:18.144465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:bebe0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.144492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.144602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.144618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.144735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.144751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.144871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:77777777 cdw11:77770005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.144887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.833 #47 NEW cov: 11770 ft: 15187 corp: 33/887b lim: 45 exec/s: 47 rss: 70Mb L: 38/43 MS: 1 CopyPart- 00:06:48.833 [2024-07-21 11:30:18.184802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.184830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.184948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77770a77 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.184965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.185074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.185090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.185207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.185224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.833 #48 NEW cov: 11770 ft: 15200 corp: 34/931b lim: 45 exec/s: 48 rss: 70Mb L: 44/44 MS: 1 InsertByte- 00:06:48.833 [2024-07-21 11:30:18.234718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.234747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.234868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:7777f777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.234885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.833 [2024-07-21 11:30:18.234996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.833 [2024-07-21 11:30:18.235014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.833 #49 NEW cov: 11770 ft: 15204 corp: 35/958b lim: 45 exec/s: 49 rss: 70Mb L: 27/44 MS: 1 ChangeBit- 00:06:49.093 [2024-07-21 11:30:18.274826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.274854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.274973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:dadadada cdw11:dada0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.274990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.275109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:dadadada cdw11:daff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.275124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.093 #50 NEW cov: 11770 ft: 15216 corp: 36/990b lim: 45 exec/s: 50 rss: 70Mb L: 32/44 MS: 1 InsertRepeatedBytes- 00:06:49.093 [2024-07-21 11:30:18.314988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.315016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.315130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.315147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.315268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:777777a2 cdw11:777f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.315285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.093 #51 NEW cov: 11770 ft: 15224 corp: 37/1017b lim: 45 exec/s: 51 rss: 70Mb L: 27/44 MS: 1 ChangeBit- 00:06:49.093 [2024-07-21 11:30:18.355035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7777c077 cdw11:77890004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.355061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.355178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:06:49.093 [2024-07-21 11:30:18.355194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.355308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77be7777 cdw11:bebe0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.355324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.093 #52 NEW cov: 11770 ft: 15230 corp: 38/1044b lim: 45 exec/s: 52 rss: 70Mb L: 27/44 MS: 1 EraseBytes- 00:06:49.093 [2024-07-21 11:30:18.394620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff7777 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.394648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.093 #53 NEW cov: 11770 ft: 15239 corp: 39/1056b lim: 45 exec/s: 53 rss: 70Mb L: 12/44 MS: 1 CrossOver- 00:06:49.093 [2024-07-21 11:30:18.445462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.445489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.445613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77770a77 cdw11:77770001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.445630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.445744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.445766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.093 [2024-07-21 11:30:18.445882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.093 [2024-07-21 11:30:18.445899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.093 #54 NEW cov: 11770 ft: 15248 corp: 40/1099b lim: 45 exec/s: 27 rss: 70Mb L: 43/44 MS: 1 ChangeBit- 00:06:49.093 #54 DONE cov: 11770 ft: 15248 corp: 40/1099b lim: 45 exec/s: 27 rss: 70Mb 00:06:49.093 Done 54 runs in 2 second(s) 00:06:49.352 11:30:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:06:49.352 11:30:18 -- ../common.sh@72 -- # (( i++ )) 00:06:49.352 11:30:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:49.352 11:30:18 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:06:49.352 11:30:18 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:06:49.352 11:30:18 -- nvmf/run.sh@24 -- # local timen=1 00:06:49.352 11:30:18 -- nvmf/run.sh@25 -- # local core=0x1 00:06:49.352 11:30:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:49.352 11:30:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:06:49.352 11:30:18 -- 
nvmf/run.sh@29 -- # printf %02d 6 00:06:49.352 11:30:18 -- nvmf/run.sh@29 -- # port=4406 00:06:49.352 11:30:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:49.352 11:30:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:06:49.352 11:30:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:49.352 11:30:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:06:49.352 [2024-07-21 11:30:18.620012] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:49.352 [2024-07-21 11:30:18.620087] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2061993 ] 00:06:49.352 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.610 [2024-07-21 11:30:18.799018] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.610 [2024-07-21 11:30:18.818865] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:49.610 [2024-07-21 11:30:18.818985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.610 [2024-07-21 11:30:18.870495] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:49.610 [2024-07-21 11:30:18.886822] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:06:49.610 INFO: Running with entropic power schedule (0xFF, 100). 00:06:49.610 INFO: Seed: 3746940528 00:06:49.610 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:49.610 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:49.610 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:49.610 INFO: A corpus is not provided, starting from an empty corpus 00:06:49.610 #2 INITED exec/s: 0 rss: 61Mb 00:06:49.610 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:06:49.610 This may also happen if the target rejected all inputs we tried so far 00:06:49.610 [2024-07-21 11:30:18.932008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:49.610 [2024-07-21 11:30:18.932036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.869 NEW_FUNC[1/669]: 0x4a9180 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:06:49.869 NEW_FUNC[2/669]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:49.869 #9 NEW cov: 11460 ft: 11461 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 2 ShuffleBytes-CrossOver- 00:06:49.869 [2024-07-21 11:30:19.242726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e90a cdw11:00000000 00:06:49.869 [2024-07-21 11:30:19.242758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.869 #10 NEW cov: 11573 ft: 11993 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:06:49.869 [2024-07-21 11:30:19.282781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:49.869 [2024-07-21 11:30:19.282807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.128 #12 NEW cov: 11579 ft: 12166 corp: 4/8b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 2 ShuffleBytes-CrossOver- 00:06:50.128 [2024-07-21 11:30:19.312879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e10a cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.312904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.128 #13 NEW cov: 11664 ft: 12365 corp: 5/10b lim: 10 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeBit- 00:06:50.128 [2024-07-21 11:30:19.352979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.353003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.128 #14 NEW cov: 11664 ft: 12483 corp: 6/13b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 InsertByte- 00:06:50.128 [2024-07-21 11:30:19.393094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e10a cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.393119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.128 #15 NEW cov: 11664 ft: 12567 corp: 7/16b lim: 10 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:06:50.128 [2024-07-21 11:30:19.433234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a3d cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.433258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.128 #16 NEW cov: 11664 ft: 12661 corp: 8/19b lim: 10 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:06:50.128 [2024-07-21 
11:30:19.473684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e1a7 cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.473710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.128 [2024-07-21 11:30:19.473761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a7a7 cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.473775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.128 [2024-07-21 11:30:19.473826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a7a7 cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.473839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.128 [2024-07-21 11:30:19.473890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000a70a cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.473903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.128 #17 NEW cov: 11664 ft: 13003 corp: 9/28b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:06:50.128 [2024-07-21 11:30:19.513413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a5d cdw11:00000000 00:06:50.128 [2024-07-21 11:30:19.513437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.128 #18 NEW cov: 11664 ft: 13033 corp: 10/31b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 ChangeByte- 00:06:50.387 [2024-07-21 11:30:19.553569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.387 [2024-07-21 11:30:19.553595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.387 #19 NEW cov: 11664 ft: 13152 corp: 11/34b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 ChangeByte- 00:06:50.387 [2024-07-21 11:30:19.593633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dc0a cdw11:00000000 00:06:50.387 [2024-07-21 11:30:19.593658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.387 #20 NEW cov: 11664 ft: 13189 corp: 12/37b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 ShuffleBytes- 00:06:50.387 [2024-07-21 11:30:19.633769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e90b cdw11:00000000 00:06:50.387 [2024-07-21 11:30:19.633796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.387 #21 NEW cov: 11664 ft: 13245 corp: 13/39b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 ChangeBinInt- 00:06:50.387 [2024-07-21 11:30:19.674007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.387 [2024-07-21 11:30:19.674033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.387 [2024-07-21 
11:30:19.674084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.387 [2024-07-21 11:30:19.674098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.387 #22 NEW cov: 11664 ft: 13403 corp: 14/43b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 CopyPart- 00:06:50.387 [2024-07-21 11:30:19.713985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.387 [2024-07-21 11:30:19.714010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.387 #23 NEW cov: 11664 ft: 13439 corp: 15/46b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 ShuffleBytes- 00:06:50.387 [2024-07-21 11:30:19.754092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e10c cdw11:00000000 00:06:50.387 [2024-07-21 11:30:19.754118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.387 #24 NEW cov: 11664 ft: 13488 corp: 16/49b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 ChangeBinInt- 00:06:50.387 [2024-07-21 11:30:19.794218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:06:50.387 [2024-07-21 11:30:19.794244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.646 #25 NEW cov: 11664 ft: 13505 corp: 17/52b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 ChangeBit- 00:06:50.646 [2024-07-21 11:30:19.834341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.834367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.646 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:50.646 #26 NEW cov: 11687 ft: 13632 corp: 18/55b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 ChangeByte- 00:06:50.646 [2024-07-21 11:30:19.874459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e10a cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.874484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.646 #27 NEW cov: 11687 ft: 13737 corp: 19/57b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 CopyPart- 00:06:50.646 [2024-07-21 11:30:19.914602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ae1 cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.914628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.646 #28 NEW cov: 11687 ft: 13768 corp: 20/60b lim: 10 exec/s: 28 rss: 69Mb L: 3/9 MS: 1 ShuffleBytes- 00:06:50.646 [2024-07-21 11:30:19.944898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.944924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.646 [2024-07-21 
11:30:19.944992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.945006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.646 [2024-07-21 11:30:19.945059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.945072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.646 #29 NEW cov: 11687 ft: 13914 corp: 21/67b lim: 10 exec/s: 29 rss: 69Mb L: 7/9 MS: 1 CopyPart- 00:06:50.646 [2024-07-21 11:30:19.985016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e90b cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.985042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.646 [2024-07-21 11:30:19.985096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.985109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.646 [2024-07-21 11:30:19.985159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:06:50.646 [2024-07-21 11:30:19.985173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.646 #30 NEW cov: 11687 ft: 13929 corp: 22/73b lim: 10 exec/s: 30 rss: 69Mb L: 6/9 MS: 1 InsertRepeatedBytes- 00:06:50.646 [2024-07-21 11:30:20.025093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a02 cdw11:00000000 00:06:50.646 [2024-07-21 11:30:20.025121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.646 [2024-07-21 11:30:20.025173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.646 [2024-07-21 11:30:20.025203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.646 #31 NEW cov: 11687 ft: 13955 corp: 23/77b lim: 10 exec/s: 31 rss: 69Mb L: 4/9 MS: 1 ChangeBit- 00:06:50.646 [2024-07-21 11:30:20.065056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e90a cdw11:00000000 00:06:50.646 [2024-07-21 11:30:20.065083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.905 #32 NEW cov: 11687 ft: 13970 corp: 24/79b lim: 10 exec/s: 32 rss: 69Mb L: 2/9 MS: 1 ShuffleBytes- 00:06:50.905 [2024-07-21 11:30:20.105186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e10a cdw11:00000000 00:06:50.905 [2024-07-21 11:30:20.105212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.905 #33 NEW cov: 11687 ft: 13985 corp: 25/82b lim: 10 exec/s: 33 rss: 70Mb L: 3/9 MS: 1 ShuffleBytes- 00:06:50.905 [2024-07-21 
11:30:20.145314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d20a cdw11:00000000 00:06:50.905 [2024-07-21 11:30:20.145339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.905 #34 NEW cov: 11687 ft: 14022 corp: 26/85b lim: 10 exec/s: 34 rss: 70Mb L: 3/9 MS: 1 ChangeByte- 00:06:50.905 [2024-07-21 11:30:20.185381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e10c cdw11:00000000 00:06:50.905 [2024-07-21 11:30:20.185406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.905 #35 NEW cov: 11687 ft: 14033 corp: 27/88b lim: 10 exec/s: 35 rss: 70Mb L: 3/9 MS: 1 CrossOver- 00:06:50.905 [2024-07-21 11:30:20.225553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:06:50.905 [2024-07-21 11:30:20.225578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.905 #36 NEW cov: 11687 ft: 14071 corp: 28/91b lim: 10 exec/s: 36 rss: 70Mb L: 3/9 MS: 1 CMP- DE: "\377\377"- 00:06:50.905 [2024-07-21 11:30:20.265599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:50.905 [2024-07-21 11:30:20.265624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.905 #37 NEW cov: 11687 ft: 14089 corp: 29/93b lim: 10 exec/s: 37 rss: 70Mb L: 2/9 MS: 1 EraseBytes- 00:06:50.905 [2024-07-21 11:30:20.305696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dc0a cdw11:00000000 00:06:50.905 [2024-07-21 11:30:20.305722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.164 #38 NEW cov: 11687 ft: 14104 corp: 30/96b lim: 10 exec/s: 38 rss: 70Mb L: 3/9 MS: 1 ChangeByte- 00:06:51.164 [2024-07-21 11:30:20.345886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a3d cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.345911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.164 #39 NEW cov: 11687 ft: 14140 corp: 31/99b lim: 10 exec/s: 39 rss: 70Mb L: 3/9 MS: 1 ChangeBit- 00:06:51.164 [2024-07-21 11:30:20.385956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.385980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.164 #40 NEW cov: 11687 ft: 14152 corp: 32/101b lim: 10 exec/s: 40 rss: 70Mb L: 2/9 MS: 1 CrossOver- 00:06:51.164 [2024-07-21 11:30:20.426412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.426436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.164 [2024-07-21 11:30:20.426507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.426521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.164 [2024-07-21 11:30:20.426571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.426584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.164 [2024-07-21 11:30:20.426635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.426649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.164 #41 NEW cov: 11687 ft: 14164 corp: 33/110b lim: 10 exec/s: 41 rss: 70Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:06:51.164 [2024-07-21 11:30:20.466301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e15b cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.466326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.164 [2024-07-21 11:30:20.466379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000c0a cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.466392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.164 #42 NEW cov: 11687 ft: 14165 corp: 34/114b lim: 10 exec/s: 42 rss: 70Mb L: 4/9 MS: 1 InsertByte- 00:06:51.164 [2024-07-21 11:30:20.506753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.506778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.164 [2024-07-21 11:30:20.506829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.506842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.164 [2024-07-21 11:30:20.506892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.506905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.164 [2024-07-21 11:30:20.506955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002900 cdw11:00000000 00:06:51.164 [2024-07-21 11:30:20.506967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.164 [2024-07-21 11:30:20.507016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:51.165 [2024-07-21 11:30:20.507029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.165 #43 NEW cov: 11687 ft: 14220 corp: 35/124b lim: 10 exec/s: 43 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:06:51.165 
[2024-07-21 11:30:20.546653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006f0a cdw11:00000000 00:06:51.165 [2024-07-21 11:30:20.546678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.165 [2024-07-21 11:30:20.546731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:51.165 [2024-07-21 11:30:20.546744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.165 [2024-07-21 11:30:20.546796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:51.165 [2024-07-21 11:30:20.546808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.165 #44 NEW cov: 11687 ft: 14230 corp: 36/131b lim: 10 exec/s: 44 rss: 70Mb L: 7/10 MS: 1 ChangeByte- 00:06:51.165 [2024-07-21 11:30:20.586560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dc0a cdw11:00000000 00:06:51.165 [2024-07-21 11:30:20.586585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.423 #45 NEW cov: 11687 ft: 14247 corp: 37/134b lim: 10 exec/s: 45 rss: 70Mb L: 3/10 MS: 1 CopyPart- 00:06:51.423 [2024-07-21 11:30:20.626876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e9e1 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.626901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.626968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a5b cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.626982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.627033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000c0a cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.627046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.423 #46 NEW cov: 11687 ft: 14259 corp: 38/140b lim: 10 exec/s: 46 rss: 70Mb L: 6/10 MS: 1 CrossOver- 00:06:51.423 [2024-07-21 11:30:20.666996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.667021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.667073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.667086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.667138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.667152] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.423 #47 NEW cov: 11687 ft: 14264 corp: 39/147b lim: 10 exec/s: 47 rss: 70Mb L: 7/10 MS: 1 EraseBytes- 00:06:51.423 [2024-07-21 11:30:20.707138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e9e1 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.707162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.707214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a06 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.707227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.707279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000c0a cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.707291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.423 #48 NEW cov: 11687 ft: 14270 corp: 40/153b lim: 10 exec/s: 48 rss: 70Mb L: 6/10 MS: 1 ChangeBinInt- 00:06:51.423 [2024-07-21 11:30:20.747029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a3d cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.747054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.423 #49 NEW cov: 11687 ft: 14274 corp: 41/156b lim: 10 exec/s: 49 rss: 70Mb L: 3/10 MS: 1 ChangeBit- 00:06:51.423 [2024-07-21 11:30:20.787567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.787591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.787644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.787660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.787712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.787725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.787777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.787789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.787839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000290a cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.787852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.423 #50 NEW cov: 11687 ft: 14279 corp: 42/166b lim: 10 exec/s: 50 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 
00:06:51.423 [2024-07-21 11:30:20.827682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e10a cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.827708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.827777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000aca cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.827801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.423 [2024-07-21 11:30:20.827851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000caca cdw11:00000000 00:06:51.423 [2024-07-21 11:30:20.827864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.424 [2024-07-21 11:30:20.827914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000caca cdw11:00000000 00:06:51.424 [2024-07-21 11:30:20.827927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.424 [2024-07-21 11:30:20.827976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000caca cdw11:00000000 00:06:51.424 [2024-07-21 11:30:20.827989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.424 #51 NEW cov: 11687 ft: 14284 corp: 43/176b lim: 10 exec/s: 51 rss: 70Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:06:51.682 [2024-07-21 11:30:20.867593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006f0a cdw11:00000000 00:06:51.682 [2024-07-21 11:30:20.867618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.682 [2024-07-21 11:30:20.867686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004a0a cdw11:00000000 00:06:51.682 [2024-07-21 11:30:20.867700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.682 [2024-07-21 11:30:20.867759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:51.682 [2024-07-21 11:30:20.867772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.682 #52 NEW cov: 11687 ft: 14291 corp: 44/183b lim: 10 exec/s: 52 rss: 70Mb L: 7/10 MS: 1 ChangeBit- 00:06:51.682 [2024-07-21 11:30:20.907448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000cdc cdw11:00000000 00:06:51.682 [2024-07-21 11:30:20.907472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.682 #53 NEW cov: 11687 ft: 14364 corp: 45/185b lim: 10 exec/s: 26 rss: 70Mb L: 2/10 MS: 1 EraseBytes- 00:06:51.682 #53 DONE cov: 11687 ft: 14364 corp: 45/185b lim: 10 exec/s: 26 rss: 70Mb 00:06:51.682 ###### Recommended dictionary. 
###### 00:06:51.682 "\377\377" # Uses: 0 00:06:51.682 ###### End of recommended dictionary. ###### 00:06:51.682 Done 53 runs in 2 second(s) 00:06:51.682 11:30:21 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:06:51.682 11:30:21 -- ../common.sh@72 -- # (( i++ )) 00:06:51.682 11:30:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:51.682 11:30:21 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:06:51.682 11:30:21 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:06:51.682 11:30:21 -- nvmf/run.sh@24 -- # local timen=1 00:06:51.682 11:30:21 -- nvmf/run.sh@25 -- # local core=0x1 00:06:51.682 11:30:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:51.682 11:30:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:06:51.682 11:30:21 -- nvmf/run.sh@29 -- # printf %02d 7 00:06:51.682 11:30:21 -- nvmf/run.sh@29 -- # port=4407 00:06:51.682 11:30:21 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:51.682 11:30:21 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:06:51.682 11:30:21 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:51.682 11:30:21 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:06:51.682 [2024-07-21 11:30:21.085294] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:51.682 [2024-07-21 11:30:21.085363] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2062306 ] 00:06:51.940 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.940 [2024-07-21 11:30:21.269961] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.940 [2024-07-21 11:30:21.290206] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:51.940 [2024-07-21 11:30:21.290348] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.940 [2024-07-21 11:30:21.342013] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:51.940 [2024-07-21 11:30:21.358360] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:06:52.198 INFO: Running with entropic power schedule (0xFF, 100). 00:06:52.198 INFO: Seed: 1923953397 00:06:52.198 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:52.198 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:52.198 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:52.198 INFO: A corpus is not provided, starting from an empty corpus 00:06:52.198 #2 INITED exec/s: 0 rss: 60Mb 00:06:52.198 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:06:52.198 This may also happen if the target rejected all inputs we tried so far 00:06:52.198 [2024-07-21 11:30:21.434288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a0a cdw11:00000000 00:06:52.198 [2024-07-21 11:30:21.434326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.456 NEW_FUNC[1/669]: 0x4a9b70 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:06:52.456 NEW_FUNC[2/669]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:52.456 #3 NEW cov: 11460 ft: 11461 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:06:52.456 [2024-07-21 11:30:21.775293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002828 cdw11:00000000 00:06:52.456 [2024-07-21 11:30:21.775344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.456 #7 NEW cov: 11573 ft: 12068 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 4 CopyPart-ChangeByte-ChangeBit-CopyPart- 00:06:52.456 [2024-07-21 11:30:21.815214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:06:52.456 [2024-07-21 11:30:21.815244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.456 #8 NEW cov: 11579 ft: 12428 corp: 4/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:06:52.456 [2024-07-21 11:30:21.855277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002808 cdw11:00000000 00:06:52.456 [2024-07-21 11:30:21.855306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.714 #9 NEW cov: 11664 ft: 12782 corp: 5/9b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:06:52.714 [2024-07-21 11:30:21.895413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002808 cdw11:00000000 00:06:52.714 [2024-07-21 11:30:21.895440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.714 #10 NEW cov: 11664 ft: 12873 corp: 6/12b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 CopyPart- 00:06:52.714 [2024-07-21 11:30:21.935580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002808 cdw11:00000000 00:06:52.714 [2024-07-21 11:30:21.935607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.714 #11 NEW cov: 11664 ft: 12992 corp: 7/15b lim: 10 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 ChangeByte- 00:06:52.714 [2024-07-21 11:30:21.975740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:06:52.714 [2024-07-21 11:30:21.975768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.714 #12 NEW cov: 11664 ft: 13070 corp: 8/17b lim: 10 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:06:52.715 [2024-07-21 
11:30:22.015800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:06:52.715 [2024-07-21 11:30:22.015830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.715 #13 NEW cov: 11664 ft: 13104 corp: 9/19b lim: 10 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:06:52.715 [2024-07-21 11:30:22.055986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:52.715 [2024-07-21 11:30:22.056013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.715 #14 NEW cov: 11664 ft: 13187 corp: 10/21b lim: 10 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:06:52.715 [2024-07-21 11:30:22.096755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:52.715 [2024-07-21 11:30:22.096783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.715 [2024-07-21 11:30:22.096906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:52.715 [2024-07-21 11:30:22.096922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.715 [2024-07-21 11:30:22.097033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:52.715 [2024-07-21 11:30:22.097052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.715 [2024-07-21 11:30:22.097164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002808 cdw11:00000000 00:06:52.715 [2024-07-21 11:30:22.097181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.715 #15 NEW cov: 11664 ft: 13526 corp: 11/30b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:06:52.715 [2024-07-21 11:30:22.136346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a82 cdw11:00000000 00:06:52.715 [2024-07-21 11:30:22.136374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.973 #17 NEW cov: 11664 ft: 13564 corp: 12/32b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 2 EraseBytes-InsertByte- 00:06:52.973 [2024-07-21 11:30:22.176301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a4a cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.176327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.973 #18 NEW cov: 11664 ft: 13588 corp: 13/34b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 ChangeBit- 00:06:52.973 [2024-07-21 11:30:22.216650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000da28 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.216676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.973 [2024-07-21 
11:30:22.216792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000828 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.216810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.973 #22 NEW cov: 11664 ft: 13764 corp: 14/38b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 4 ShuffleBytes-ChangeByte-ChangeBit-CrossOver- 00:06:52.973 [2024-07-21 11:30:22.257022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.257049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.973 [2024-07-21 11:30:22.257163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.257178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.973 [2024-07-21 11:30:22.257293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002808 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.257310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.973 #23 NEW cov: 11664 ft: 13922 corp: 15/45b lim: 10 exec/s: 0 rss: 69Mb L: 7/9 MS: 1 CMP- DE: "\177\000\000\000"- 00:06:52.973 [2024-07-21 11:30:22.297070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.297098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.973 [2024-07-21 11:30:22.297216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.297233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.973 [2024-07-21 11:30:22.297343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002808 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.297359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.973 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:52.973 #24 NEW cov: 11687 ft: 13966 corp: 16/51b lim: 10 exec/s: 0 rss: 69Mb L: 6/9 MS: 1 PersAutoDict- DE: "\177\000\000\000"- 00:06:52.973 [2024-07-21 11:30:22.336980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002828 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.337007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.973 [2024-07-21 11:30:22.337127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002a0a cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.337154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.973 #25 NEW cov: 11687 ft: 14003 corp: 17/55b 
lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 CrossOver- 00:06:52.973 [2024-07-21 11:30:22.377241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.377268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.973 [2024-07-21 11:30:22.377378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.377395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.973 [2024-07-21 11:30:22.377512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004128 cdw11:00000000 00:06:52.973 [2024-07-21 11:30:22.377528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.231 #26 NEW cov: 11687 ft: 14027 corp: 18/62b lim: 10 exec/s: 26 rss: 70Mb L: 7/9 MS: 1 InsertByte- 00:06:53.231 [2024-07-21 11:30:22.417061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002828 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.417087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.232 #27 NEW cov: 11687 ft: 14035 corp: 19/64b lim: 10 exec/s: 27 rss: 70Mb L: 2/9 MS: 1 EraseBytes- 00:06:53.232 [2024-07-21 11:30:22.457365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a82 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.457392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.457513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a82 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.457530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.232 #28 NEW cov: 11687 ft: 14077 corp: 20/68b lim: 10 exec/s: 28 rss: 70Mb L: 4/9 MS: 1 CopyPart- 00:06:53.232 [2024-07-21 11:30:22.497695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.497721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.497829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000028 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.497846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.497956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002808 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.497973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.232 #29 NEW cov: 11687 ft: 14087 corp: 21/75b lim: 10 exec/s: 29 rss: 70Mb L: 7/9 MS: 1 CopyPart- 00:06:53.232 [2024-07-21 11:30:22.538271] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.538299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.538401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.538417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.538535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000287f cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.538551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.538660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.538677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.538781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.538798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:53.232 #30 NEW cov: 11687 ft: 14133 corp: 22/85b lim: 10 exec/s: 30 rss: 70Mb L: 10/10 MS: 1 PersAutoDict- DE: "\177\000\000\000"- 00:06:53.232 [2024-07-21 11:30:22.577678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000da28 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.577706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.577807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000841 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.577824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.232 #31 NEW cov: 11687 ft: 14170 corp: 23/90b lim: 10 exec/s: 31 rss: 70Mb L: 5/10 MS: 1 InsertByte- 00:06:53.232 [2024-07-21 11:30:22.618399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.618426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.618531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.618546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.618662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.618677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 
11:30:22.618782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.618797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:53.232 [2024-07-21 11:30:22.618861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00002808 cdw11:00000000 00:06:53.232 [2024-07-21 11:30:22.618877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:53.232 #32 NEW cov: 11687 ft: 14211 corp: 24/100b lim: 10 exec/s: 32 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 00:06:53.490 [2024-07-21 11:30:22.658413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002aff cdw11:00000000 00:06:53.490 [2024-07-21 11:30:22.658440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.490 [2024-07-21 11:30:22.658555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:53.490 [2024-07-21 11:30:22.658571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.490 [2024-07-21 11:30:22.658689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:53.490 [2024-07-21 11:30:22.658706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.490 #33 NEW cov: 11687 ft: 14303 corp: 25/107b lim: 10 exec/s: 33 rss: 70Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:06:53.490 [2024-07-21 11:30:22.697856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:53.490 [2024-07-21 11:30:22.697883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.490 #34 NEW cov: 11687 ft: 14357 corp: 26/109b lim: 10 exec/s: 34 rss: 70Mb L: 2/10 MS: 1 ShuffleBytes- 00:06:53.490 [2024-07-21 11:30:22.738300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000282c cdw11:00000000 00:06:53.490 [2024-07-21 11:30:22.738326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.490 [2024-07-21 11:30:22.738435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002a0a cdw11:00000000 00:06:53.490 [2024-07-21 11:30:22.738455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.490 #35 NEW cov: 11687 ft: 14376 corp: 27/113b lim: 10 exec/s: 35 rss: 70Mb L: 4/10 MS: 1 ChangeBit- 00:06:53.490 [2024-07-21 11:30:22.778082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002808 cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.778110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 #36 NEW cov: 11687 ft: 14391 corp: 28/116b lim: 10 exec/s: 36 rss: 70Mb L: 3/10 MS: 1 ChangeBit- 00:06:53.491 
[2024-07-21 11:30:22.819115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.819142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 [2024-07-21 11:30:22.819253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.819269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.491 [2024-07-21 11:30:22.819382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.819398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.491 [2024-07-21 11:30:22.819518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.819534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:53.491 [2024-07-21 11:30:22.819645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00002808 cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.819662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:53.491 #37 NEW cov: 11687 ft: 14404 corp: 29/126b lim: 10 exec/s: 37 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 00:06:53.491 [2024-07-21 11:30:22.858591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.858619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.491 [2024-07-21 11:30:22.858723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.858738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.491 #39 NEW cov: 11687 ft: 14432 corp: 30/131b lim: 10 exec/s: 39 rss: 70Mb L: 5/10 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:06:53.491 [2024-07-21 11:30:22.898471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000820a cdw11:00000000 00:06:53.491 [2024-07-21 11:30:22.898500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.749 #40 NEW cov: 11687 ft: 14439 corp: 31/133b lim: 10 exec/s: 40 rss: 70Mb L: 2/10 MS: 1 CrossOver- 00:06:53.749 [2024-07-21 11:30:22.939012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ada cdw11:00000000 00:06:53.749 [2024-07-21 11:30:22.939040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.749 [2024-07-21 11:30:22.939159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002808 cdw11:00000000 00:06:53.749 [2024-07-21 11:30:22.939176] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.749 [2024-07-21 11:30:22.939290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004128 cdw11:00000000 00:06:53.749 [2024-07-21 11:30:22.939306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.749 #41 NEW cov: 11687 ft: 14441 corp: 32/139b lim: 10 exec/s: 41 rss: 70Mb L: 6/10 MS: 1 CrossOver- 00:06:53.749 [2024-07-21 11:30:22.988858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002828 cdw11:00000000 00:06:53.749 [2024-07-21 11:30:22.988887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.749 [2024-07-21 11:30:22.989001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002a2c cdw11:00000000 00:06:53.749 [2024-07-21 11:30:22.989018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.749 #42 NEW cov: 11687 ft: 14449 corp: 33/143b lim: 10 exec/s: 42 rss: 70Mb L: 4/10 MS: 1 ChangeByte- 00:06:53.749 [2024-07-21 11:30:23.029168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000adb cdw11:00000000 00:06:53.749 [2024-07-21 11:30:23.029197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.749 [2024-07-21 11:30:23.029308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a82 cdw11:00000000 00:06:53.749 [2024-07-21 11:30:23.029324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.749 #43 NEW cov: 11687 ft: 14458 corp: 34/147b lim: 10 exec/s: 43 rss: 70Mb L: 4/10 MS: 1 ChangeByte- 00:06:53.749 [2024-07-21 11:30:23.079024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a4a cdw11:00000000 00:06:53.749 [2024-07-21 11:30:23.079052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.749 #44 NEW cov: 11687 ft: 14461 corp: 35/150b lim: 10 exec/s: 44 rss: 70Mb L: 3/10 MS: 1 InsertByte- 00:06:53.749 [2024-07-21 11:30:23.119560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ab82 cdw11:00000000 00:06:53.749 [2024-07-21 11:30:23.119589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.749 [2024-07-21 11:30:23.119696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008282 cdw11:00000000 00:06:53.749 [2024-07-21 11:30:23.119715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.749 [2024-07-21 11:30:23.119830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008282 cdw11:00000000 00:06:53.749 [2024-07-21 11:30:23.119846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:06:53.749 #49 NEW cov: 11687 ft: 14465 corp: 36/157b lim: 10 exec/s: 49 rss: 70Mb L: 7/10 MS: 5 ChangeByte-ChangeByte-CopyPart-ChangeBit-InsertRepeatedBytes- 00:06:53.749 [2024-07-21 11:30:23.159254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:53.749 [2024-07-21 11:30:23.159283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.008 #51 NEW cov: 11687 ft: 14477 corp: 37/159b lim: 10 exec/s: 51 rss: 70Mb L: 2/10 MS: 2 EraseBytes-CopyPart- 00:06:54.008 [2024-07-21 11:30:23.199304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002808 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.199331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.008 #52 NEW cov: 11687 ft: 14492 corp: 38/162b lim: 10 exec/s: 52 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:06:54.008 [2024-07-21 11:30:23.239680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000da28 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.239709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.239832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002841 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.239850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.279822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fa28 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.279849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.279971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002841 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.279988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.008 #54 NEW cov: 11687 ft: 14493 corp: 39/167b lim: 10 exec/s: 54 rss: 70Mb L: 5/10 MS: 2 CrossOver-ChangeByte- 00:06:54.008 [2024-07-21 11:30:23.320322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.320349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.320460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.320487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.320599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002808 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.320614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 
11:30:23.320734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007e28 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.320751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.008 #55 NEW cov: 11687 ft: 14500 corp: 40/175b lim: 10 exec/s: 55 rss: 70Mb L: 8/10 MS: 1 InsertByte- 00:06:54.008 [2024-07-21 11:30:23.360461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007f00 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.360489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.360598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.360615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.360724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000808 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.360738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.360847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007e28 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.360863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.008 #56 NEW cov: 11687 ft: 14518 corp: 41/183b lim: 10 exec/s: 56 rss: 70Mb L: 8/10 MS: 1 ChangeBinInt- 00:06:54.008 [2024-07-21 11:30:23.400348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007700 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.400375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.400502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.400520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.008 [2024-07-21 11:30:23.400637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004128 cdw11:00000000 00:06:54.008 [2024-07-21 11:30:23.400655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.008 #57 NEW cov: 11687 ft: 14528 corp: 42/190b lim: 10 exec/s: 28 rss: 71Mb L: 7/10 MS: 1 ChangeBit- 00:06:54.008 #57 DONE cov: 11687 ft: 14528 corp: 42/190b lim: 10 exec/s: 28 rss: 71Mb 00:06:54.008 ###### Recommended dictionary. ###### 00:06:54.008 "\177\000\000\000" # Uses: 2 00:06:54.008 ###### End of recommended dictionary. 
###### 00:06:54.008 Done 57 runs in 2 second(s) 00:06:54.267 11:30:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:06:54.267 11:30:23 -- ../common.sh@72 -- # (( i++ )) 00:06:54.267 11:30:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:54.267 11:30:23 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:06:54.267 11:30:23 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:06:54.267 11:30:23 -- nvmf/run.sh@24 -- # local timen=1 00:06:54.267 11:30:23 -- nvmf/run.sh@25 -- # local core=0x1 00:06:54.267 11:30:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:54.267 11:30:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:06:54.267 11:30:23 -- nvmf/run.sh@29 -- # printf %02d 8 00:06:54.267 11:30:23 -- nvmf/run.sh@29 -- # port=4408 00:06:54.267 11:30:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:54.267 11:30:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:06:54.267 11:30:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:54.267 11:30:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:06:54.267 [2024-07-21 11:30:23.581264] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:54.267 [2024-07-21 11:30:23.581355] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2062841 ] 00:06:54.267 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.525 [2024-07-21 11:30:23.758651] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.525 [2024-07-21 11:30:23.778006] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:54.525 [2024-07-21 11:30:23.778127] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.525 [2024-07-21 11:30:23.829601] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:54.525 [2024-07-21 11:30:23.845895] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:06:54.525 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:54.525 INFO: Seed: 116964127 00:06:54.525 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:54.525 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:54.525 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:54.525 INFO: A corpus is not provided, starting from an empty corpus 00:06:54.525 [2024-07-21 11:30:23.901222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.525 [2024-07-21 11:30:23.901252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.525 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 66Mb 00:06:54.525 [2024-07-21 11:30:23.931693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.525 [2024-07-21 11:30:23.931719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.525 [2024-07-21 11:30:23.931790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.526 [2024-07-21 11:30:23.931804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.526 [2024-07-21 11:30:23.931861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.526 [2024-07-21 11:30:23.931875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.526 [2024-07-21 11:30:23.931932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.526 [2024-07-21 11:30:23.931946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.784 #3 NEW cov: 11601 ft: 12732 corp: 2/5b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:06:54.784 [2024-07-21 11:30:23.981510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:23.981537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:23.981598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:23.981612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.784 #4 NEW cov: 11607 ft: 13241 corp: 3/7b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 EraseBytes- 00:06:54.784 [2024-07-21 11:30:24.021926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:06:54.784 [2024-07-21 11:30:24.021951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.022026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.022040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.022097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.022110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.022167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.022180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.784 #5 NEW cov: 11692 ft: 13529 corp: 4/11b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ShuffleBytes- 00:06:54.784 [2024-07-21 11:30:24.062079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.062106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.062163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.062177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.062234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.062248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.062305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.062318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.784 #6 NEW cov: 11692 ft: 13624 corp: 5/15b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeBit- 00:06:54.784 [2024-07-21 11:30:24.102147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.102173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.102233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 
cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.102263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.102324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.102338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.784 [2024-07-21 11:30:24.102394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.102407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.784 #7 NEW cov: 11692 ft: 13696 corp: 6/19b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeByte- 00:06:54.784 [2024-07-21 11:30:24.142271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.784 [2024-07-21 11:30:24.142296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.785 [2024-07-21 11:30:24.142356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.785 [2024-07-21 11:30:24.142370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.785 [2024-07-21 11:30:24.142428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.785 [2024-07-21 11:30:24.142440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.785 [2024-07-21 11:30:24.142501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.785 [2024-07-21 11:30:24.142514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.785 #8 NEW cov: 11692 ft: 13750 corp: 7/23b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeBit- 00:06:54.785 [2024-07-21 11:30:24.182059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.785 [2024-07-21 11:30:24.182086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.785 [2024-07-21 11:30:24.182147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:54.785 [2024-07-21 11:30:24.182161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.785 #9 NEW cov: 11692 ft: 13801 corp: 8/25b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 
1 InsertByte- 00:06:55.043 [2024-07-21 11:30:24.222730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.043 [2024-07-21 11:30:24.222756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.043 [2024-07-21 11:30:24.222816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.043 [2024-07-21 11:30:24.222830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.043 [2024-07-21 11:30:24.222887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.043 [2024-07-21 11:30:24.222904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.043 [2024-07-21 11:30:24.222962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.043 [2024-07-21 11:30:24.222977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.043 [2024-07-21 11:30:24.223034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.043 [2024-07-21 11:30:24.223048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:55.043 #10 NEW cov: 11692 ft: 13890 corp: 9/30b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertByte- 00:06:55.043 [2024-07-21 11:30:24.262835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.043 [2024-07-21 11:30:24.262862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.043 [2024-07-21 11:30:24.262924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.043 [2024-07-21 11:30:24.262938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.262996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.263009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.263067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.263081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 
11:30:24.263138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.263152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:55.044 #11 NEW cov: 11692 ft: 13935 corp: 10/35b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeBit- 00:06:55.044 [2024-07-21 11:30:24.302938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.302964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.303022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.303036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.303094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.303107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.303164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.303178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.303237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.303251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:55.044 #12 NEW cov: 11692 ft: 13950 corp: 11/40b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 ChangeByte- 00:06:55.044 [2024-07-21 11:30:24.342457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.342482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.044 #13 NEW cov: 11692 ft: 14007 corp: 12/41b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:06:55.044 [2024-07-21 11:30:24.383042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.383068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.383129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 
[2024-07-21 11:30:24.383143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.383198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.383212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.383268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.383281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.044 #14 NEW cov: 11692 ft: 14106 corp: 13/45b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 EraseBytes- 00:06:55.044 [2024-07-21 11:30:24.422810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.422835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.422893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.422907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.044 #15 NEW cov: 11692 ft: 14196 corp: 14/47b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:06:55.044 [2024-07-21 11:30:24.463274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.463300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.463358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.463372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.463429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.463452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.044 [2024-07-21 11:30:24.463508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.044 [2024-07-21 11:30:24.463522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.301 #16 NEW cov: 11692 ft: 14203 corp: 15/51b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 ChangeByte- 00:06:55.301 [2024-07-21 11:30:24.503357] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.503383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.503447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.503461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.503516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.503529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.503583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.503595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.302 #17 NEW cov: 11692 ft: 14238 corp: 16/55b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 ChangeByte- 00:06:55.302 [2024-07-21 11:30:24.543458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.543483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.543541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.543554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.543611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.543623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.543680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.543693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.302 #18 NEW cov: 11692 ft: 14249 corp: 17/59b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 CopyPart- 00:06:55.302 [2024-07-21 11:30:24.583599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.583625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:06:55.302 [2024-07-21 11:30:24.583685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.583699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.583756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.583769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.583826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.583839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.302 #19 NEW cov: 11692 ft: 14260 corp: 18/63b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 ShuffleBytes- 00:06:55.302 [2024-07-21 11:30:24.623669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.623695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.623755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.623768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.623825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.623838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.623894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.623907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.302 #20 NEW cov: 11692 ft: 14322 corp: 19/67b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 ShuffleBytes- 00:06:55.302 [2024-07-21 11:30:24.663846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.663871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.663947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.663961] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.664021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.664035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.664095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.664108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.302 #21 NEW cov: 11692 ft: 14366 corp: 20/71b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 ShuffleBytes- 00:06:55.302 [2024-07-21 11:30:24.703943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.703967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.704044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.704058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.704117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.704131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.302 [2024-07-21 11:30:24.704187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.302 [2024-07-21 11:30:24.704200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.559 #22 NEW cov: 11692 ft: 14372 corp: 21/75b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:06:55.559 [2024-07-21 11:30:24.744214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.559 [2024-07-21 11:30:24.744239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.559 [2024-07-21 11:30:24.744297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.559 [2024-07-21 11:30:24.744311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.559 [2024-07-21 11:30:24.744369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.559 
[2024-07-21 11:30:24.744383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.559 [2024-07-21 11:30:24.744439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.559 [2024-07-21 11:30:24.744456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.559 [2024-07-21 11:30:24.744512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.559 [2024-07-21 11:30:24.744525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:55.559 #23 NEW cov: 11692 ft: 14382 corp: 22/80b lim: 5 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:06:55.559 [2024-07-21 11:30:24.783895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.560 [2024-07-21 11:30:24.783920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.560 [2024-07-21 11:30:24.783980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.560 [2024-07-21 11:30:24.783994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.818 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:55.818 #24 NEW cov: 11715 ft: 14454 corp: 23/82b lim: 5 exec/s: 24 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:06:55.818 [2024-07-21 11:30:25.084959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.084992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.085046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.085060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.085115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.085127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.085199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.085212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 
11:30:25.085266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.085279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:55.818 #25 NEW cov: 11715 ft: 14559 corp: 24/87b lim: 5 exec/s: 25 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:06:55.818 [2024-07-21 11:30:25.125004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.125030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.125085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.125099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.125151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.125164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.125217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.125230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.125284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.125296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:55.818 #26 NEW cov: 11715 ft: 14589 corp: 25/92b lim: 5 exec/s: 26 rss: 70Mb L: 5/5 MS: 1 InsertByte- 00:06:55.818 [2024-07-21 11:30:25.164657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.164686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.164740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.164754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.818 #27 NEW cov: 11715 ft: 14616 corp: 26/94b lim: 5 exec/s: 27 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:06:55.818 [2024-07-21 11:30:25.205068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 
[2024-07-21 11:30:25.205094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.205165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.205178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.205232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.205245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.818 [2024-07-21 11:30:25.205299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.818 [2024-07-21 11:30:25.205312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.818 #28 NEW cov: 11715 ft: 14621 corp: 27/98b lim: 5 exec/s: 28 rss: 70Mb L: 4/5 MS: 1 EraseBytes- 00:06:56.077 [2024-07-21 11:30:25.245360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.245386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.245445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.245459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.245516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.245529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.245583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.245597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.245652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.245664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:56.077 #29 NEW cov: 11715 ft: 14637 corp: 28/103b lim: 5 exec/s: 29 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:06:56.077 [2024-07-21 11:30:25.285342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.285370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.285424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.285437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.285497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.285510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.285561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.285573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.077 #30 NEW cov: 11715 ft: 14653 corp: 29/107b lim: 5 exec/s: 30 rss: 70Mb L: 4/5 MS: 1 ChangeBit- 00:06:56.077 [2024-07-21 11:30:25.325437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.325465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.325522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.325535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.077 [2024-07-21 11:30:25.325591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.077 [2024-07-21 11:30:25.325604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 11:30:25.325658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.325671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.078 #31 NEW cov: 11715 ft: 14664 corp: 30/111b lim: 5 exec/s: 31 rss: 70Mb L: 4/5 MS: 1 CopyPart- 00:06:56.078 [2024-07-21 11:30:25.365738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.365763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 11:30:25.365815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.365828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 11:30:25.365879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.365908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 11:30:25.365958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.365974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 11:30:25.366028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.366041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:56.078 #32 NEW cov: 11715 ft: 14678 corp: 31/116b lim: 5 exec/s: 32 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:06:56.078 [2024-07-21 11:30:25.405387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.405412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 11:30:25.405486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.405500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.078 #33 NEW cov: 11715 ft: 14713 corp: 32/118b lim: 5 exec/s: 33 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:06:56.078 [2024-07-21 11:30:25.445800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.445825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 11:30:25.445882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.445895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 11:30:25.445950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.445963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.078 [2024-07-21 
11:30:25.446016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.446028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.078 #34 NEW cov: 11715 ft: 14724 corp: 33/122b lim: 5 exec/s: 34 rss: 70Mb L: 4/5 MS: 1 ChangeByte- 00:06:56.078 [2024-07-21 11:30:25.485454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.078 [2024-07-21 11:30:25.485496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.367 #35 NEW cov: 11715 ft: 14742 corp: 34/123b lim: 5 exec/s: 35 rss: 70Mb L: 1/5 MS: 1 CrossOver- 00:06:56.367 [2024-07-21 11:30:25.526178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.526204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.526255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.526268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.526323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.526336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.526388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.526401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.526453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.526467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:56.367 #36 NEW cov: 11715 ft: 14747 corp: 35/128b lim: 5 exec/s: 36 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:06:56.367 [2024-07-21 11:30:25.566290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.566315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.566371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 
[2024-07-21 11:30:25.566384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.566437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.566454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.566508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.566520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.566571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.566584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:56.367 #37 NEW cov: 11715 ft: 14771 corp: 36/133b lim: 5 exec/s: 37 rss: 70Mb L: 5/5 MS: 1 ShuffleBytes- 00:06:56.367 [2024-07-21 11:30:25.605947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.605972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.606042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.606056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.367 #38 NEW cov: 11715 ft: 14822 corp: 37/135b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:06:56.367 [2024-07-21 11:30:25.646104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.646133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.646188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.646202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.367 #39 NEW cov: 11715 ft: 14828 corp: 38/137b lim: 5 exec/s: 39 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:06:56.367 [2024-07-21 11:30:25.686668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.686693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.686747] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.686760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.686813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.686826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.686878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.686891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.686943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.686955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:56.367 #40 NEW cov: 11715 ft: 14853 corp: 39/142b lim: 5 exec/s: 40 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:06:56.367 [2024-07-21 11:30:25.726628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.726653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.726705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.726719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.726771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.726784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.726836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.726849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.367 #41 NEW cov: 11715 ft: 14872 corp: 40/146b lim: 5 exec/s: 41 rss: 70Mb L: 4/5 MS: 1 EraseBytes- 00:06:56.367 [2024-07-21 11:30:25.766470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.766500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:06:56.367 [2024-07-21 11:30:25.766554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.367 [2024-07-21 11:30:25.766567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.367 #42 NEW cov: 11715 ft: 14878 corp: 41/148b lim: 5 exec/s: 42 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:06:56.624 [2024-07-21 11:30:25.806868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.806893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.624 [2024-07-21 11:30:25.806962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.806976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.624 [2024-07-21 11:30:25.807028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.807041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.624 [2024-07-21 11:30:25.807094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.807107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.624 #43 NEW cov: 11715 ft: 14953 corp: 42/152b lim: 5 exec/s: 43 rss: 70Mb L: 4/5 MS: 1 ChangeBit- 00:06:56.624 [2024-07-21 11:30:25.846560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.846585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.624 #44 NEW cov: 11715 ft: 15017 corp: 43/153b lim: 5 exec/s: 44 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:06:56.624 [2024-07-21 11:30:25.887032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.887057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.624 [2024-07-21 11:30:25.887112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.887125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.624 [2024-07-21 11:30:25.887177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.887189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.624 [2024-07-21 11:30:25.887242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.624 [2024-07-21 11:30:25.887254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.624 #45 NEW cov: 11715 ft: 15025 corp: 44/157b lim: 5 exec/s: 22 rss: 70Mb L: 4/5 MS: 1 ShuffleBytes- 00:06:56.624 #45 DONE cov: 11715 ft: 15025 corp: 44/157b lim: 5 exec/s: 22 rss: 70Mb 00:06:56.624 Done 45 runs in 2 second(s) 00:06:56.624 11:30:26 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:06:56.624 11:30:26 -- ../common.sh@72 -- # (( i++ )) 00:06:56.624 11:30:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:56.624 11:30:26 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:06:56.624 11:30:26 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:06:56.624 11:30:26 -- nvmf/run.sh@24 -- # local timen=1 00:06:56.624 11:30:26 -- nvmf/run.sh@25 -- # local core=0x1 00:06:56.624 11:30:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:06:56.624 11:30:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:06:56.624 11:30:26 -- nvmf/run.sh@29 -- # printf %02d 9 00:06:56.624 11:30:26 -- nvmf/run.sh@29 -- # port=4409 00:06:56.624 11:30:26 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:06:56.624 11:30:26 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:06:56.624 11:30:26 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:56.625 11:30:26 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:06:56.882 [2024-07-21 11:30:26.065512] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:56.882 [2024-07-21 11:30:26.065596] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2063264 ] 00:06:56.882 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.882 [2024-07-21 11:30:26.246290] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.882 [2024-07-21 11:30:26.265837] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:56.882 [2024-07-21 11:30:26.265979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.140 [2024-07-21 11:30:26.317505] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:57.140 [2024-07-21 11:30:26.333851] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:06:57.140 INFO: Running with entropic power schedule (0xFF, 100). 00:06:57.140 INFO: Seed: 2603962243 00:06:57.140 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:57.140 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:57.140 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:06:57.140 INFO: A corpus is not provided, starting from an empty corpus 00:06:57.140 [2024-07-21 11:30:26.379045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.140 [2024-07-21 11:30:26.379073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.140 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 66Mb 00:06:57.140 [2024-07-21 11:30:26.409177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.140 [2024-07-21 11:30:26.409203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.140 [2024-07-21 11:30:26.409259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.140 [2024-07-21 11:30:26.409273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.140 #3 NEW cov: 11601 ft: 12607 corp: 2/3b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:06:57.140 [2024-07-21 11:30:26.449130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.140 [2024-07-21 11:30:26.449155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.140 #4 NEW cov: 11607 ft: 12721 corp: 3/4b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:06:57.140 [2024-07-21 11:30:26.489255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.140 [2024-07-21 11:30:26.489279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.140 #5 NEW cov: 11692 ft: 13174 corp: 4/5b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeBinInt- 00:06:57.140 [2024-07-21 11:30:26.529505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.140 [2024-07-21 11:30:26.529531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.140 [2024-07-21 11:30:26.529587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.140 [2024-07-21 11:30:26.529601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.140 #6 NEW cov: 11692 ft: 13245 corp: 5/7b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeBit- 00:06:57.398 [2024-07-21 11:30:26.569651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.569677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.569750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.569764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.398 #7 NEW cov: 11692 ft: 13361 corp: 6/9b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeByte- 00:06:57.398 [2024-07-21 11:30:26.609752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.609778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.609836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.609850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.398 #8 NEW cov: 11692 ft: 13412 corp: 7/11b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:06:57.398 [2024-07-21 11:30:26.649713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.649738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.398 #9 NEW cov: 11692 ft: 13436 corp: 8/12b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:06:57.398 [2024-07-21 11:30:26.689974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.690000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.690059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.690073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.398 #10 NEW cov: 11692 ft: 13486 corp: 9/14b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ShuffleBytes- 00:06:57.398 [2024-07-21 11:30:26.730109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.730134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.730190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.730204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.398 #11 NEW cov: 11692 ft: 13561 corp: 10/16b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:06:57.398 [2024-07-21 11:30:26.770709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.770734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.770795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.770808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.770864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.770877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.770933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.770947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.771000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.771014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.398 #12 NEW cov: 11692 ft: 13925 corp: 11/21b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CMP- DE: "\377\377\377\003"- 00:06:57.398 [2024-07-21 11:30:26.810357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.810383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.398 [2024-07-21 11:30:26.810440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.398 [2024-07-21 11:30:26.810458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.657 #13 NEW cov: 11692 ft: 13946 corp: 12/23b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeBinInt- 00:06:57.657 [2024-07-21 11:30:26.850303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:26.850331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.657 #14 NEW cov: 11692 ft: 13973 corp: 13/24b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 EraseBytes- 00:06:57.657 [2024-07-21 11:30:26.890722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:26.890747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.657 [2024-07-21 11:30:26.890806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:26.890819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.657 [2024-07-21 11:30:26.890876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:26.890890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.657 #15 NEW cov: 11692 ft: 14153 corp: 14/27b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:06:57.657 [2024-07-21 11:30:26.930480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:26.930505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.657 #16 NEW cov: 11692 ft: 14238 corp: 15/28b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 EraseBytes- 00:06:57.657 [2024-07-21 11:30:26.970661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:26.970686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.657 #17 NEW cov: 11692 ft: 14256 corp: 16/29b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 CopyPart- 00:06:57.657 [2024-07-21 11:30:27.000878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:27.000902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.657 [2024-07-21 11:30:27.000959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:27.000973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.657 #18 NEW cov: 11692 ft: 14311 corp: 17/31b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CMP- DE: "\000\010"- 00:06:57.657 [2024-07-21 11:30:27.040951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:27.040976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.657 [2024-07-21 11:30:27.041033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.657 [2024-07-21 11:30:27.041047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.657 #19 NEW cov: 11692 ft: 14348 corp: 18/33b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:06:57.915 [2024-07-21 11:30:27.081122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.081150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.915 [2024-07-21 11:30:27.081209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.081223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.915 #20 NEW cov: 11692 ft: 14451 corp: 19/35b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ShuffleBytes- 00:06:57.915 [2024-07-21 11:30:27.121326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.121352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.915 [2024-07-21 11:30:27.121411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.121425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.915 [2024-07-21 11:30:27.121479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.121508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.915 #21 NEW cov: 11692 ft: 14455 corp: 20/38b lim: 5 exec/s: 0 rss: 69Mb L: 3/5 MS: 1 ChangeBit- 00:06:57.915 [2024-07-21 11:30:27.161461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.161487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.915 [2024-07-21 11:30:27.161541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.161554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.915 [2024-07-21 11:30:27.161609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.161623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.915 #22 NEW cov: 11692 ft: 14463 corp: 21/41b lim: 5 exec/s: 0 rss: 69Mb L: 3/5 MS: 1 CopyPart- 00:06:57.915 [2024-07-21 11:30:27.201481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.201506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.915 [2024-07-21 11:30:27.201563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.201576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.915 #23 NEW cov: 11692 ft: 14469 corp: 22/43b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 ShuffleBytes- 00:06:57.915 [2024-07-21 11:30:27.241595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.241620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.915 [2024-07-21 11:30:27.241677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.241690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.915 #24 NEW cov: 11692 ft: 14480 corp: 23/45b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:06:57.915 [2024-07-21 11:30:27.281576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.915 [2024-07-21 11:30:27.281601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.174 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:58.174 #25 NEW cov: 11715 ft: 14515 corp: 24/46b lim: 5 exec/s: 25 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:06:58.174 [2024-07-21 11:30:27.582784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.174 [2024-07-21 11:30:27.582821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.174 [2024-07-21 11:30:27.582889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.174 [2024-07-21 11:30:27.582905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.174 [2024-07-21 11:30:27.582964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.174 [2024-07-21 11:30:27.582980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.432 #26 NEW cov: 11715 ft: 14535 corp: 25/49b lim: 5 exec/s: 26 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:06:58.432 [2024-07-21 11:30:27.622435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.432 [2024-07-21 11:30:27.622466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.432 #27 NEW cov: 11715 ft: 14545 corp: 26/50b lim: 5 exec/s: 27 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:06:58.432 [2024-07-21 11:30:27.662726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.432 [2024-07-21 11:30:27.662752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.432 [2024-07-21 11:30:27.662810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.432 [2024-07-21 11:30:27.662824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.432 #28 NEW cov: 11715 ft: 14607 corp: 27/52b lim: 5 exec/s: 28 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:06:58.432 [2024-07-21 11:30:27.702993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.432 [2024-07-21 11:30:27.703019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.432 [2024-07-21 11:30:27.703078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.432 [2024-07-21 11:30:27.703095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.432 
[2024-07-21 11:30:27.703153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.432 [2024-07-21 11:30:27.703166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.432 #29 NEW cov: 11715 ft: 14622 corp: 28/55b lim: 5 exec/s: 29 rss: 70Mb L: 3/5 MS: 1 ChangeByte- 00:06:58.432 [2024-07-21 11:30:27.742973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.432 [2024-07-21 11:30:27.742999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.433 [2024-07-21 11:30:27.743056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.433 [2024-07-21 11:30:27.743069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.433 #30 NEW cov: 11715 ft: 14647 corp: 29/57b lim: 5 exec/s: 30 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:06:58.433 [2024-07-21 11:30:27.783522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.433 [2024-07-21 11:30:27.783549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.433 [2024-07-21 11:30:27.783609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.433 [2024-07-21 11:30:27.783623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.433 [2024-07-21 11:30:27.783675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.433 [2024-07-21 11:30:27.783689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.433 [2024-07-21 11:30:27.783744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.433 [2024-07-21 11:30:27.783757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.433 [2024-07-21 11:30:27.783813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.433 [2024-07-21 11:30:27.783826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:58.433 #31 NEW cov: 11715 ft: 14660 corp: 30/62b lim: 5 exec/s: 31 rss: 70Mb L: 5/5 MS: 1 PersAutoDict- DE: "\377\377\377\003"- 00:06:58.433 [2024-07-21 11:30:27.823210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:06:58.433 [2024-07-21 11:30:27.823236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.433 [2024-07-21 11:30:27.823293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.433 [2024-07-21 11:30:27.823306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.433 #32 NEW cov: 11715 ft: 14698 corp: 31/64b lim: 5 exec/s: 32 rss: 70Mb L: 2/5 MS: 1 PersAutoDict- DE: "\000\010"- 00:06:58.691 [2024-07-21 11:30:27.863325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:27.863350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.691 [2024-07-21 11:30:27.863423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:27.863437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.691 #33 NEW cov: 11715 ft: 14729 corp: 32/66b lim: 5 exec/s: 33 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:06:58.691 [2024-07-21 11:30:27.903295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:27.903320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.691 #34 NEW cov: 11715 ft: 14754 corp: 33/67b lim: 5 exec/s: 34 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:06:58.691 [2024-07-21 11:30:27.943422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:27.943451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.691 #35 NEW cov: 11715 ft: 14765 corp: 34/68b lim: 5 exec/s: 35 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:06:58.691 [2024-07-21 11:30:27.983750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:27.983775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.691 [2024-07-21 11:30:27.983832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:27.983846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.691 #36 NEW cov: 11715 ft: 14787 corp: 35/70b lim: 5 exec/s: 36 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:06:58.691 [2024-07-21 11:30:28.023787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 
cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:28.023813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.691 [2024-07-21 11:30:28.023885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:28.023899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.691 #37 NEW cov: 11715 ft: 14809 corp: 36/72b lim: 5 exec/s: 37 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:06:58.691 [2024-07-21 11:30:28.063732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:28.063757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.691 #38 NEW cov: 11715 ft: 14811 corp: 37/73b lim: 5 exec/s: 38 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:06:58.691 [2024-07-21 11:30:28.094008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:28.094034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.691 [2024-07-21 11:30:28.094083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.691 [2024-07-21 11:30:28.094097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.950 #39 NEW cov: 11715 ft: 14852 corp: 38/75b lim: 5 exec/s: 39 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:06:58.950 [2024-07-21 11:30:28.134165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.134190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.950 [2024-07-21 11:30:28.134237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.134251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.950 #40 NEW cov: 11715 ft: 14882 corp: 39/77b lim: 5 exec/s: 40 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:06:58.950 [2024-07-21 11:30:28.174246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.174273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.950 [2024-07-21 11:30:28.174331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 
11:30:28.174344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.950 #41 NEW cov: 11715 ft: 14886 corp: 40/79b lim: 5 exec/s: 41 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:06:58.950 [2024-07-21 11:30:28.214243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.214268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.950 #42 NEW cov: 11715 ft: 14889 corp: 41/80b lim: 5 exec/s: 42 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:06:58.950 [2024-07-21 11:30:28.254833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.254859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.950 [2024-07-21 11:30:28.254915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.254929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.950 [2024-07-21 11:30:28.254982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.254996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.950 [2024-07-21 11:30:28.255048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.255061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.950 #43 NEW cov: 11715 ft: 14902 corp: 42/84b lim: 5 exec/s: 43 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:06:58.950 [2024-07-21 11:30:28.295115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.295142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.950 [2024-07-21 11:30:28.295196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.295210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.950 [2024-07-21 11:30:28.295263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.295276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.950 [2024-07-21 11:30:28.295332] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.950 [2024-07-21 11:30:28.295345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.951 [2024-07-21 11:30:28.295399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.951 [2024-07-21 11:30:28.295413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:58.951 #44 NEW cov: 11715 ft: 15001 corp: 43/89b lim: 5 exec/s: 44 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:06:58.951 [2024-07-21 11:30:28.334664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.951 [2024-07-21 11:30:28.334688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.951 [2024-07-21 11:30:28.334745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.951 [2024-07-21 11:30:28.334759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.951 #45 NEW cov: 11715 ft: 15003 corp: 44/91b lim: 5 exec/s: 45 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:06:59.210 [2024-07-21 11:30:28.374841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.210 [2024-07-21 11:30:28.374867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.210 [2024-07-21 11:30:28.374923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.210 [2024-07-21 11:30:28.374937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.210 #46 NEW cov: 11715 ft: 15009 corp: 45/93b lim: 5 exec/s: 23 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:06:59.210 #46 DONE cov: 11715 ft: 15009 corp: 45/93b lim: 5 exec/s: 23 rss: 70Mb 00:06:59.210 ###### Recommended dictionary. ###### 00:06:59.210 "\377\377\377\003" # Uses: 1 00:06:59.210 "\000\010" # Uses: 1 00:06:59.210 ###### End of recommended dictionary. 
###### 00:06:59.210 Done 46 runs in 2 second(s) 00:06:59.210 11:30:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:06:59.210 11:30:28 -- ../common.sh@72 -- # (( i++ )) 00:06:59.210 11:30:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:59.210 11:30:28 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:06:59.210 11:30:28 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:06:59.210 11:30:28 -- nvmf/run.sh@24 -- # local timen=1 00:06:59.210 11:30:28 -- nvmf/run.sh@25 -- # local core=0x1 00:06:59.210 11:30:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:06:59.210 11:30:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:06:59.210 11:30:28 -- nvmf/run.sh@29 -- # printf %02d 10 00:06:59.210 11:30:28 -- nvmf/run.sh@29 -- # port=4410 00:06:59.210 11:30:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:06:59.210 11:30:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:06:59.210 11:30:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:59.210 11:30:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:06:59.210 [2024-07-21 11:30:28.546327] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:59.210 [2024-07-21 11:30:28.546395] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2063674 ] 00:06:59.210 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.469 [2024-07-21 11:30:28.725973] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.469 [2024-07-21 11:30:28.745280] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:59.469 [2024-07-21 11:30:28.745404] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.469 [2024-07-21 11:30:28.796933] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:59.469 [2024-07-21 11:30:28.813285] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:06:59.469 INFO: Running with entropic power schedule (0xFF, 100). 00:06:59.469 INFO: Seed: 788996005 00:06:59.469 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:06:59.469 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:06:59.470 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:06:59.470 INFO: A corpus is not provided, starting from an empty corpus 00:06:59.470 #2 INITED exec/s: 0 rss: 60Mb 00:06:59.470 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:06:59.470 This may also happen if the target rejected all inputs we tried so far 00:06:59.470 [2024-07-21 11:30:28.858785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.470 [2024-07-21 11:30:28.858815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.470 [2024-07-21 11:30:28.858889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.470 [2024-07-21 11:30:28.858903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.470 [2024-07-21 11:30:28.858962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.470 [2024-07-21 11:30:28.858975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.470 [2024-07-21 11:30:28.859031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:59.470 [2024-07-21 11:30:28.859045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.038 NEW_FUNC[1/670]: 0x4ab4e0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:00.038 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:00.038 #11 NEW cov: 11511 ft: 11512 corp: 2/34b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 4 InsertRepeatedBytes-CopyPart-ChangeByte-InsertRepeatedBytes- 00:07:00.038 [2024-07-21 11:30:29.179359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.179393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.179458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.179472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.038 #12 NEW cov: 11624 ft: 12546 corp: 3/57b lim: 40 exec/s: 0 rss: 68Mb L: 23/33 MS: 1 EraseBytes- 00:07:00.038 [2024-07-21 11:30:29.229528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:1111ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.229556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.229615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 
[2024-07-21 11:30:29.229628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.229686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.229699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.038 #13 NEW cov: 11630 ft: 12931 corp: 4/84b lim: 40 exec/s: 0 rss: 68Mb L: 27/33 MS: 1 EraseBytes- 00:07:00.038 [2024-07-21 11:30:29.269803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.269829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.269905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.269921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.269980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.269994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.270055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.270068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.038 #14 NEW cov: 11715 ft: 13164 corp: 5/117b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBinInt- 00:07:00.038 [2024-07-21 11:30:29.309756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:1111ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.309782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.309862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffdfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.309876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.309934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.309948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.038 #15 NEW cov: 11715 ft: 13240 corp: 6/144b lim: 40 exec/s: 0 rss: 68Mb L: 27/33 MS: 1 ChangeBit- 00:07:00.038 [2024-07-21 11:30:29.349997] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.350022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.350082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.350095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.350142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.350156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.350214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.350227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.038 #16 NEW cov: 11715 ft: 13300 corp: 7/177b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeByte- 00:07:00.038 [2024-07-21 11:30:29.379842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.379868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.379927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.379941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.038 #17 NEW cov: 11715 ft: 13417 corp: 8/200b lim: 40 exec/s: 0 rss: 68Mb L: 23/33 MS: 1 ChangeByte- 00:07:00.038 [2024-07-21 11:30:29.420202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.420228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.420287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.420300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.420355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.420371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:00.038 [2024-07-21 11:30:29.420427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.420446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.038 #18 NEW cov: 11715 ft: 13464 corp: 9/233b lim: 40 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:00.038 [2024-07-21 11:30:29.460472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.460498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.460556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.460570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.460624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.460638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.460695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.460708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.038 [2024-07-21 11:30:29.460764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.038 [2024-07-21 11:30:29.460778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:00.297 #19 NEW cov: 11715 ft: 13520 corp: 10/273b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:00.297 [2024-07-21 11:30:29.500173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.297 [2024-07-21 11:30:29.500198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.298 [2024-07-21 11:30:29.500255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.500268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.298 #20 NEW cov: 11715 ft: 13585 corp: 11/296b lim: 40 exec/s: 0 rss: 69Mb L: 23/40 MS: 1 CrossOver- 00:07:00.298 [2024-07-21 11:30:29.540183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:11111111 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:00.298 [2024-07-21 11:30:29.540208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.298 #21 NEW cov: 11715 ft: 13981 corp: 12/310b lim: 40 exec/s: 0 rss: 69Mb L: 14/40 MS: 1 CrossOver- 00:07:00.298 [2024-07-21 11:30:29.580437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:111160ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.580468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.298 [2024-07-21 11:30:29.580528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.580544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.298 #22 NEW cov: 11715 ft: 13995 corp: 13/333b lim: 40 exec/s: 0 rss: 69Mb L: 23/40 MS: 1 ChangeByte- 00:07:00.298 [2024-07-21 11:30:29.620580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11110be1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.620605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.298 [2024-07-21 11:30:29.620665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.620678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.298 #23 NEW cov: 11715 ft: 14028 corp: 14/356b lim: 40 exec/s: 0 rss: 69Mb L: 23/40 MS: 1 ChangeBinInt- 00:07:00.298 [2024-07-21 11:30:29.660667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.660692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.298 [2024-07-21 11:30:29.660752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.660766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.298 #24 NEW cov: 11715 ft: 14060 corp: 15/379b lim: 40 exec/s: 0 rss: 69Mb L: 23/40 MS: 1 CrossOver- 00:07:00.298 [2024-07-21 11:30:29.701047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.701073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.298 [2024-07-21 11:30:29.701132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.701146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.298 [2024-07-21 11:30:29.701201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffefff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.701214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.298 [2024-07-21 11:30:29.701269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.298 [2024-07-21 11:30:29.701282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.298 #25 NEW cov: 11715 ft: 14086 corp: 16/412b lim: 40 exec/s: 0 rss: 69Mb L: 33/40 MS: 1 ChangeBit- 00:07:00.557 [2024-07-21 11:30:29.741183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.557 [2024-07-21 11:30:29.741208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.557 [2024-07-21 11:30:29.741266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.557 [2024-07-21 11:30:29.741280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.557 [2024-07-21 11:30:29.741353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffefff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.557 [2024-07-21 11:30:29.741366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.557 [2024-07-21 11:30:29.741423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.557 [2024-07-21 11:30:29.741436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.557 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:00.557 #26 NEW cov: 11738 ft: 14128 corp: 17/445b lim: 40 exec/s: 0 rss: 69Mb L: 33/40 MS: 1 CrossOver- 00:07:00.557 [2024-07-21 11:30:29.781290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.557 [2024-07-21 11:30:29.781316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.557 [2024-07-21 11:30:29.781376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.781390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.781447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 
cid:6 nsid:0 cdw10:ff111111 cdw11:e11111ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.781460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.781518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.781531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.558 #27 NEW cov: 11738 ft: 14132 corp: 18/484b lim: 40 exec/s: 0 rss: 69Mb L: 39/40 MS: 1 CopyPart- 00:07:00.558 [2024-07-21 11:30:29.821044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:11111111 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.821069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.558 #28 NEW cov: 11738 ft: 14271 corp: 19/498b lim: 40 exec/s: 28 rss: 70Mb L: 14/40 MS: 1 ChangeBit- 00:07:00.558 [2024-07-21 11:30:29.861411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.861436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.861524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff5eff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.861538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.861597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.861610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.558 #29 NEW cov: 11738 ft: 14293 corp: 20/522b lim: 40 exec/s: 29 rss: 70Mb L: 24/40 MS: 1 InsertByte- 00:07:00.558 [2024-07-21 11:30:29.901446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.901471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.901530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff0d00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.901543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.558 #30 NEW cov: 11738 ft: 14313 corp: 21/545b lim: 40 exec/s: 30 rss: 70Mb L: 23/40 MS: 1 CMP- DE: "\015\000"- 00:07:00.558 [2024-07-21 11:30:29.931621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:1111ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:00.558 [2024-07-21 11:30:29.931646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.931705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffdfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.931719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.931779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.931791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.558 #31 NEW cov: 11738 ft: 14353 corp: 22/572b lim: 40 exec/s: 31 rss: 70Mb L: 27/40 MS: 1 ShuffleBytes- 00:07:00.558 [2024-07-21 11:30:29.971879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.971906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.971963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.971977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.972032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.972045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.558 [2024-07-21 11:30:29.972100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.558 [2024-07-21 11:30:29.972113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.818 #32 NEW cov: 11738 ft: 14372 corp: 23/608b lim: 40 exec/s: 32 rss: 70Mb L: 36/40 MS: 1 CrossOver- 00:07:00.818 [2024-07-21 11:30:30.012102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.012127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.012187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.012200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.012263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:a3a3a3a3 
cdw11:a3a3a3a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.012276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.012335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:a3a3a3a3 cdw11:a3a3a3a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.012349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.012407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:a3ffffff cdw11:ffffff2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.012421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:00.818 #33 NEW cov: 11738 ft: 14377 corp: 24/648b lim: 40 exec/s: 33 rss: 70Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:00.818 [2024-07-21 11:30:30.052134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.052161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.052221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.052235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.052291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0d00ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.052305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.052364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.052378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.818 #34 NEW cov: 11738 ft: 14392 corp: 25/681b lim: 40 exec/s: 34 rss: 70Mb L: 33/40 MS: 1 PersAutoDict- DE: "\015\000"- 00:07:00.818 [2024-07-21 11:30:30.092037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11110000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.092062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.092124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.092138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.092200] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000011 cdw11:40111111 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.092214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.818 #36 NEW cov: 11738 ft: 14410 corp: 26/705b lim: 40 exec/s: 36 rss: 70Mb L: 24/40 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:00.818 [2024-07-21 11:30:30.132115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111114e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.132143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.132203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff0d00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.132217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.818 #37 NEW cov: 11738 ft: 14504 corp: 27/728b lim: 40 exec/s: 37 rss: 70Mb L: 23/40 MS: 1 ChangeBinInt- 00:07:00.818 [2024-07-21 11:30:30.172412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.818 [2024-07-21 11:30:30.172437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.818 [2024-07-21 11:30:30.172499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.819 [2024-07-21 11:30:30.172513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.819 [2024-07-21 11:30:30.172571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.819 [2024-07-21 11:30:30.172584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.819 [2024-07-21 11:30:30.172641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.819 [2024-07-21 11:30:30.172654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.819 #38 NEW cov: 11738 ft: 14528 corp: 28/761b lim: 40 exec/s: 38 rss: 70Mb L: 33/40 MS: 1 CrossOver- 00:07:00.819 [2024-07-21 11:30:30.212392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.819 [2024-07-21 11:30:30.212418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.819 [2024-07-21 11:30:30.212484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:32ff5eff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.819 [2024-07-21 
11:30:30.212498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.819 [2024-07-21 11:30:30.212557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.819 [2024-07-21 11:30:30.212570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.819 #39 NEW cov: 11738 ft: 14573 corp: 29/785b lim: 40 exec/s: 39 rss: 70Mb L: 24/40 MS: 1 ChangeByte- 00:07:01.079 [2024-07-21 11:30:30.252626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.252650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.252710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.252723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.252783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.252799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.252857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff11 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.252870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.079 #40 NEW cov: 11738 ft: 14593 corp: 30/817b lim: 40 exec/s: 40 rss: 70Mb L: 32/40 MS: 1 EraseBytes- 00:07:01.079 [2024-07-21 11:30:30.282506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:115111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.282530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.282581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff0d00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.282595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.079 #41 NEW cov: 11738 ft: 14607 corp: 31/840b lim: 40 exec/s: 41 rss: 70Mb L: 23/40 MS: 1 ChangeBit- 00:07:01.079 [2024-07-21 11:30:30.322644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11511111 cdw11:e111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.322669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.322727] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff0d00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.322740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.079 #42 NEW cov: 11738 ft: 14659 corp: 32/863b lim: 40 exec/s: 42 rss: 70Mb L: 23/40 MS: 1 ShuffleBytes- 00:07:01.079 [2024-07-21 11:30:30.362756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.362781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.362842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.362855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.079 #43 NEW cov: 11738 ft: 14660 corp: 33/880b lim: 40 exec/s: 43 rss: 70Mb L: 17/40 MS: 1 EraseBytes- 00:07:01.079 [2024-07-21 11:30:30.402935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11110000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.402960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.403006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00006000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.403019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.403075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000011 cdw11:40111111 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.403088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.079 #44 NEW cov: 11738 ft: 14716 corp: 34/904b lim: 40 exec/s: 44 rss: 70Mb L: 24/40 MS: 1 ChangeByte- 00:07:01.079 [2024-07-21 11:30:30.442864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:11111111 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.442891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.079 #45 NEW cov: 11738 ft: 14732 corp: 35/918b lim: 40 exec/s: 45 rss: 70Mb L: 14/40 MS: 1 CrossOver- 00:07:01.079 [2024-07-21 11:30:30.483120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.483146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.079 [2024-07-21 11:30:30.483220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 
cdw10:ffff2e8f cdw11:756359e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.079 [2024-07-21 11:30:30.483234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.339 #46 NEW cov: 11738 ft: 14779 corp: 36/935b lim: 40 exec/s: 46 rss: 70Mb L: 17/40 MS: 1 CMP- DE: "\377.\217ucY\347,"- 00:07:01.339 [2024-07-21 11:30:30.523497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.339 [2024-07-21 11:30:30.523523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.339 [2024-07-21 11:30:30.523582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.339 [2024-07-21 11:30:30.523595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.339 [2024-07-21 11:30:30.523650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.339 [2024-07-21 11:30:30.523663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.339 [2024-07-21 11:30:30.523719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.339 [2024-07-21 11:30:30.523732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.339 #47 NEW cov: 11738 ft: 14787 corp: 37/968b lim: 40 exec/s: 47 rss: 70Mb L: 33/40 MS: 1 ShuffleBytes- 00:07:01.340 [2024-07-21 11:30:30.563759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:111111e1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.563785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.563842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1111ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.563856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.563933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:a3a3a3a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.563947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.564005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:a3a3a3a3 cdw11:a3a3a3a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.564021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.564080] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:a3a3a3a3 cdw11:a3ffff2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.564093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:01.340 #48 NEW cov: 11738 ft: 14805 corp: 38/1008b lim: 40 exec/s: 48 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:07:01.340 [2024-07-21 11:30:30.603846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.603872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.603931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.603945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.604002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.604015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.604076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:0d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.604088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.604147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.604159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:01.340 #49 NEW cov: 11738 ft: 14810 corp: 39/1048b lim: 40 exec/s: 49 rss: 70Mb L: 40/40 MS: 1 PersAutoDict- DE: "\015\000"- 00:07:01.340 [2024-07-21 11:30:30.643968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:1119ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.643993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.644050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.644063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.644122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.644134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.644190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:0d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.644203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.644258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.644274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:01.340 #50 NEW cov: 11738 ft: 14839 corp: 40/1088b lim: 40 exec/s: 50 rss: 70Mb L: 40/40 MS: 1 ChangeBit- 00:07:01.340 [2024-07-21 11:30:30.684104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.684130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.684189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.684202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.684259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.684272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.684329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffbb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.684342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.684401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:11111140 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.684414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:01.340 #51 NEW cov: 11738 ft: 14841 corp: 41/1128b lim: 40 exec/s: 51 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:07:01.340 [2024-07-21 11:30:30.723812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:11111140 cdw11:11111111 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.723837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.340 [2024-07-21 11:30:30.723894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:11ff2e8f cdw11:756359e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.340 [2024-07-21 11:30:30.723908] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.340 #52 NEW cov: 11738 ft: 14848 corp: 42/1150b lim: 40 exec/s: 52 rss: 70Mb L: 22/40 MS: 1 PersAutoDict- DE: "\377.\217ucY\347,"- 00:07:01.601 [2024-07-21 11:30:30.764214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.764241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.601 [2024-07-21 11:30:30.764299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.764313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.601 [2024-07-21 11:30:30.764369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ff111111 cdw11:e11111ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.764382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.601 [2024-07-21 11:30:30.764426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:04000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.764447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.601 #53 NEW cov: 11738 ft: 14852 corp: 43/1189b lim: 40 exec/s: 53 rss: 70Mb L: 39/40 MS: 1 ChangeBinInt- 00:07:01.601 [2024-07-21 11:30:30.804050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:111111e1 cdw11:112911ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.804075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.601 [2024-07-21 11:30:30.804134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffff2e cdw11:8f756359 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.804148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.601 #54 NEW cov: 11738 ft: 14858 corp: 44/1207b lim: 40 exec/s: 54 rss: 70Mb L: 18/40 MS: 1 InsertByte- 00:07:01.601 [2024-07-21 11:30:30.844287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:1151ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.844312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.601 [2024-07-21 11:30:30.844390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff11e1 cdw11:1111ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.844404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.601 [2024-07-21 11:30:30.844466] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ff0d00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.601 [2024-07-21 11:30:30.844480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.601 #55 NEW cov: 11738 ft: 14862 corp: 45/1238b lim: 40 exec/s: 27 rss: 71Mb L: 31/40 MS: 1 CopyPart- 00:07:01.601 #55 DONE cov: 11738 ft: 14862 corp: 45/1238b lim: 40 exec/s: 27 rss: 71Mb 00:07:01.601 ###### Recommended dictionary. ###### 00:07:01.601 "\015\000" # Uses: 2 00:07:01.601 "\377.\217ucY\347," # Uses: 1 00:07:01.601 ###### End of recommended dictionary. ###### 00:07:01.601 Done 55 runs in 2 second(s) 00:07:01.601 11:30:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:01.601 11:30:30 -- ../common.sh@72 -- # (( i++ )) 00:07:01.601 11:30:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:01.601 11:30:30 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:01.601 11:30:30 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:01.601 11:30:30 -- nvmf/run.sh@24 -- # local timen=1 00:07:01.601 11:30:30 -- nvmf/run.sh@25 -- # local core=0x1 00:07:01.601 11:30:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:01.601 11:30:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:01.601 11:30:30 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:01.601 11:30:30 -- nvmf/run.sh@29 -- # port=4411 00:07:01.601 11:30:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:01.601 11:30:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:01.601 11:30:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:01.601 11:30:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:01.601 [2024-07-21 11:30:31.014781] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
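The nvmf/run.sh commands just above show how each numbered fuzzer gets its own NVMe/TCP listener: the run index is zero-padded with printf %02d and appended to "44" (so run 11 listens on 4411), the stock fuzz_json.conf has its trsvcid rewritten with sed, and llvm_nvme_fuzz is launched against the resulting transport ID. A minimal sketch of that setup step, reconstructed from the logged commands only — the $ROOT helper variable and the redirection of sed's output into $nvmf_cfg are assumptions (the log shows the sed expression but not where it writes); all flag values are copied from the invocation above:

  #!/usr/bin/env bash
  # Per-fuzzer setup as it reads from nvmf/run.sh in this log (sketch, not verbatim).
  fuzzer_type=11                                   # index of this fuzzer run
  timen=1                                          # -t: run time in seconds
  core=0x1                                         # -m: reactor core mask
  ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # assumed name for the checkout path
  port="44$(printf %02d "$fuzzer_type")"           # 11 -> 4411
  nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
  corpus_dir="$ROOT/../corpus/llvm_nvmf_${fuzzer_type}"
  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Point the JSON config at the derived port instead of the default 4420;
  # writing the result to $nvmf_cfg is an assumption, the log shows only the sed itself.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  # Launch the fuzzer against the freshly configured target (flags as logged).
  "$ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
      -P "$ROOT/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
      -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
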
00:07:01.601 [2024-07-21 11:30:31.014851] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2064211 ] 00:07:01.860 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.860 [2024-07-21 11:30:31.189875] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.860 [2024-07-21 11:30:31.209980] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:01.860 [2024-07-21 11:30:31.210122] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.860 [2024-07-21 11:30:31.261888] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:01.860 [2024-07-21 11:30:31.278240] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:02.119 INFO: Running with entropic power schedule (0xFF, 100). 00:07:02.119 INFO: Seed: 3252995401 00:07:02.119 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:02.119 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:02.119 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:02.119 INFO: A corpus is not provided, starting from an empty corpus 00:07:02.119 #2 INITED exec/s: 0 rss: 60Mb 00:07:02.119 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:02.119 This may also happen if the target rejected all inputs we tried so far 00:07:02.119 [2024-07-21 11:30:31.323552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.119 [2024-07-21 11:30:31.323581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.119 [2024-07-21 11:30:31.323639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.119 [2024-07-21 11:30:31.323653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.379 NEW_FUNC[1/671]: 0x4ad250 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:02.379 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:02.379 #19 NEW cov: 11521 ft: 11522 corp: 2/19b lim: 40 exec/s: 0 rss: 68Mb L: 18/18 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:02.379 [2024-07-21 11:30:31.634698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.634729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.634789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.634803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.634860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.634889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.634947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.634960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.379 #21 NEW cov: 11636 ft: 12421 corp: 3/54b lim: 40 exec/s: 0 rss: 68Mb L: 35/35 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:02.379 [2024-07-21 11:30:31.674334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.674360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.674421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:31000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.674434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.379 #22 NEW cov: 11642 ft: 12573 corp: 4/73b lim: 40 exec/s: 0 rss: 68Mb L: 19/35 MS: 1 InsertByte- 00:07:02.379 [2024-07-21 11:30:31.714796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.714821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.714880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.714894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.714950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.714964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.715019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.715032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.379 #27 NEW cov: 11727 ft: 12785 corp: 5/108b lim: 40 exec/s: 0 rss: 68Mb L: 35/35 MS: 5 ChangeBit-CopyPart-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:02.379 [2024-07-21 11:30:31.754775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.754801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.754861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.754875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.754932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.754946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.379 #28 NEW cov: 11727 ft: 13106 corp: 6/132b lim: 40 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 InsertRepeatedBytes- 00:07:02.379 [2024-07-21 11:30:31.794891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.794916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.794975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.794992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.379 [2024-07-21 11:30:31.795050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.379 [2024-07-21 11:30:31.795063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.638 #29 NEW cov: 11727 ft: 13176 corp: 7/156b lim: 40 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 ShuffleBytes- 00:07:02.638 [2024-07-21 11:30:31.834999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.835025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.835085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:31373737 cdw11:37370000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.835099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.835158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.835171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.638 #30 NEW cov: 11727 ft: 13304 corp: 8/180b lim: 40 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 InsertRepeatedBytes- 
00:07:02.638 [2024-07-21 11:30:31.875110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.875136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.875192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.875206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.875264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.875277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.638 #31 NEW cov: 11727 ft: 13340 corp: 9/204b lim: 40 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 InsertRepeatedBytes- 00:07:02.638 [2024-07-21 11:30:31.915383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.915408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.915468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.915482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.915539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.915552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.915611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.915642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.638 #32 NEW cov: 11727 ft: 13380 corp: 10/239b lim: 40 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:02.638 [2024-07-21 11:30:31.955677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.955704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.955763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.638 [2024-07-21 11:30:31.955777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:02.638 [2024-07-21 11:30:31.955833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.639 [2024-07-21 11:30:31.955846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.639 [2024-07-21 11:30:31.955904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.639 [2024-07-21 11:30:31.955917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.639 [2024-07-21 11:30:31.955975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.639 [2024-07-21 11:30:31.955988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:02.639 #33 NEW cov: 11727 ft: 13481 corp: 11/279b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:02.639 [2024-07-21 11:30:31.995327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000fe cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.639 [2024-07-21 11:30:31.995353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.639 [2024-07-21 11:30:31.995431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:30000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.639 [2024-07-21 11:30:31.995451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.639 #34 NEW cov: 11727 ft: 13542 corp: 12/298b lim: 40 exec/s: 0 rss: 69Mb L: 19/40 MS: 1 ChangeBinInt- 00:07:02.639 [2024-07-21 11:30:32.035406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:16000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.639 [2024-07-21 11:30:32.035433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.639 [2024-07-21 11:30:32.035510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00310000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.639 [2024-07-21 11:30:32.035525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.639 #40 NEW cov: 11727 ft: 13615 corp: 13/318b lim: 40 exec/s: 0 rss: 69Mb L: 20/40 MS: 1 InsertByte- 00:07:02.898 [2024-07-21 11:30:32.076046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.076072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.076132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:02.898 [2024-07-21 11:30:32.076149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.076206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.076219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.076279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.076292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.076350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.076363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:02.898 #41 NEW cov: 11727 ft: 13633 corp: 14/358b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:02.898 [2024-07-21 11:30:32.115973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.115999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.116060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.116074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.116132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.116146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.116185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.116198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.898 #42 NEW cov: 11727 ft: 13641 corp: 15/397b lim: 40 exec/s: 0 rss: 69Mb L: 39/40 MS: 1 CopyPart- 00:07:02.898 [2024-07-21 11:30:32.156183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000700 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.156209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.156270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:02.898 [2024-07-21 11:30:32.156283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.156331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.156345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.156400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.156416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.156465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.156478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:02.898 #43 NEW cov: 11727 ft: 13721 corp: 16/437b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:02.898 [2024-07-21 11:30:32.196137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000fe cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.898 [2024-07-21 11:30:32.196162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.898 [2024-07-21 11:30:32.196220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:30000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.196234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.196308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.196322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.196379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.196392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.899 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:02.899 #44 NEW cov: 11750 ft: 13728 corp: 17/472b lim: 40 exec/s: 0 rss: 69Mb L: 35/40 MS: 1 InsertRepeatedBytes- 00:07:02.899 [2024-07-21 11:30:32.236425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.236457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.236520] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.236534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.236590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.236603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.236660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.236673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.236731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.236744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:02.899 #45 NEW cov: 11750 ft: 13744 corp: 18/512b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:02.899 [2024-07-21 11:30:32.276625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.276652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.276714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.276728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.276784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.276797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.276857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.276871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.276931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.276945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:02.899 #46 NEW cov: 11750 ft: 13759 corp: 19/552b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:02.899 [2024-07-21 
11:30:32.316549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000fe cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.316576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.316634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:30000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.316647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.316707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.316721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.899 [2024-07-21 11:30:32.316778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0000f9ff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.899 [2024-07-21 11:30:32.316791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.158 #47 NEW cov: 11750 ft: 13790 corp: 20/587b lim: 40 exec/s: 47 rss: 69Mb L: 35/40 MS: 1 ChangeBinInt- 00:07:03.158 [2024-07-21 11:30:32.356497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.158 [2024-07-21 11:30:32.356524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.158 [2024-07-21 11:30:32.356583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.158 [2024-07-21 11:30:32.356597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.158 [2024-07-21 11:30:32.356659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.158 [2024-07-21 11:30:32.356675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.158 #48 NEW cov: 11750 ft: 13819 corp: 21/613b lim: 40 exec/s: 48 rss: 69Mb L: 26/40 MS: 1 EraseBytes- 00:07:03.158 [2024-07-21 11:30:32.396629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a4242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.158 [2024-07-21 11:30:32.396655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.158 [2024-07-21 11:30:32.396713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.158 [2024-07-21 11:30:32.396727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.158 
[2024-07-21 11:30:32.396785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.158 [2024-07-21 11:30:32.396798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.158 #51 NEW cov: 11750 ft: 13847 corp: 22/638b lim: 40 exec/s: 51 rss: 69Mb L: 25/40 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:03.159 [2024-07-21 11:30:32.427058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.427083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.427160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.427174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.427232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.427246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.427305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.427318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.427377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.427391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.159 #52 NEW cov: 11750 ft: 13873 corp: 23/678b lim: 40 exec/s: 52 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:03.159 [2024-07-21 11:30:32.467117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.467142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.467199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.467213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.467288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.467305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.467364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.467377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.467436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffff5555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.467453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.159 #53 NEW cov: 11750 ft: 13878 corp: 24/718b lim: 40 exec/s: 53 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:03.159 [2024-07-21 11:30:32.507257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.507283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.507339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.507353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.507411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.507425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.507486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.507500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.507557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.507570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.159 #54 NEW cov: 11750 ft: 13905 corp: 25/758b lim: 40 exec/s: 54 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:07:03.159 [2024-07-21 11:30:32.547050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.547076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.547135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:31373737 cdw11:37370000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.547148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.159 [2024-07-21 11:30:32.547207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.159 [2024-07-21 11:30:32.547220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.159 #55 NEW cov: 11750 ft: 13932 corp: 26/782b lim: 40 exec/s: 55 rss: 70Mb L: 24/40 MS: 1 ChangeBit- 00:07:03.418 [2024-07-21 11:30:32.587298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.587326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.587387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ecececec SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.587400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.587461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ecececec cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.587475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.587532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.587545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.418 #56 NEW cov: 11750 ft: 13948 corp: 27/814b lim: 40 exec/s: 56 rss: 70Mb L: 32/40 MS: 1 InsertRepeatedBytes- 00:07:03.418 [2024-07-21 11:30:32.627436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.627465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.627542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ecececec SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.627556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.627614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ecececec cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.627628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.627686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.627700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.418 #57 NEW cov: 11750 ft: 13963 corp: 28/846b lim: 40 exec/s: 57 rss: 70Mb L: 32/40 MS: 1 ShuffleBytes- 00:07:03.418 [2024-07-21 11:30:32.667715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.667741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.667800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.667813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.667869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff61ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.667882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.667939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.667955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.668013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.668026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.418 #58 NEW cov: 11750 ft: 13984 corp: 29/886b lim: 40 exec/s: 58 rss: 70Mb L: 40/40 MS: 1 InsertByte- 00:07:03.418 [2024-07-21 11:30:32.707853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.418 [2024-07-21 11:30:32.707879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.418 [2024-07-21 11:30:32.707935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.707948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.708003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.708017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.708072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.708085] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.708143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.708156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.419 #59 NEW cov: 11750 ft: 14047 corp: 30/926b lim: 40 exec/s: 59 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:07:03.419 [2024-07-21 11:30:32.747655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.747680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.747739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.747752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.747827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.747841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.419 #60 NEW cov: 11750 ft: 14053 corp: 31/956b lim: 40 exec/s: 60 rss: 70Mb L: 30/40 MS: 1 EraseBytes- 00:07:03.419 [2024-07-21 11:30:32.788095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.788120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.788178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.788194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.788257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.788271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.788330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000065 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.788344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.788401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.788415] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.419 #61 NEW cov: 11750 ft: 14069 corp: 32/996b lim: 40 exec/s: 61 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:07:03.419 [2024-07-21 11:30:32.828242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.828268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.828343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.828357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.828419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.828432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.828496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:4b555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.828510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.419 [2024-07-21 11:30:32.828580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.419 [2024-07-21 11:30:32.828594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.677 #62 NEW cov: 11750 ft: 14091 corp: 33/1036b lim: 40 exec/s: 62 rss: 70Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:03.678 [2024-07-21 11:30:32.868309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:0000fffd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.868334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.868393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.868407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.868462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.868476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.868536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.868549] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.868606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.868619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.908420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00002800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.908448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.908510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.908524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.908581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.908594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.908648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.908661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.908717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.908730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.678 #64 NEW cov: 11750 ft: 14106 corp: 34/1076b lim: 40 exec/s: 64 rss: 70Mb L: 40/40 MS: 2 ChangeBit-ChangeBinInt- 00:07:03.678 [2024-07-21 11:30:32.948530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.948555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.948615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.948628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.948689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000055 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.948702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.948759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.948771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.948828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:55555555 cdw11:5555550a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.948843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.678 #65 NEW cov: 11750 ft: 14176 corp: 35/1116b lim: 40 exec/s: 65 rss: 70Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:03.678 [2024-07-21 11:30:32.988345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a4242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.988371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.988430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42bebdbd cdw11:bdbdbdbd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.988447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:32.988505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:c2424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:32.988518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.678 #66 NEW cov: 11750 ft: 14182 corp: 36/1141b lim: 40 exec/s: 66 rss: 70Mb L: 25/40 MS: 1 ChangeBinInt- 00:07:03.678 [2024-07-21 11:30:33.028495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000003b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:33.028520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:33.028593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:31373737 cdw11:37370000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:33.028607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:33.028663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:33.028677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.678 #67 NEW cov: 11750 ft: 14203 corp: 37/1165b lim: 40 exec/s: 67 rss: 70Mb L: 24/40 MS: 1 ChangeByte- 00:07:03.678 [2024-07-21 11:30:33.068569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000fe cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:03.678 [2024-07-21 11:30:33.068594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:33.068653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:33.068667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.678 [2024-07-21 11:30:33.068725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.678 [2024-07-21 11:30:33.068739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.678 #68 NEW cov: 11750 ft: 14207 corp: 38/1189b lim: 40 exec/s: 68 rss: 70Mb L: 24/40 MS: 1 EraseBytes- 00:07:03.936 [2024-07-21 11:30:33.108396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:000000fe cdw11:4affffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.108421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.936 #69 NEW cov: 11750 ft: 14964 corp: 39/1202b lim: 40 exec/s: 69 rss: 70Mb L: 13/40 MS: 1 CrossOver- 00:07:03.936 [2024-07-21 11:30:33.149152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.149177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.149237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.149250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.149305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.149319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.149374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.149387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.149445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.149459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.936 #70 NEW cov: 11750 ft: 14977 corp: 40/1242b lim: 40 exec/s: 70 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:07:03.936 [2024-07-21 11:30:33.189083] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.189108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.189167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.189181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.189237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.189250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.189307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.189320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.936 #71 NEW cov: 11750 ft: 15027 corp: 41/1279b lim: 40 exec/s: 71 rss: 70Mb L: 37/40 MS: 1 CopyPart- 00:07:03.936 [2024-07-21 11:30:33.229253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffff7ece SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.229278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.229350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a0112889 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.229364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.229423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.229436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.229507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.229520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.936 #72 NEW cov: 11750 ft: 15055 corp: 42/1318b lim: 40 exec/s: 72 rss: 70Mb L: 39/40 MS: 1 CMP- DE: "\377\377~\316\240\021(\211"- 00:07:03.936 [2024-07-21 11:30:33.269341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.269368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.936 
[2024-07-21 11:30:33.269426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:1c1c1c1c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.269439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.269499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:1c1c1c1c cdw11:1c1c1c1c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.269513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.269566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:1c1cffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.269579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.936 #73 NEW cov: 11750 ft: 15069 corp: 43/1356b lim: 40 exec/s: 73 rss: 70Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:07:03.936 [2024-07-21 11:30:33.309461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.309487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.309549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ecececec SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.309562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.309636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ecec0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.309650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.936 [2024-07-21 11:30:33.309707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0020ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.936 [2024-07-21 11:30:33.309720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.936 #74 NEW cov: 11750 ft: 15083 corp: 44/1388b lim: 40 exec/s: 37 rss: 70Mb L: 32/40 MS: 1 ChangeBinInt- 00:07:03.936 #74 DONE cov: 11750 ft: 15083 corp: 44/1388b lim: 40 exec/s: 37 rss: 70Mb 00:07:03.936 ###### Recommended dictionary. ###### 00:07:03.936 "\377\377~\316\240\021(\211" # Uses: 0 00:07:03.936 ###### End of recommended dictionary. 
###### 00:07:03.936 Done 74 runs in 2 second(s) 00:07:04.195 11:30:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:04.195 11:30:33 -- ../common.sh@72 -- # (( i++ )) 00:07:04.195 11:30:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:04.195 11:30:33 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:04.195 11:30:33 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:04.195 11:30:33 -- nvmf/run.sh@24 -- # local timen=1 00:07:04.195 11:30:33 -- nvmf/run.sh@25 -- # local core=0x1 00:07:04.195 11:30:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:04.195 11:30:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:04.195 11:30:33 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:04.195 11:30:33 -- nvmf/run.sh@29 -- # port=4412 00:07:04.195 11:30:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:04.195 11:30:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:04.195 11:30:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:04.195 11:30:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:04.195 [2024-07-21 11:30:33.484589] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:04.195 [2024-07-21 11:30:33.484681] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2064508 ] 00:07:04.195 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.453 [2024-07-21 11:30:33.667577] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.453 [2024-07-21 11:30:33.687982] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:04.453 [2024-07-21 11:30:33.688125] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.453 [2024-07-21 11:30:33.739650] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:04.453 [2024-07-21 11:30:33.756000] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:04.453 INFO: Running with entropic power schedule (0xFF, 100). 00:07:04.453 INFO: Seed: 1437032362 00:07:04.453 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:04.453 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:04.453 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:04.453 INFO: A corpus is not provided, starting from an empty corpus 00:07:04.453 #2 INITED exec/s: 0 rss: 60Mb 00:07:04.453 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:04.453 This may also happen if the target rejected all inputs we tried so far 00:07:04.453 [2024-07-21 11:30:33.811202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.454 [2024-07-21 11:30:33.811232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.711 NEW_FUNC[1/671]: 0x4aefc0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:04.712 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:04.712 #21 NEW cov: 11521 ft: 11521 corp: 2/12b lim: 40 exec/s: 0 rss: 68Mb L: 11/11 MS: 4 CrossOver-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:04.712 [2024-07-21 11:30:34.112278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.712 [2024-07-21 11:30:34.112315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.712 [2024-07-21 11:30:34.112382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.712 [2024-07-21 11:30:34.112399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.712 [2024-07-21 11:30:34.112470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.712 [2024-07-21 11:30:34.112487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.969 #31 NEW cov: 11634 ft: 12801 corp: 3/43b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 5 ChangeBit-ChangeBit-InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:04.969 [2024-07-21 11:30:34.151941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.969 [2024-07-21 11:30:34.151967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.969 #32 NEW cov: 11640 ft: 13005 corp: 4/54b lim: 40 exec/s: 0 rss: 68Mb L: 11/31 MS: 1 CopyPart- 00:07:04.969 [2024-07-21 11:30:34.192073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.969 [2024-07-21 11:30:34.192097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.969 #33 NEW cov: 11725 ft: 13209 corp: 5/66b lim: 40 exec/s: 0 rss: 68Mb L: 12/31 MS: 1 CrossOver- 00:07:04.969 [2024-07-21 11:30:34.232517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.969 [2024-07-21 11:30:34.232543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
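The "#N NEW cov:" records threaded through the output above and below are libFuzzer's standard status lines, and they read the same way in every run of this log. Informally (these glosses are mine, not SPDK documentation): #N is the number of executions so far, cov and ft are covered edges and features, corp is corpus units and their total byte size, lim is the current input-length cap, exec/s and rss are throughput and memory, L is this unit's length against the largest in the corpus, and MS names the mutation chain that produced the input. Under those assumptions, a one-liner like the following pulls a plot-ready "execs vs. coverage" series out of a saved log (file name illustrative):

    grep -E '#[0-9]+ NEW cov:' autotest.log \
      | sed -E 's/.*#([0-9]+) NEW cov: ([0-9]+).*/\1 \2/'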
00:07:04.969 [2024-07-21 11:30:34.232599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffd5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.969 [2024-07-21 11:30:34.232612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.969 [2024-07-21 11:30:34.232666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.969 [2024-07-21 11:30:34.232679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.969 #39 NEW cov: 11725 ft: 13297 corp: 6/97b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:04.969 [2024-07-21 11:30:34.272757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.272781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.272837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffd5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.272850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.272904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.272918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.272972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d5d5b4d5 cdw11:d5d5d5db SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.272987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.970 #40 NEW cov: 11725 ft: 13733 corp: 7/129b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 InsertByte- 00:07:04.970 [2024-07-21 11:30:34.312454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.312480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.970 #41 NEW cov: 11725 ft: 13802 corp: 8/140b lim: 40 exec/s: 0 rss: 69Mb L: 11/32 MS: 1 ChangeBinInt- 00:07:04.970 [2024-07-21 11:30:34.353012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.353036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.353089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 
11:30:34.353103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.353157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.353170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.353223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.353236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.970 #42 NEW cov: 11725 ft: 13848 corp: 9/174b lim: 40 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:07:04.970 [2024-07-21 11:30:34.393107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.393132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.393188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.393202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.393256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.393269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.970 [2024-07-21 11:30:34.393320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d5d5d5b4 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.970 [2024-07-21 11:30:34.393333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.228 #48 NEW cov: 11725 ft: 13948 corp: 10/207b lim: 40 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 InsertByte- 00:07:05.228 [2024-07-21 11:30:34.432778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:2a0affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.432803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.228 #49 NEW cov: 11725 ft: 14006 corp: 11/221b lim: 40 exec/s: 0 rss: 69Mb L: 14/34 MS: 1 CrossOver- 00:07:05.228 [2024-07-21 11:30:34.473359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0aff5f5f cdw11:5f5f5f5f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.473384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.473438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:5f5f5f5f cdw11:5f5f5f5f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.473458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.473513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:5f5f5f5f cdw11:5f5f5f5f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.473527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.473580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:5f5f5f5f cdw11:5f5f5f5f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.473592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.228 #51 NEW cov: 11725 ft: 14031 corp: 12/259b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:05.228 [2024-07-21 11:30:34.513446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.513471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.513527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.513540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.513593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.513606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.513655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.513668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.228 #52 NEW cov: 11725 ft: 14064 corp: 13/293b lim: 40 exec/s: 0 rss: 69Mb L: 34/38 MS: 1 ChangeBinInt- 00:07:05.228 [2024-07-21 11:30:34.553591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:2a0affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.553617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.553672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff5a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.553686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.553739] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.553751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.553809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.553822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.228 #53 NEW cov: 11725 ft: 14076 corp: 14/330b lim: 40 exec/s: 0 rss: 69Mb L: 37/38 MS: 1 CrossOver- 00:07:05.228 [2024-07-21 11:30:34.593208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:00000025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.593232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.228 #54 NEW cov: 11725 ft: 14118 corp: 15/341b lim: 40 exec/s: 0 rss: 69Mb L: 11/38 MS: 1 ChangeByte- 00:07:05.228 [2024-07-21 11:30:34.633824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.633851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.633909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.633923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.633978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.633991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.228 [2024-07-21 11:30:34.634046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.228 [2024-07-21 11:30:34.634059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.486 #55 NEW cov: 11725 ft: 14163 corp: 16/375b lim: 40 exec/s: 0 rss: 69Mb L: 34/38 MS: 1 ShuffleBytes- 00:07:05.486 [2024-07-21 11:30:34.673633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:2a0affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.673659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.673715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.673729] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.486 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:05.486 #56 NEW cov: 11748 ft: 14371 corp: 17/394b lim: 40 exec/s: 0 rss: 70Mb L: 19/38 MS: 1 CrossOver- 00:07:05.486 [2024-07-21 11:30:34.713573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff29ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.713598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.486 #57 NEW cov: 11748 ft: 14382 corp: 18/406b lim: 40 exec/s: 0 rss: 70Mb L: 12/38 MS: 1 ChangeByte- 00:07:05.486 [2024-07-21 11:30:34.753719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:000006ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.753744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.486 #58 NEW cov: 11748 ft: 14397 corp: 19/415b lim: 40 exec/s: 0 rss: 70Mb L: 9/38 MS: 1 EraseBytes- 00:07:05.486 [2024-07-21 11:30:34.784383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.784408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.784464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.784478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.784530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.784544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.784596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.784609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.784659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.784672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:05.486 #59 NEW cov: 11748 ft: 14478 corp: 20/455b lim: 40 exec/s: 59 rss: 70Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:05.486 [2024-07-21 11:30:34.824029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffff00 cdw11:00002506 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.824054] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.824125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.824139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.486 #60 NEW cov: 11748 ft: 14497 corp: 21/471b lim: 40 exec/s: 60 rss: 70Mb L: 16/40 MS: 1 CrossOver- 00:07:05.486 [2024-07-21 11:30:34.864458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.864483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.864541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff2fff cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.864554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.864608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.864622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.864676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d5d5d5b4 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.864689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.486 #61 NEW cov: 11748 ft: 14536 corp: 22/504b lim: 40 exec/s: 61 rss: 70Mb L: 33/40 MS: 1 InsertByte- 00:07:05.486 [2024-07-21 11:30:34.904257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffff00 cdw11:00002506 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.904282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.486 [2024-07-21 11:30:34.904335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.486 [2024-07-21 11:30:34.904349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.744 #62 NEW cov: 11748 ft: 14577 corp: 23/520b lim: 40 exec/s: 62 rss: 70Mb L: 16/40 MS: 1 ShuffleBytes- 00:07:05.744 [2024-07-21 11:30:34.944720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:34.944746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:34.944801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 
nsid:0 cdw10:ffffffdb cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:34.944814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:34.944863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:34.944893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:34.944947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:34.944960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.744 #63 NEW cov: 11748 ft: 14597 corp: 24/556b lim: 40 exec/s: 63 rss: 70Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:07:05.744 [2024-07-21 11:30:34.984341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:34.984366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.744 #64 NEW cov: 11748 ft: 14604 corp: 25/568b lim: 40 exec/s: 64 rss: 70Mb L: 12/40 MS: 1 ShuffleBytes- 00:07:05.744 [2024-07-21 11:30:35.024956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.024981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:35.025036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffd5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.025049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:35.025103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.025115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:35.025170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d5d5d5b4 cdw11:d5b4d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.025185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.744 #65 NEW cov: 11748 ft: 14617 corp: 26/603b lim: 40 exec/s: 65 rss: 70Mb L: 35/40 MS: 1 CopyPart- 00:07:05.744 [2024-07-21 11:30:35.065037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.065063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:05.744 [2024-07-21 11:30:35.065117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffdb cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.065131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:35.065184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.065199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:35.065252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.065265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.744 #66 NEW cov: 11748 ft: 14636 corp: 27/641b lim: 40 exec/s: 66 rss: 70Mb L: 38/40 MS: 1 CrossOver- 00:07:05.744 [2024-07-21 11:30:35.105168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.744 [2024-07-21 11:30:35.105194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.744 [2024-07-21 11:30:35.105252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.745 [2024-07-21 11:30:35.105266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.745 [2024-07-21 11:30:35.105319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.745 [2024-07-21 11:30:35.105333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.745 [2024-07-21 11:30:35.105387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00210000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.745 [2024-07-21 11:30:35.105400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.745 #67 NEW cov: 11748 ft: 14652 corp: 28/675b lim: 40 exec/s: 67 rss: 70Mb L: 34/40 MS: 1 ChangeByte- 00:07:05.745 [2024-07-21 11:30:35.145125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.745 [2024-07-21 11:30:35.145151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.745 [2024-07-21 11:30:35.145206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.745 [2024-07-21 11:30:35.145220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:05.745 [2024-07-21 11:30:35.145275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.745 [2024-07-21 11:30:35.145291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.745 #68 NEW cov: 11748 ft: 14665 corp: 29/706b lim: 40 exec/s: 68 rss: 70Mb L: 31/40 MS: 1 InsertRepeatedBytes- 00:07:06.003 [2024-07-21 11:30:35.185105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.185131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.003 [2024-07-21 11:30:35.185185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.185198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.003 #69 NEW cov: 11748 ft: 14672 corp: 30/725b lim: 40 exec/s: 69 rss: 70Mb L: 19/40 MS: 1 EraseBytes- 00:07:06.003 [2024-07-21 11:30:35.225108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff2e0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.225133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.003 #70 NEW cov: 11748 ft: 14712 corp: 31/737b lim: 40 exec/s: 70 rss: 70Mb L: 12/40 MS: 1 InsertByte- 00:07:06.003 [2024-07-21 11:30:35.265329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffff00 cdw11:00402506 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.265353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.003 [2024-07-21 11:30:35.265408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.265421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.003 #71 NEW cov: 11748 ft: 14742 corp: 32/753b lim: 40 exec/s: 71 rss: 70Mb L: 16/40 MS: 1 ChangeBit- 00:07:06.003 [2024-07-21 11:30:35.305460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffff00 cdw11:0000d706 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.305486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.003 [2024-07-21 11:30:35.305540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.305554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.003 #72 NEW cov: 11748 ft: 14752 corp: 33/769b lim: 40 exec/s: 72 rss: 70Mb L: 16/40 MS: 
1 ChangeBinInt- 00:07:06.003 [2024-07-21 11:30:35.345571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.345597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.003 [2024-07-21 11:30:35.345652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.345665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.003 #73 NEW cov: 11748 ft: 14787 corp: 34/787b lim: 40 exec/s: 73 rss: 70Mb L: 18/40 MS: 1 EraseBytes- 00:07:06.003 [2024-07-21 11:30:35.385542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:fb010000 cdw11:000006ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.385573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.003 #74 NEW cov: 11748 ft: 14807 corp: 35/796b lim: 40 exec/s: 74 rss: 70Mb L: 9/40 MS: 1 ChangeBit- 00:07:06.003 [2024-07-21 11:30:35.425866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:282a0aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.425892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.003 [2024-07-21 11:30:35.425946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffd5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.003 [2024-07-21 11:30:35.425960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.291 #75 NEW cov: 11748 ft: 14816 corp: 36/816b lim: 40 exec/s: 75 rss: 70Mb L: 20/40 MS: 1 InsertByte- 00:07:06.291 [2024-07-21 11:30:35.466239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:012f8f7c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.466264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.466319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:fad1d5c6 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.466332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.466383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.466396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.466451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d5d5d5b4 cdw11:d5b4d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 
[2024-07-21 11:30:35.466465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.291 #76 NEW cov: 11748 ft: 14825 corp: 37/851b lim: 40 exec/s: 76 rss: 70Mb L: 35/40 MS: 1 CMP- DE: "\001/\217|\372\321\325\306"- 00:07:06.291 [2024-07-21 11:30:35.506361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.506387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.506445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.506459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.506514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.506543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.506598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00260000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.506611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.291 #77 NEW cov: 11748 ft: 14842 corp: 38/883b lim: 40 exec/s: 77 rss: 70Mb L: 32/40 MS: 1 InsertByte- 00:07:06.291 [2024-07-21 11:30:35.546451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0abfffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.546479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.546534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffdb cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.546548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.546601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.546614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.546666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.546679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.291 #78 NEW cov: 11748 ft: 14856 corp: 39/921b lim: 40 exec/s: 78 rss: 70Mb L: 38/40 MS: 1 ChangeBit- 00:07:06.291 [2024-07-21 11:30:35.586595] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.586620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.586675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.586689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.586739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.586752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.586805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.586818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.291 #79 NEW cov: 11748 ft: 14867 corp: 40/955b lim: 40 exec/s: 79 rss: 70Mb L: 34/40 MS: 1 ShuffleBytes- 00:07:06.291 [2024-07-21 11:30:35.626715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.626739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.626794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.626807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.626858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.626871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.626922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.626938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.291 #80 NEW cov: 11748 ft: 14889 corp: 41/989b lim: 40 exec/s: 80 rss: 70Mb L: 34/40 MS: 1 CopyPart- 00:07:06.291 [2024-07-21 11:30:35.666673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0abfffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.666698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 
11:30:35.666753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.666766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.291 [2024-07-21 11:30:35.666820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.291 [2024-07-21 11:30:35.666833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.291 #81 NEW cov: 11748 ft: 14902 corp: 42/1020b lim: 40 exec/s: 81 rss: 70Mb L: 31/40 MS: 1 EraseBytes- 00:07:06.578 [2024-07-21 11:30:35.707029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.707054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.707110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:012f8f7c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.707123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.707175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:fad1d5c6 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.707189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.707243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.707256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.578 #82 NEW cov: 11748 ft: 14907 corp: 43/1059b lim: 40 exec/s: 82 rss: 70Mb L: 39/40 MS: 1 PersAutoDict- DE: "\001/\217|\372\321\325\306"- 00:07:06.578 [2024-07-21 11:30:35.747078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.747104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.747158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.747171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.747225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.747238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.747288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.747304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.578 #83 NEW cov: 11748 ft: 14918 corp: 44/1093b lim: 40 exec/s: 83 rss: 70Mb L: 34/40 MS: 1 ShuffleBytes- 00:07:06.578 [2024-07-21 11:30:35.787137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:5a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.787162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.787215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.787229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.787281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:2f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.787294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.578 [2024-07-21 11:30:35.787347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.578 [2024-07-21 11:30:35.787360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.578 #84 NEW cov: 11748 ft: 14942 corp: 45/1128b lim: 40 exec/s: 42 rss: 70Mb L: 35/40 MS: 1 InsertByte- 00:07:06.578 #84 DONE cov: 11748 ft: 14942 corp: 45/1128b lim: 40 exec/s: 42 rss: 70Mb 00:07:06.578 ###### Recommended dictionary. ###### 00:07:06.578 "\001/\217|\372\321\325\306" # Uses: 1 00:07:06.578 ###### End of recommended dictionary. 
###### 00:07:06.578 Done 84 runs in 2 second(s) 00:07:06.578 11:30:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:06.578 11:30:35 -- ../common.sh@72 -- # (( i++ )) 00:07:06.578 11:30:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:06.578 11:30:35 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:06.578 11:30:35 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:06.578 11:30:35 -- nvmf/run.sh@24 -- # local timen=1 00:07:06.578 11:30:35 -- nvmf/run.sh@25 -- # local core=0x1 00:07:06.578 11:30:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:06.578 11:30:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:06.579 11:30:35 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:06.579 11:30:35 -- nvmf/run.sh@29 -- # port=4413 00:07:06.579 11:30:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:06.579 11:30:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:06.579 11:30:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:06.579 11:30:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:07:06.579 [2024-07-21 11:30:35.963411] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:06.579 [2024-07-21 11:30:35.963511] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2065047 ] 00:07:06.579 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.837 [2024-07-21 11:30:36.140272] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.837 [2024-07-21 11:30:36.160114] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:06.837 [2024-07-21 11:30:36.160254] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.837 [2024-07-21 11:30:36.211779] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:06.837 [2024-07-21 11:30:36.228115] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:06.838 INFO: Running with entropic power schedule (0xFF, 100). 00:07:06.838 INFO: Seed: 3908029415 00:07:06.838 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:06.838 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:06.838 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:06.838 INFO: A corpus is not provided, starting from an empty corpus 00:07:06.838 #2 INITED exec/s: 0 rss: 60Mb 00:07:06.838 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:06.838 This may also happen if the target rejected all inputs we tried so far 00:07:07.096 [2024-07-21 11:30:36.273304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.097 [2024-07-21 11:30:36.273332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.361 NEW_FUNC[1/670]: 0x4b0b80 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:07.361 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:07.361 #5 NEW cov: 11509 ft: 11510 corp: 2/12b lim: 40 exec/s: 0 rss: 68Mb L: 11/11 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:07.361 [2024-07-21 11:30:36.574011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.361 [2024-07-21 11:30:36.574043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.361 #11 NEW cov: 11622 ft: 12045 corp: 3/23b lim: 40 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeBit- 00:07:07.361 [2024-07-21 11:30:36.614063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.361 [2024-07-21 11:30:36.614089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.361 #12 NEW cov: 11628 ft: 12235 corp: 4/34b lim: 40 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeByte- 00:07:07.361 [2024-07-21 11:30:36.654312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.361 [2024-07-21 11:30:36.654338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.361 [2024-07-21 11:30:36.654394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fbffffff cdw11:ffff32ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.361 [2024-07-21 11:30:36.654408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.361 #13 NEW cov: 11713 ft: 12755 corp: 5/53b lim: 40 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CrossOver- 00:07:07.361 [2024-07-21 11:30:36.694337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.361 [2024-07-21 11:30:36.694362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.361 #14 NEW cov: 11713 ft: 12898 corp: 6/65b lim: 40 exec/s: 0 rss: 68Mb L: 12/19 MS: 1 CopyPart- 00:07:07.361 [2024-07-21 11:30:36.734418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a3fff cdw11:fbffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.361 [2024-07-21 11:30:36.734452] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.361 #15 NEW cov: 11713 ft: 13052 corp: 7/77b lim: 40 exec/s: 0 rss: 68Mb L: 12/19 MS: 1 InsertByte- 00:07:07.361 [2024-07-21 11:30:36.764638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.361 [2024-07-21 11:30:36.764662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.361 [2024-07-21 11:30:36.764735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff32ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.361 [2024-07-21 11:30:36.764749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.620 #16 NEW cov: 11713 ft: 13086 corp: 8/96b lim: 40 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 CrossOver- 00:07:07.620 [2024-07-21 11:30:36.804665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:630afffb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.620 [2024-07-21 11:30:36.804690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.620 #19 NEW cov: 11713 ft: 13173 corp: 9/107b lim: 40 exec/s: 0 rss: 69Mb L: 11/19 MS: 3 ChangeByte-CopyPart-CrossOver- 00:07:07.621 [2024-07-21 11:30:36.844755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:36.844780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.621 #20 NEW cov: 11713 ft: 13254 corp: 10/119b lim: 40 exec/s: 0 rss: 69Mb L: 12/19 MS: 1 CMP- DE: "\020\000\000\000"- 00:07:07.621 [2024-07-21 11:30:36.884931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff26 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:36.884956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.621 #21 NEW cov: 11713 ft: 13357 corp: 11/131b lim: 40 exec/s: 0 rss: 69Mb L: 12/19 MS: 1 InsertByte- 00:07:07.621 [2024-07-21 11:30:36.925029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:29ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:36.925056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.621 #22 NEW cov: 11713 ft: 13370 corp: 12/142b lim: 40 exec/s: 0 rss: 69Mb L: 11/19 MS: 1 ChangeByte- 00:07:07.621 [2024-07-21 11:30:36.955498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:36.955523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.621 [2024-07-21 11:30:36.955599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:36.955613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.621 [2024-07-21 11:30:36.955670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:36.955683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.621 [2024-07-21 11:30:36.955750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:36.955766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.621 #23 NEW cov: 11713 ft: 13899 corp: 13/181b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:07.621 [2024-07-21 11:30:36.995216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:630affff cdw11:fffbffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:36.995242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.621 #24 NEW cov: 11713 ft: 13917 corp: 14/192b lim: 40 exec/s: 0 rss: 69Mb L: 11/39 MS: 1 ShuffleBytes- 00:07:07.621 [2024-07-21 11:30:37.035312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffffef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.621 [2024-07-21 11:30:37.035338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.880 #25 NEW cov: 11713 ft: 13932 corp: 15/203b lim: 40 exec/s: 0 rss: 69Mb L: 11/39 MS: 1 ChangeBit- 00:07:07.880 [2024-07-21 11:30:37.075851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.075876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.880 [2024-07-21 11:30:37.075935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.075948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.880 [2024-07-21 11:30:37.076003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.076017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.880 [2024-07-21 11:30:37.076071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.076084] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.880 #26 NEW cov: 11713 ft: 13961 corp: 16/242b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 ChangeASCIIInt- 00:07:07.880 [2024-07-21 11:30:37.115579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:2dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.115604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.880 #27 NEW cov: 11713 ft: 13972 corp: 17/253b lim: 40 exec/s: 0 rss: 69Mb L: 11/39 MS: 1 ChangeByte- 00:07:07.880 [2024-07-21 11:30:37.155713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff26 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.155739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.880 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:07.880 #28 NEW cov: 11736 ft: 14002 corp: 18/265b lim: 40 exec/s: 0 rss: 70Mb L: 12/39 MS: 1 ChangeBinInt- 00:07:07.880 [2024-07-21 11:30:37.195850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.195875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.880 #29 NEW cov: 11736 ft: 14015 corp: 19/276b lim: 40 exec/s: 0 rss: 70Mb L: 11/39 MS: 1 ChangeBit- 00:07:07.880 [2024-07-21 11:30:37.225898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.225922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.880 #30 NEW cov: 11736 ft: 14037 corp: 20/288b lim: 40 exec/s: 0 rss: 70Mb L: 12/39 MS: 1 ChangeBinInt- 00:07:07.880 [2024-07-21 11:30:37.266007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff00 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.880 [2024-07-21 11:30:37.266032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.880 #31 NEW cov: 11736 ft: 14048 corp: 21/300b lim: 40 exec/s: 31 rss: 70Mb L: 12/39 MS: 1 ChangeBinInt- 00:07:08.139 [2024-07-21 11:30:37.306290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.306316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.139 [2024-07-21 11:30:37.306372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff33ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.306386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.139 #32 NEW cov: 11736 ft: 
14066 corp: 22/319b lim: 40 exec/s: 32 rss: 70Mb L: 19/39 MS: 1 ChangeASCIIInt- 00:07:08.139 [2024-07-21 11:30:37.346235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff40 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.346260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.139 #33 NEW cov: 11736 ft: 14084 corp: 23/330b lim: 40 exec/s: 33 rss: 70Mb L: 11/39 MS: 1 ChangeByte- 00:07:08.139 [2024-07-21 11:30:37.376742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.376768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.139 [2024-07-21 11:30:37.376824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.376838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.139 [2024-07-21 11:30:37.376907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.376921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.139 [2024-07-21 11:30:37.376976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.376989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.139 #34 NEW cov: 11736 ft: 14105 corp: 24/369b lim: 40 exec/s: 34 rss: 70Mb L: 39/39 MS: 1 ChangeASCIIInt- 00:07:08.139 [2024-07-21 11:30:37.416461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a3fff cdw11:fbffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.416489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.139 #35 NEW cov: 11736 ft: 14130 corp: 25/381b lim: 40 exec/s: 35 rss: 70Mb L: 12/39 MS: 1 PersAutoDict- DE: "\020\000\000\000"- 00:07:08.139 [2024-07-21 11:30:37.456519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a0aff cdw11:26ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.456543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.139 #36 NEW cov: 11736 ft: 14174 corp: 26/394b lim: 40 exec/s: 36 rss: 70Mb L: 13/39 MS: 1 CrossOver- 00:07:08.139 [2024-07-21 11:30:37.496671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a7efb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.496696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:08.139 #37 NEW cov: 11736 ft: 14178 corp: 27/405b lim: 40 exec/s: 37 rss: 70Mb L: 11/39 MS: 1 ChangeByte- 00:07:08.139 [2024-07-21 11:30:37.526761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff3a cdw11:26ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.139 [2024-07-21 11:30:37.526785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.139 #38 NEW cov: 11736 ft: 14210 corp: 28/418b lim: 40 exec/s: 38 rss: 70Mb L: 13/39 MS: 1 InsertByte- 00:07:08.398 [2024-07-21 11:30:37.566846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff3a cdw11:26ffdfff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.398 [2024-07-21 11:30:37.566871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.398 #39 NEW cov: 11736 ft: 14228 corp: 29/431b lim: 40 exec/s: 39 rss: 70Mb L: 13/39 MS: 1 ChangeBit- 00:07:08.398 [2024-07-21 11:30:37.606969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff40 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.398 [2024-07-21 11:30:37.606993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.398 #40 NEW cov: 11736 ft: 14234 corp: 30/440b lim: 40 exec/s: 40 rss: 70Mb L: 9/39 MS: 1 EraseBytes- 00:07:08.398 [2024-07-21 11:30:37.647135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.398 [2024-07-21 11:30:37.647160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.398 #41 NEW cov: 11736 ft: 14244 corp: 31/452b lim: 40 exec/s: 41 rss: 70Mb L: 12/39 MS: 1 CopyPart- 00:07:08.398 [2024-07-21 11:30:37.677282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:fffbffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.398 [2024-07-21 11:30:37.677307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.398 [2024-07-21 11:30:37.677366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff32 cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.398 [2024-07-21 11:30:37.677379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.398 #42 NEW cov: 11736 ft: 14256 corp: 32/475b lim: 40 exec/s: 42 rss: 70Mb L: 23/39 MS: 1 CrossOver- 00:07:08.398 [2024-07-21 11:30:37.717238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:29ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.398 [2024-07-21 11:30:37.717262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.398 #43 NEW cov: 11736 ft: 14261 corp: 33/486b lim: 40 exec/s: 43 rss: 70Mb L: 11/39 MS: 1 CrossOver- 00:07:08.398 [2024-07-21 11:30:37.747343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff00 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.398 [2024-07-21 11:30:37.747368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.398 #44 NEW cov: 11736 ft: 14281 corp: 34/498b lim: 40 exec/s: 44 rss: 70Mb L: 12/39 MS: 1 CopyPart- 00:07:08.398 [2024-07-21 11:30:37.787456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a3fff cdw11:fbffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.398 [2024-07-21 11:30:37.787481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.398 #45 NEW cov: 11736 ft: 14292 corp: 35/510b lim: 40 exec/s: 45 rss: 70Mb L: 12/39 MS: 1 CopyPart- 00:07:08.657 [2024-07-21 11:30:37.827783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.827807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.657 [2024-07-21 11:30:37.827881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff34ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.827895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.657 #46 NEW cov: 11736 ft: 14298 corp: 36/529b lim: 40 exec/s: 46 rss: 70Mb L: 19/39 MS: 1 ChangeASCIIInt- 00:07:08.657 [2024-07-21 11:30:37.867779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0afffb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.867803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.657 #47 NEW cov: 11736 ft: 14303 corp: 37/541b lim: 40 exec/s: 47 rss: 70Mb L: 12/39 MS: 1 CMP- DE: "\001\000"- 00:07:08.657 [2024-07-21 11:30:37.898226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a1aff0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.898251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.657 [2024-07-21 11:30:37.898310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.898323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.657 [2024-07-21 11:30:37.898395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.898409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.657 [2024-07-21 11:30:37.898470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.898484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.657 #48 NEW cov: 11736 ft: 14314 corp: 38/580b lim: 40 exec/s: 48 rss: 70Mb L: 39/39 MS: 1 ChangeBit- 00:07:08.657 [2024-07-21 11:30:37.937947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a0aff cdw11:26ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.937971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.657 #49 NEW cov: 11736 ft: 14327 corp: 39/594b lim: 40 exec/s: 49 rss: 70Mb L: 14/39 MS: 1 CopyPart- 00:07:08.657 [2024-07-21 11:30:37.978099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:020aff26 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:37.978125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.657 #50 NEW cov: 11736 ft: 14342 corp: 40/606b lim: 40 exec/s: 50 rss: 70Mb L: 12/39 MS: 1 ChangeBit- 00:07:08.657 [2024-07-21 11:30:38.008154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a0a0d cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:38.008179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.657 [2024-07-21 11:30:38.038259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0d00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:38.038284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.657 #52 NEW cov: 11736 ft: 14348 corp: 41/619b lim: 40 exec/s: 52 rss: 70Mb L: 13/39 MS: 2 ChangeBinInt-CopyPart- 00:07:08.657 [2024-07-21 11:30:38.068366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:630affff cdw11:fffbffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.657 [2024-07-21 11:30:38.068391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.917 #53 NEW cov: 11736 ft: 14352 corp: 42/630b lim: 40 exec/s: 53 rss: 70Mb L: 11/39 MS: 1 ChangeASCIIInt- 00:07:08.917 [2024-07-21 11:30:38.108732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.108758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.917 [2024-07-21 11:30:38.108817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff0e0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.108830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.917 [2024-07-21 11:30:38.108890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff 
cdw11:ff32ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.108903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.917 #54 NEW cov: 11736 ft: 14535 corp: 43/657b lim: 40 exec/s: 54 rss: 70Mb L: 27/39 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\016"- 00:07:08.917 [2024-07-21 11:30:38.148627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff33 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.148652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.917 #55 NEW cov: 11736 ft: 14539 corp: 44/670b lim: 40 exec/s: 55 rss: 70Mb L: 13/39 MS: 1 EraseBytes- 00:07:08.917 [2024-07-21 11:30:38.188735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:10000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.188761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.917 #56 NEW cov: 11736 ft: 14552 corp: 45/678b lim: 40 exec/s: 56 rss: 70Mb L: 8/39 MS: 1 EraseBytes- 00:07:08.917 [2024-07-21 11:30:38.229274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aff0a cdw11:0afffbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.229304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.917 [2024-07-21 11:30:38.229359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.229373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.917 [2024-07-21 11:30:38.229426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.229439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.917 [2024-07-21 11:30:38.229497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.229510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.917 #57 NEW cov: 11736 ft: 14626 corp: 46/717b lim: 40 exec/s: 57 rss: 70Mb L: 39/39 MS: 1 ChangeBit- 00:07:08.917 [2024-07-21 11:30:38.268971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.917 [2024-07-21 11:30:38.268996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.918 #58 NEW cov: 11736 ft: 14631 corp: 47/726b lim: 40 exec/s: 29 rss: 71Mb L: 9/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\016"- 00:07:08.918 #58 DONE cov: 11736 ft: 14631 corp: 47/726b lim: 40 
exec/s: 29 rss: 71Mb 00:07:08.918 ###### Recommended dictionary. ###### 00:07:08.918 "\020\000\000\000" # Uses: 1 00:07:08.918 "\001\000" # Uses: 0 00:07:08.918 "\377\377\377\377\377\377\377\016" # Uses: 1 00:07:08.918 ###### End of recommended dictionary. ###### 00:07:08.918 Done 58 runs in 2 second(s) 00:07:09.177 11:30:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:07:09.177 11:30:38 -- ../common.sh@72 -- # (( i++ )) 00:07:09.177 11:30:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:09.177 11:30:38 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:09.177 11:30:38 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:09.177 11:30:38 -- nvmf/run.sh@24 -- # local timen=1 00:07:09.177 11:30:38 -- nvmf/run.sh@25 -- # local core=0x1 00:07:09.177 11:30:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:09.177 11:30:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:09.177 11:30:38 -- nvmf/run.sh@29 -- # printf %02d 14 00:07:09.177 11:30:38 -- nvmf/run.sh@29 -- # port=4414 00:07:09.177 11:30:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:09.177 11:30:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:09.177 11:30:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:09.177 11:30:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:07:09.177 [2024-07-21 11:30:38.452354] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:09.177 [2024-07-21 11:30:38.452458] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2065459 ] 00:07:09.177 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.436 [2024-07-21 11:30:38.631132] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.436 [2024-07-21 11:30:38.652264] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:09.436 [2024-07-21 11:30:38.652393] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.436 [2024-07-21 11:30:38.704238] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:09.436 [2024-07-21 11:30:38.720575] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:09.436 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:09.436 INFO: Seed: 2107065518 00:07:09.436 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:09.436 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:09.436 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:09.436 INFO: A corpus is not provided, starting from an empty corpus 00:07:09.436 #2 INITED exec/s: 0 rss: 60Mb 00:07:09.436 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:09.436 This may also happen if the target rejected all inputs we tried so far 00:07:09.436 [2024-07-21 11:30:38.776204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.436 [2024-07-21 11:30:38.776235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.436 [2024-07-21 11:30:38.776293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.436 [2024-07-21 11:30:38.776309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.436 [2024-07-21 11:30:38.776365] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.436 [2024-07-21 11:30:38.776380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.436 [2024-07-21 11:30:38.776433] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.436 [2024-07-21 11:30:38.776452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.695 NEW_FUNC[1/672]: 0x4b2740 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:09.695 NEW_FUNC[2/672]: 0x4d3ae0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:09.695 #13 NEW cov: 11513 ft: 11514 corp: 2/32b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:09.695 [2024-07-21 11:30:39.086980] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.695 [2024-07-21 11:30:39.087012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.695 [2024-07-21 11:30:39.087091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.695 [2024-07-21 11:30:39.087108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.695 [2024-07-21 11:30:39.087171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.695 [2024-07-21 11:30:39.087186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.695 #14 NEW cov: 11626 ft: 12326 corp: 3/55b lim: 35 exec/s: 0 rss: 68Mb L: 23/31 MS: 1 CrossOver- 00:07:09.965 [2024-07-21 11:30:39.127431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.127469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.127531] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.127546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.127609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.127625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.127683] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.127700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.965 #15 NEW cov: 11632 ft: 12501 corp: 4/86b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeByte- 00:07:09.965 [2024-07-21 11:30:39.167312] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.167340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.167405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.167421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.167483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.167498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.167532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.167546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.965 #16 NEW cov: 11717 ft: 12827 corp: 5/117b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeBit- 00:07:09.965 [2024-07-21 11:30:39.207428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.207460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.207541] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.207559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.207621] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.207637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.207696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.207712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.965 #22 NEW cov: 11717 ft: 12881 corp: 6/149b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 InsertByte- 00:07:09.965 [2024-07-21 11:30:39.247555] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.247582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.247646] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.247663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.247721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.247737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.247802] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.247817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.965 #23 NEW cov: 11717 ft: 12956 corp: 7/180b lim: 35 exec/s: 0 rss: 68Mb L: 31/32 MS: 1 ChangeBinInt- 00:07:09.965 [2024-07-21 11:30:39.287691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000e9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.287719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.287781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.287798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.287856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.287872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.287932] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.287947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.965 #24 NEW cov: 11717 ft: 13056 corp: 8/211b lim: 35 exec/s: 0 rss: 68Mb L: 31/32 MS: 1 ChangeByte- 00:07:09.965 [2024-07-21 11:30:39.327793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.327820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.327883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.327899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.327961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.327975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.328040] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.328053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.965 #25 NEW cov: 11724 ft: 13126 corp: 9/242b lim: 35 exec/s: 0 rss: 68Mb L: 31/32 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:09.965 [2024-07-21 11:30:39.367942] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.367970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.368032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.368048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.965 [2024-07-21 11:30:39.368125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.965 [2024-07-21 11:30:39.368142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.966 [2024-07-21 11:30:39.368203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.966 [2024-07-21 
11:30:39.368219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.966 #26 NEW cov: 11724 ft: 13170 corp: 10/273b lim: 35 exec/s: 0 rss: 68Mb L: 31/32 MS: 1 ChangeByte- 00:07:10.225 [2024-07-21 11:30:39.408209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.225 [2024-07-21 11:30:39.408238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.225 [2024-07-21 11:30:39.408315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.225 [2024-07-21 11:30:39.408329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.225 [2024-07-21 11:30:39.408390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.225 [2024-07-21 11:30:39.408407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.225 [2024-07-21 11:30:39.408469] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.225 [2024-07-21 11:30:39.408485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.225 [2024-07-21 11:30:39.408545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.225 [2024-07-21 11:30:39.408560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:10.225 #27 NEW cov: 11724 ft: 13278 corp: 11/308b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:10.225 [2024-07-21 11:30:39.448292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.225 [2024-07-21 11:30:39.448319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.225 [2024-07-21 11:30:39.448382] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.225 [2024-07-21 11:30:39.448397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.225 [2024-07-21 11:30:39.448464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.448480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.448538] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.448553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.448613] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.448630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:10.226 #28 NEW cov: 11724 ft: 13282 corp: 12/343b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:10.226 [2024-07-21 11:30:39.488431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000e9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.488462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.488527] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.488542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.488604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.488620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.488682] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.488698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.488759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.488775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:10.226 #29 NEW cov: 11724 ft: 13338 corp: 13/378b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:10.226 [2024-07-21 11:30:39.528383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.528410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.528476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.528493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.528556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.528571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.528628] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.528642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.226 #30 NEW cov: 11724 ft: 13376 corp: 14/409b lim: 35 exec/s: 0 rss: 69Mb L: 31/35 MS: 1 ChangeByte- 00:07:10.226 [2024-07-21 11:30:39.568433] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.568464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.568528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.568543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.568607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.568622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.568683] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.568700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.226 #31 NEW cov: 11724 ft: 13463 corp: 15/442b lim: 35 exec/s: 0 rss: 69Mb L: 33/35 MS: 1 InsertByte- 00:07:10.226 [2024-07-21 11:30:39.608599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.608627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.608708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.608725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.608786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.608803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.608863] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.608879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.226 #32 NEW cov: 11724 ft: 13528 corp: 16/473b lim: 35 exec/s: 0 rss: 69Mb L: 31/35 MS: 1 ChangeByte- 00:07:10.226 [2024-07-21 11:30:39.648755] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.648783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.648848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.648865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.648931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.648946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.226 [2024-07-21 11:30:39.649006] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.226 [2024-07-21 11:30:39.649025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.485 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:10.486 #33 NEW cov: 11747 ft: 13636 corp: 17/504b lim: 35 exec/s: 0 rss: 69Mb L: 31/35 MS: 1 ShuffleBytes- 00:07:10.486 [2024-07-21 11:30:39.688840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.688867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.688931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.688948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.689012] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.689028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.689084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.689100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.486 #34 NEW cov: 11747 ft: 13668 corp: 18/533b lim: 35 exec/s: 0 rss: 69Mb L: 29/35 MS: 1 EraseBytes- 00:07:10.486 [2024-07-21 11:30:39.728950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.728978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.729043] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.729060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.729121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.729137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.729195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.729211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.486 #35 NEW cov: 11747 ft: 13691 corp: 19/564b lim: 35 exec/s: 0 rss: 69Mb L: 31/35 MS: 1 ChangeBinInt- 00:07:10.486 [2024-07-21 11:30:39.769066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.769095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.769158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.769173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.769233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.769252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.769313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.769329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.486 #36 NEW cov: 11747 ft: 13740 corp: 20/597b lim: 35 exec/s: 36 rss: 69Mb L: 33/35 MS: 1 ChangeByte- 00:07:10.486 [2024-07-21 11:30:39.809234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.809263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.809327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.809343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.809404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.809420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.486 NEW_FUNC[1/1]: 0x11719a0 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1651 00:07:10.486 #37 NEW cov: 11770 ft: 13837 corp: 21/628b lim: 35 exec/s: 37 rss: 69Mb L: 31/35 MS: 1 ChangeByte- 00:07:10.486 [2024-07-21 11:30:39.849295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.849323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.849402] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.849420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.849480] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.849496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.849559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.849576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.486 #38 NEW cov: 11770 ft: 13840 corp: 22/660b lim: 35 exec/s: 38 rss: 69Mb L: 32/35 MS: 1 CopyPart- 00:07:10.486 [2024-07-21 11:30:39.879397] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.879425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.879491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.879507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.879568] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.879587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.486 [2024-07-21 11:30:39.879650] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.486 [2024-07-21 11:30:39.879666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.486 #39 NEW cov: 11770 ft: 13847 corp: 23/691b lim: 35 exec/s: 39 rss: 69Mb L: 31/35 MS: 1 CopyPart- 00:07:10.746 [2024-07-21 
11:30:39.919558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.919586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:39.919648] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.919664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:39.919740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.919756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:39.919818] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.919835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.746 #40 NEW cov: 11770 ft: 13900 corp: 24/722b lim: 35 exec/s: 40 rss: 69Mb L: 31/35 MS: 1 ChangeBit- 00:07:10.746 [2024-07-21 11:30:39.959650] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.959678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:39.959740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.959757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:39.959820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.959835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:39.959896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.959910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.746 #41 NEW cov: 11770 ft: 13917 corp: 25/751b lim: 35 exec/s: 41 rss: 69Mb L: 29/35 MS: 1 ChangeBinInt- 00:07:10.746 [2024-07-21 11:30:39.999778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.999806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:39.999868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 
cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.999880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:39.999947] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:39.999964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.000026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.000044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.746 #42 NEW cov: 11770 ft: 13959 corp: 26/782b lim: 35 exec/s: 42 rss: 69Mb L: 31/35 MS: 1 ShuffleBytes- 00:07:10.746 [2024-07-21 11:30:40.040094] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.040124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.040204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.040220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.040283] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.040301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.040364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.040380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:10.746 #43 NEW cov: 11770 ft: 13970 corp: 27/817b lim: 35 exec/s: 43 rss: 69Mb L: 35/35 MS: 1 ChangeBit- 00:07:10.746 [2024-07-21 11:30:40.080039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.080069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.080134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.080148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.080211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.080228] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.080293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.080308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.746 #44 NEW cov: 11770 ft: 14003 corp: 28/848b lim: 35 exec/s: 44 rss: 69Mb L: 31/35 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:10.746 [2024-07-21 11:30:40.120217] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.120245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.120307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.120326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.120387] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.120403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.120467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.120483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.746 #45 NEW cov: 11770 ft: 14047 corp: 29/882b lim: 35 exec/s: 45 rss: 70Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:07:10.746 [2024-07-21 11:30:40.160439] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000e9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.160472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.160535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000fa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.160550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.160611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.160626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.160686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.160703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:07:10.746 [2024-07-21 11:30:40.160765] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.746 [2024-07-21 11:30:40.160780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:11.006 #46 NEW cov: 11770 ft: 14059 corp: 30/917b lim: 35 exec/s: 46 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:11.006 [2024-07-21 11:30:40.200377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.006 [2024-07-21 11:30:40.200405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.006 [2024-07-21 11:30:40.200467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.006 [2024-07-21 11:30:40.200482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.006 [2024-07-21 11:30:40.200559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.006 [2024-07-21 11:30:40.200575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.006 [2024-07-21 11:30:40.200636] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.006 [2024-07-21 11:30:40.200653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.006 #47 NEW cov: 11770 ft: 14064 corp: 31/948b lim: 35 exec/s: 47 rss: 70Mb L: 31/35 MS: 1 ShuffleBytes- 00:07:11.006 [2024-07-21 11:30:40.240494] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.006 [2024-07-21 11:30:40.240521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.006 [2024-07-21 11:30:40.240584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.006 [2024-07-21 11:30:40.240600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.006 [2024-07-21 11:30:40.240660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.006 [2024-07-21 11:30:40.240676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.006 [2024-07-21 11:30:40.240736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.006 [2024-07-21 11:30:40.240751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.006 #48 NEW cov: 11770 ft: 14076 corp: 32/979b lim: 35 exec/s: 48 rss: 70Mb L: 31/35 MS: 1 
ChangeBinInt- 00:07:11.006 [2024-07-21 11:30:40.280612] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.280640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.280703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.280719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.280779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.280795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.280853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.280869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.007 #49 NEW cov: 11770 ft: 14129 corp: 33/1010b lim: 35 exec/s: 49 rss: 70Mb L: 31/35 MS: 1 ShuffleBytes- 00:07:11.007 [2024-07-21 11:30:40.320932] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.320960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.321020] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.321034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.321095] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.321110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.321169] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.321189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.321247] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.321264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:11.007 #50 NEW cov: 11770 ft: 14138 corp: 34/1045b lim: 35 exec/s: 50 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:11.007 [2024-07-21 11:30:40.360856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.360884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.360946] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.360960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.361019] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.361034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.361095] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.361108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.007 #51 NEW cov: 11770 ft: 14147 corp: 35/1076b lim: 35 exec/s: 51 rss: 70Mb L: 31/35 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:11.007 [2024-07-21 11:30:40.400959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.400987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.401049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.401062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.401121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.401136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.007 [2024-07-21 11:30:40.401196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.007 [2024-07-21 11:30:40.401210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.007 #52 NEW cov: 11770 ft: 14213 corp: 36/1109b lim: 35 exec/s: 52 rss: 70Mb L: 33/35 MS: 1 CopyPart- 00:07:11.266 [2024-07-21 11:30:40.441091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.266 [2024-07-21 11:30:40.441119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.266 [2024-07-21 11:30:40.441181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.266 
[2024-07-21 11:30:40.441194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.266 [2024-07-21 11:30:40.441261] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.266 [2024-07-21 11:30:40.441278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.266 [2024-07-21 11:30:40.441336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.266 [2024-07-21 11:30:40.441351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.266 #53 NEW cov: 11770 ft: 14230 corp: 37/1140b lim: 35 exec/s: 53 rss: 70Mb L: 31/35 MS: 1 CrossOver- 00:07:11.266 [2024-07-21 11:30:40.480837] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.266 [2024-07-21 11:30:40.480864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.266 [2024-07-21 11:30:40.480925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.266 [2024-07-21 11:30:40.480941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.266 #54 NEW cov: 11770 ft: 14424 corp: 38/1159b lim: 35 exec/s: 54 rss: 70Mb L: 19/35 MS: 1 CrossOver- 00:07:11.267 [2024-07-21 11:30:40.521260] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.521287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.521346] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.521362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.521420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.521436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.521499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.521514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.561425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.561454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.561515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.561531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.561590] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.561604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.561665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.561679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.267 #56 NEW cov: 11770 ft: 14436 corp: 39/1191b lim: 35 exec/s: 56 rss: 70Mb L: 32/35 MS: 2 InsertByte-ChangeBinInt- 00:07:11.267 [2024-07-21 11:30:40.601580] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.601608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.601668] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.601685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.601745] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.601761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.267 #57 NEW cov: 11770 ft: 14472 corp: 40/1222b lim: 35 exec/s: 57 rss: 70Mb L: 31/35 MS: 1 ChangeByte- 00:07:11.267 [2024-07-21 11:30:40.641501] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.641527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.641587] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.641603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.641664] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.641680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.267 #58 NEW cov: 11770 ft: 14488 corp: 41/1246b lim: 35 exec/s: 58 
rss: 70Mb L: 24/35 MS: 1 EraseBytes- 00:07:11.267 [2024-07-21 11:30:40.681787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.681815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.681877] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.681891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.681951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.681967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.267 [2024-07-21 11:30:40.682028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.267 [2024-07-21 11:30:40.682044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.527 #59 NEW cov: 11770 ft: 14564 corp: 42/1275b lim: 35 exec/s: 59 rss: 70Mb L: 29/35 MS: 1 EraseBytes- 00:07:11.527 [2024-07-21 11:30:40.721364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.527 [2024-07-21 11:30:40.721391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.527 #60 NEW cov: 11770 ft: 15280 corp: 43/1285b lim: 35 exec/s: 60 rss: 70Mb L: 10/35 MS: 1 CrossOver- 00:07:11.527 [2024-07-21 11:30:40.771674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.527 [2024-07-21 11:30:40.771702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.527 [2024-07-21 11:30:40.771764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.527 [2024-07-21 11:30:40.771778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.527 #61 NEW cov: 11770 ft: 15291 corp: 44/1303b lim: 35 exec/s: 30 rss: 70Mb L: 18/35 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:11.527 #61 DONE cov: 11770 ft: 15291 corp: 44/1303b lim: 35 exec/s: 30 rss: 70Mb 00:07:11.527 ###### Recommended dictionary. ###### 00:07:11.527 "\001\000\000\000\000\000\000\000" # Uses: 2 00:07:11.527 "\000\000\000\000" # Uses: 1 00:07:11.527 ###### End of recommended dictionary. 
###### 00:07:11.527 Done 61 runs in 2 second(s) 00:07:11.527 11:30:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:07:11.527 11:30:40 -- ../common.sh@72 -- # (( i++ )) 00:07:11.527 11:30:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:11.527 11:30:40 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:11.527 11:30:40 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:11.527 11:30:40 -- nvmf/run.sh@24 -- # local timen=1 00:07:11.527 11:30:40 -- nvmf/run.sh@25 -- # local core=0x1 00:07:11.527 11:30:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:11.527 11:30:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:11.527 11:30:40 -- nvmf/run.sh@29 -- # printf %02d 15 00:07:11.527 11:30:40 -- nvmf/run.sh@29 -- # port=4415 00:07:11.527 11:30:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:11.527 11:30:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:11.527 11:30:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:11.527 11:30:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:07:11.786 [2024-07-21 11:30:40.953846] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:11.786 [2024-07-21 11:30:40.953914] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2065878 ] 00:07:11.786 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.786 [2024-07-21 11:30:41.140025] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.786 [2024-07-21 11:30:41.159953] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:11.786 [2024-07-21 11:30:41.160095] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.046 [2024-07-21 11:30:41.211615] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:12.046 [2024-07-21 11:30:41.227957] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:12.046 INFO: Running with entropic power schedule (0xFF, 100). 00:07:12.046 INFO: Seed: 319091592 00:07:12.046 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:12.046 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:12.046 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:12.046 INFO: A corpus is not provided, starting from an empty corpus 00:07:12.046 #2 INITED exec/s: 0 rss: 60Mb 00:07:12.046 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:12.046 This may also happen if the target rejected all inputs we tried so far 00:07:12.046 [2024-07-21 11:30:41.283437] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.046 [2024-07-21 11:30:41.283472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.046 [2024-07-21 11:30:41.283529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.046 [2024-07-21 11:30:41.283544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.304 NEW_FUNC[1/671]: 0x4b3c80 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:12.304 NEW_FUNC[2/671]: 0x4d3ae0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:12.304 #18 NEW cov: 11505 ft: 11506 corp: 2/28b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:12.304 #24 NEW cov: 11618 ft: 12233 corp: 3/35b lim: 35 exec/s: 0 rss: 68Mb L: 7/27 MS: 1 CrossOver- 00:07:12.304 [2024-07-21 11:30:41.633949] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000004f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.304 [2024-07-21 11:30:41.633982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.304 #29 NEW cov: 11624 ft: 12592 corp: 4/44b lim: 35 exec/s: 0 rss: 68Mb L: 9/27 MS: 5 InsertByte-ChangeByte-InsertByte-ChangeBit-CrossOver- 00:07:12.304 [2024-07-21 11:30:41.674368] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.304 [2024-07-21 11:30:41.674394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.304 [2024-07-21 11:30:41.674454] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.304 [2024-07-21 11:30:41.674467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.304 #30 NEW cov: 11709 ft: 12944 corp: 5/66b lim: 35 exec/s: 0 rss: 68Mb L: 22/27 MS: 1 EraseBytes- 00:07:12.304 [2024-07-21 11:30:41.714128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.304 [2024-07-21 11:30:41.714154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.562 #34 NEW cov: 11709 ft: 13023 corp: 6/73b lim: 35 exec/s: 0 rss: 68Mb L: 7/27 MS: 4 EraseBytes-ChangeBit-ShuffleBytes-InsertByte- 00:07:12.562 [2024-07-21 11:30:41.754486] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000610 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.562 [2024-07-21 11:30:41.754512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.563 #35 NEW cov: 11709 ft: 13146 corp: 
7/88b lim: 35 exec/s: 0 rss: 68Mb L: 15/27 MS: 1 CMP- DE: "\006\020\306\003|\217/\000"- 00:07:12.563 #36 NEW cov: 11709 ft: 13186 corp: 8/95b lim: 35 exec/s: 0 rss: 68Mb L: 7/27 MS: 1 ChangeBit- 00:07:12.563 [2024-07-21 11:30:41.834535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.834561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.563 #37 NEW cov: 11709 ft: 13275 corp: 9/108b lim: 35 exec/s: 0 rss: 68Mb L: 13/27 MS: 1 CrossOver- 00:07:12.563 [2024-07-21 11:30:41.874618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST MEM BUFFER cid:4 cdw10:0000040d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.874647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.563 #38 NEW cov: 11709 ft: 13297 corp: 10/121b lim: 35 exec/s: 0 rss: 69Mb L: 13/27 MS: 1 ChangeBinInt- 00:07:12.563 [2024-07-21 11:30:41.915000] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000324 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.915025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.563 [2024-07-21 11:30:41.915083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000037a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.915096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.563 [2024-07-21 11:30:41.915154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000037a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.915168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.563 #41 NEW cov: 11709 ft: 13438 corp: 11/145b lim: 35 exec/s: 0 rss: 69Mb L: 24/27 MS: 3 ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:07:12.563 [2024-07-21 11:30:41.945091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST MEM BUFFER cid:4 cdw10:0000040d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.945116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.563 [2024-07-21 11:30:41.945171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.945185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.563 [2024-07-21 11:30:41.945242] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.945256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.563 #42 NEW cov: 11709 ft: 13485 corp: 12/170b lim: 35 exec/s: 0 rss: 69Mb L: 25/27 MS: 1 CopyPart- 00:07:12.563 [2024-07-21 11:30:41.985158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000610 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.563 [2024-07-21 11:30:41.985184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.821 #43 NEW cov: 11709 ft: 13519 corp: 13/185b lim: 35 exec/s: 0 rss: 69Mb L: 15/27 MS: 1 ChangeBit- 00:07:12.821 [2024-07-21 11:30:42.025343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000004f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.025368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.821 [2024-07-21 11:30:42.025428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.025446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.821 [2024-07-21 11:30:42.025505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.025519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.821 #44 NEW cov: 11709 ft: 13552 corp: 14/207b lim: 35 exec/s: 0 rss: 69Mb L: 22/27 MS: 1 CrossOver- 00:07:12.821 #45 NEW cov: 11709 ft: 13646 corp: 15/214b lim: 35 exec/s: 0 rss: 69Mb L: 7/27 MS: 1 ShuffleBytes- 00:07:12.821 [2024-07-21 11:30:42.095457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000610 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.095483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.821 #46 NEW cov: 11709 ft: 13711 corp: 16/229b lim: 35 exec/s: 0 rss: 69Mb L: 15/27 MS: 1 ChangeByte- 00:07:12.821 [2024-07-21 11:30:42.135701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.135727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.821 [2024-07-21 11:30:42.135786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.135800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.821 #47 NEW cov: 11709 ft: 13788 corp: 17/251b lim: 35 exec/s: 0 rss: 69Mb L: 22/27 MS: 1 ChangeBit- 00:07:12.821 [2024-07-21 11:30:42.175928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.175953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.821 [2024-07-21 11:30:42.176011] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.176024] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.821 [2024-07-21 11:30:42.176096] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:7 cdw10:00000610 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.821 [2024-07-21 11:30:42.176110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.821 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:12.822 #48 NEW cov: 11732 ft: 14149 corp: 18/281b lim: 35 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 PersAutoDict- DE: "\006\020\306\003|\217/\000"- 00:07:12.822 [2024-07-21 11:30:42.216065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000610 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.822 [2024-07-21 11:30:42.216090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.822 [2024-07-21 11:30:42.216149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.822 [2024-07-21 11:30:42.216162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.822 [2024-07-21 11:30:42.216221] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.822 [2024-07-21 11:30:42.216235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.822 #49 NEW cov: 11732 ft: 14190 corp: 19/311b lim: 35 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 CrossOver- 00:07:13.080 [2024-07-21 11:30:42.255718] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST MEM BUFFER cid:4 cdw10:0000040d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.080 [2024-07-21 11:30:42.255743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.080 #55 NEW cov: 11732 ft: 14209 corp: 20/324b lim: 35 exec/s: 55 rss: 69Mb L: 13/30 MS: 1 ChangeBit- 00:07:13.080 [2024-07-21 11:30:42.295858] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000004f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.080 [2024-07-21 11:30:42.295886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.081 #56 NEW cov: 11732 ft: 14221 corp: 21/334b lim: 35 exec/s: 56 rss: 69Mb L: 10/30 MS: 1 InsertByte- 00:07:13.081 [2024-07-21 11:30:42.336381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.336406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.336468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.336482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:13.081 [2024-07-21 11:30:42.336537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.336551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.081 #57 NEW cov: 11732 ft: 14244 corp: 22/366b lim: 35 exec/s: 57 rss: 69Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:13.081 [2024-07-21 11:30:42.376543] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.376568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.376621] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.376634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.376686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:7 cdw10:00000610 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.376699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.081 #58 NEW cov: 11732 ft: 14262 corp: 23/396b lim: 35 exec/s: 58 rss: 69Mb L: 30/32 MS: 1 ChangeByte- 00:07:13.081 [2024-07-21 11:30:42.416674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000610 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.416699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.416756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.416770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.416828] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000000c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.416842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.081 #59 NEW cov: 11732 ft: 14286 corp: 24/426b lim: 35 exec/s: 59 rss: 70Mb L: 30/32 MS: 1 ChangeBinInt- 00:07:13.081 [2024-07-21 11:30:42.456679] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.456704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.456759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.456773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.081 #60 NEW cov: 11732 
ft: 14296 corp: 25/453b lim: 35 exec/s: 60 rss: 70Mb L: 27/32 MS: 1 ChangeBit- 00:07:13.081 [2024-07-21 11:30:42.497035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000498 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.497061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.497117] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.497131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.497187] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.497200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.081 [2024-07-21 11:30:42.497254] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.081 [2024-07-21 11:30:42.497268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:13.339 #61 NEW cov: 11732 ft: 14474 corp: 26/488b lim: 35 exec/s: 61 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:13.339 [2024-07-21 11:30:42.536874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.536900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.340 [2024-07-21 11:30:42.536954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES LBA RANGE TYPE cid:5 cdw10:00000303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.536967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.340 [2024-07-21 11:30:42.537026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000043d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.537039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.340 NEW_FUNC[1/1]: 0x4ced50 in feat_lba_range_type /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:289 00:07:13.340 #62 NEW cov: 11743 ft: 14514 corp: 27/509b lim: 35 exec/s: 62 rss: 70Mb L: 21/35 MS: 1 PersAutoDict- DE: "\006\020\306\003|\217/\000"- 00:07:13.340 [2024-07-21 11:30:42.577099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.577125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.340 [2024-07-21 11:30:42.577183] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000423 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.577197] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.340 [2024-07-21 11:30:42.577253] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.577267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.340 #63 NEW cov: 11743 ft: 14532 corp: 28/537b lim: 35 exec/s: 63 rss: 70Mb L: 28/35 MS: 1 InsertByte- 00:07:13.340 [2024-07-21 11:30:42.616813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.616842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.340 #64 NEW cov: 11743 ft: 14584 corp: 29/547b lim: 35 exec/s: 64 rss: 70Mb L: 10/35 MS: 1 ChangeByte- 00:07:13.340 #65 NEW cov: 11743 ft: 14598 corp: 30/554b lim: 35 exec/s: 65 rss: 70Mb L: 7/35 MS: 1 ChangeBinInt- 00:07:13.340 #66 NEW cov: 11743 ft: 14612 corp: 31/561b lim: 35 exec/s: 66 rss: 70Mb L: 7/35 MS: 1 ChangeBinInt- 00:07:13.340 [2024-07-21 11:30:42.727485] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000004f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.727511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.340 [2024-07-21 11:30:42.727587] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.727601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.340 [2024-07-21 11:30:42.727659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.727673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.340 [2024-07-21 11:30:42.727732] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.340 [2024-07-21 11:30:42.727745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.340 #67 NEW cov: 11743 ft: 14648 corp: 32/593b lim: 35 exec/s: 67 rss: 70Mb L: 32/35 MS: 1 CrossOver- 00:07:13.598 [2024-07-21 11:30:42.767245] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.598 [2024-07-21 11:30:42.767270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.598 #68 NEW cov: 11743 ft: 14662 corp: 33/603b lim: 35 exec/s: 68 rss: 70Mb L: 10/35 MS: 1 CrossOver- 00:07:13.599 [2024-07-21 11:30:42.807357] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000090 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.807383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.599 #69 NEW cov: 11743 
ft: 14697 corp: 34/610b lim: 35 exec/s: 69 rss: 70Mb L: 7/35 MS: 1 ShuffleBytes- 00:07:13.599 [2024-07-21 11:30:42.837834] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000004f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.837861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.837940] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.837954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.838012] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.838025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.838080] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.838094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.599 #70 NEW cov: 11743 ft: 14724 corp: 35/640b lim: 35 exec/s: 70 rss: 70Mb L: 30/35 MS: 1 EraseBytes- 00:07:13.599 #71 NEW cov: 11743 ft: 14791 corp: 36/647b lim: 35 exec/s: 71 rss: 70Mb L: 7/35 MS: 1 ChangeBinInt- 00:07:13.599 [2024-07-21 11:30:42.917917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST MEM BUFFER cid:4 cdw10:0000040d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.917943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.918002] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.918016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.918073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.918086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.599 #72 NEW cov: 11743 ft: 14805 corp: 37/672b lim: 35 exec/s: 72 rss: 70Mb L: 25/35 MS: 1 CopyPart- 00:07:13.599 [2024-07-21 11:30:42.958133] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000004f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.958159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.958219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.958232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:13.599 [2024-07-21 11:30:42.958290] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.958304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.958362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.958375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.599 #73 NEW cov: 11743 ft: 14813 corp: 38/704b lim: 35 exec/s: 73 rss: 70Mb L: 32/35 MS: 1 CopyPart- 00:07:13.599 [2024-07-21 11:30:42.998137] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000324 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.998163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.998221] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000007a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.998234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.599 [2024-07-21 11:30:42.998306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000037a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.599 [2024-07-21 11:30:42.998320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.857 #74 NEW cov: 11743 ft: 14849 corp: 39/730b lim: 35 exec/s: 74 rss: 70Mb L: 26/35 MS: 1 CMP- DE: "\001\003"- 00:07:13.857 [2024-07-21 11:30:43.038340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.857 [2024-07-21 11:30:43.038366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.857 [2024-07-21 11:30:43.038425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.857 [2024-07-21 11:30:43.038447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.857 #75 NEW cov: 11743 ft: 14854 corp: 40/757b lim: 35 exec/s: 75 rss: 70Mb L: 27/35 MS: 1 ShuffleBytes- 00:07:13.857 [2024-07-21 11:30:43.078586] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.857 [2024-07-21 11:30:43.078611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.857 [2024-07-21 11:30:43.078670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.078684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.858 [2024-07-21 11:30:43.078738] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:7 cdw10:00000410 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.078751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.858 #76 NEW cov: 11743 ft: 14861 corp: 41/791b lim: 35 exec/s: 76 rss: 70Mb L: 34/35 MS: 1 CopyPart- 00:07:13.858 #77 NEW cov: 11743 ft: 14868 corp: 42/798b lim: 35 exec/s: 77 rss: 70Mb L: 7/35 MS: 1 ChangeBinInt- 00:07:13.858 [2024-07-21 11:30:43.148724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000004f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.148750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.858 [2024-07-21 11:30:43.148810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.148823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.858 [2024-07-21 11:30:43.148880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.148894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.858 [2024-07-21 11:30:43.148953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000490 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.148966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.858 #78 NEW cov: 11743 ft: 14874 corp: 43/830b lim: 35 exec/s: 78 rss: 70Mb L: 32/35 MS: 1 ChangeBit- 00:07:13.858 [2024-07-21 11:30:43.188502] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.188527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.858 [2024-07-21 11:30:43.228612] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.228636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.858 #80 NEW cov: 11743 ft: 14882 corp: 44/840b lim: 35 exec/s: 80 rss: 70Mb L: 10/35 MS: 2 ChangeByte-PersAutoDict- DE: "\006\020\306\003|\217/\000"- 00:07:13.858 [2024-07-21 11:30:43.268853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.268879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.858 [2024-07-21 11:30:43.268936] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000610 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.858 [2024-07-21 11:30:43.268954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.116 #86 NEW cov: 11743 ft: 14890 corp: 45/858b lim: 35 exec/s: 43 rss: 70Mb L: 18/35 MS: 1 PersAutoDict- DE: "\006\020\306\003|\217/\000"- 00:07:14.116 #86 DONE cov: 11743 ft: 14890 corp: 45/858b lim: 35 exec/s: 43 rss: 70Mb 00:07:14.116 ###### Recommended dictionary. ###### 00:07:14.116 "\006\020\306\003|\217/\000" # Uses: 4 00:07:14.116 "\001\003" # Uses: 0 00:07:14.116 ###### End of recommended dictionary. ###### 00:07:14.116 Done 86 runs in 2 second(s) 00:07:14.116 11:30:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:07:14.116 11:30:43 -- ../common.sh@72 -- # (( i++ )) 00:07:14.116 11:30:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:14.116 11:30:43 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:07:14.116 11:30:43 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:07:14.116 11:30:43 -- nvmf/run.sh@24 -- # local timen=1 00:07:14.116 11:30:43 -- nvmf/run.sh@25 -- # local core=0x1 00:07:14.116 11:30:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:14.116 11:30:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:07:14.116 11:30:43 -- nvmf/run.sh@29 -- # printf %02d 16 00:07:14.116 11:30:43 -- nvmf/run.sh@29 -- # port=4416 00:07:14.116 11:30:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:14.116 11:30:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:07:14.116 11:30:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:14.116 11:30:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:07:14.116 [2024-07-21 11:30:43.443023] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:14.116 [2024-07-21 11:30:43.443088] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2066429 ] 00:07:14.116 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.374 [2024-07-21 11:30:43.617747] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.374 [2024-07-21 11:30:43.638110] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:14.374 [2024-07-21 11:30:43.638255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.374 [2024-07-21 11:30:43.690067] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:14.374 [2024-07-21 11:30:43.706423] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:07:14.374 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:14.374 INFO: Seed: 2796092835 00:07:14.374 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:14.374 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:14.374 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:14.374 INFO: A corpus is not provided, starting from an empty corpus 00:07:14.374 #2 INITED exec/s: 0 rss: 60Mb 00:07:14.374 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:14.374 This may also happen if the target rejected all inputs we tried so far 00:07:14.374 [2024-07-21 11:30:43.772340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.374 [2024-07-21 11:30:43.772376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:14.374 [2024-07-21 11:30:43.772512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.374 [2024-07-21 11:30:43.772537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:14.889 NEW_FUNC[1/671]: 0x4b5130 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:14.889 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:14.889 #6 NEW cov: 11594 ft: 11594 corp: 2/49b lim: 105 exec/s: 0 rss: 68Mb L: 48/48 MS: 4 CopyPart-CMP-CMP-InsertRepeatedBytes- DE: "\377\377\377\377\377\377\377\024"-"\001\000"- 00:07:14.889 [2024-07-21 11:30:44.103194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.889 [2024-07-21 11:30:44.103247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:14.889 [2024-07-21 11:30:44.103395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.889 [2024-07-21 11:30:44.103423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:14.889 #7 NEW cov: 11707 ft: 12221 corp: 3/97b lim: 105 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 CrossOver- 00:07:14.889 [2024-07-21 11:30:44.153135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.889 [2024-07-21 11:30:44.153167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:14.889 [2024-07-21 11:30:44.153299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.889 [2024-07-21 11:30:44.153318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:14.889 #8 NEW cov: 11713 ft: 12392 corp: 4/145b lim: 
105 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 ChangeBinInt- 00:07:14.889 [2024-07-21 11:30:44.193162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.889 [2024-07-21 11:30:44.193197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:14.889 [2024-07-21 11:30:44.193315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.889 [2024-07-21 11:30:44.193340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:14.889 #9 NEW cov: 11798 ft: 12640 corp: 5/193b lim: 105 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 ChangeByte- 00:07:14.889 [2024-07-21 11:30:44.233474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.889 [2024-07-21 11:30:44.233507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:14.890 [2024-07-21 11:30:44.233614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268051505748851515 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.890 [2024-07-21 11:30:44.233632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:14.890 #10 NEW cov: 11798 ft: 12729 corp: 6/241b lim: 105 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 ChangeByte- 00:07:14.890 [2024-07-21 11:30:44.273576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.890 [2024-07-21 11:30:44.273606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:14.890 [2024-07-21 11:30:44.273730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268051505748851515 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.890 [2024-07-21 11:30:44.273751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:14.890 #11 NEW cov: 11798 ft: 12755 corp: 7/291b lim: 105 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:14.890 [2024-07-21 11:30:44.313568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.890 [2024-07-21 11:30:44.313595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.148 #12 NEW cov: 11798 ft: 13232 corp: 8/329b lim: 105 exec/s: 0 rss: 69Mb L: 38/50 MS: 1 EraseBytes- 00:07:15.148 [2024-07-21 11:30:44.353554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4268017163190352699 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.353585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.148 [2024-07-21 11:30:44.353718] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.353738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.148 #13 NEW cov: 11798 ft: 13349 corp: 9/383b lim: 105 exec/s: 0 rss: 69Mb L: 54/54 MS: 1 CopyPart- 00:07:15.148 [2024-07-21 11:30:44.404021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4268017163190352699 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.404056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.148 [2024-07-21 11:30:44.404193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.404217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.148 #14 NEW cov: 11798 ft: 13481 corp: 10/437b lim: 105 exec/s: 0 rss: 69Mb L: 54/54 MS: 1 ChangeByte- 00:07:15.148 [2024-07-21 11:30:44.444112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.444140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.148 [2024-07-21 11:30:44.444267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.444291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.148 #15 NEW cov: 11798 ft: 13527 corp: 11/486b lim: 105 exec/s: 0 rss: 69Mb L: 49/54 MS: 1 CrossOver- 00:07:15.148 [2024-07-21 11:30:44.483902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4268017163190352699 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.483936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.148 [2024-07-21 11:30:44.484073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.484096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.148 #16 NEW cov: 11798 ft: 13551 corp: 12/540b lim: 105 exec/s: 0 rss: 69Mb L: 54/54 MS: 1 ChangeByte- 00:07:15.148 [2024-07-21 11:30:44.524312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.524343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.148 [2024-07-21 11:30:44.524467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.524488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.148 #17 NEW cov: 11798 ft: 13588 corp: 13/589b lim: 105 exec/s: 0 rss: 69Mb L: 49/54 MS: 1 InsertByte- 00:07:15.148 [2024-07-21 11:30:44.564121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.564152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.148 [2024-07-21 11:30:44.564278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.148 [2024-07-21 11:30:44.564301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.406 #18 NEW cov: 11798 ft: 13723 corp: 14/637b lim: 105 exec/s: 0 rss: 69Mb L: 48/54 MS: 1 ShuffleBytes- 00:07:15.406 [2024-07-21 11:30:44.604501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.406 [2024-07-21 11:30:44.604531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.406 #19 NEW cov: 11798 ft: 13752 corp: 15/675b lim: 105 exec/s: 0 rss: 69Mb L: 38/54 MS: 1 ShuffleBytes- 00:07:15.406 [2024-07-21 11:30:44.644502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.406 [2024-07-21 11:30:44.644529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.406 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:15.406 #20 NEW cov: 11821 ft: 13791 corp: 16/713b lim: 105 exec/s: 0 rss: 69Mb L: 38/54 MS: 1 CopyPart- 00:07:15.406 [2024-07-21 11:30:44.694425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.406 [2024-07-21 11:30:44.694460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.406 #21 NEW cov: 11821 ft: 13879 corp: 17/748b lim: 105 exec/s: 0 rss: 69Mb L: 35/54 MS: 1 EraseBytes- 00:07:15.406 [2024-07-21 11:30:44.755126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.406 [2024-07-21 11:30:44.755157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.406 [2024-07-21 11:30:44.755252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.406 [2024-07-21 11:30:44.755274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.406 #22 NEW cov: 
11821 ft: 13973 corp: 18/797b lim: 105 exec/s: 22 rss: 70Mb L: 49/54 MS: 1 CopyPart- 00:07:15.406 [2024-07-21 11:30:44.794753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069615910911 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.406 [2024-07-21 11:30:44.794783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.406 #23 NEW cov: 11821 ft: 14050 corp: 19/835b lim: 105 exec/s: 23 rss: 70Mb L: 38/54 MS: 1 ChangeBit- 00:07:15.665 [2024-07-21 11:30:44.845058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15132 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:44.845090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.665 #24 NEW cov: 11821 ft: 14079 corp: 20/873b lim: 105 exec/s: 24 rss: 70Mb L: 38/54 MS: 1 ChangeBit- 00:07:15.665 [2024-07-21 11:30:44.884782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:44.884809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.665 #25 NEW cov: 11821 ft: 14085 corp: 21/908b lim: 105 exec/s: 25 rss: 70Mb L: 35/54 MS: 1 EraseBytes- 00:07:15.665 [2024-07-21 11:30:44.935160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4268017163190352699 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:44.935195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.665 [2024-07-21 11:30:44.935316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:44.935337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.665 #26 NEW cov: 11821 ft: 14128 corp: 22/962b lim: 105 exec/s: 26 rss: 70Mb L: 54/54 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:15.665 [2024-07-21 11:30:44.975991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2025524839466146844 len:7197 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:44.976023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.665 [2024-07-21 11:30:44.976093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:44.976113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.665 [2024-07-21 11:30:44.976243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:44.976263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:07:15.665 [2024-07-21 11:30:44.976384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:44.976408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:15.665 #28 NEW cov: 11821 ft: 14653 corp: 23/1047b lim: 105 exec/s: 28 rss: 70Mb L: 85/85 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:15.665 [2024-07-21 11:30:45.015521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:45.015553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.665 [2024-07-21 11:30:45.015677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:45.015703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.665 [2024-07-21 11:30:45.015820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3475436663029316411 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:45.015837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:15.665 #29 NEW cov: 11821 ft: 14970 corp: 24/1128b lim: 105 exec/s: 29 rss: 70Mb L: 81/85 MS: 1 InsertRepeatedBytes- 00:07:15.665 [2024-07-21 11:30:45.055776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:45.055811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.665 [2024-07-21 11:30:45.055924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.665 [2024-07-21 11:30:45.055947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.665 #30 NEW cov: 11821 ft: 14988 corp: 25/1177b lim: 105 exec/s: 30 rss: 70Mb L: 49/85 MS: 1 InsertByte- 00:07:15.924 [2024-07-21 11:30:45.095975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4268017163190352699 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.096008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.924 [2024-07-21 11:30:45.096114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.096134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.924 #31 NEW cov: 11821 ft: 15040 corp: 26/1231b lim: 105 exec/s: 31 rss: 70Mb L: 54/85 MS: 1 ShuffleBytes- 00:07:15.924 [2024-07-21 11:30:45.136457] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.136489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.924 [2024-07-21 11:30:45.136556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.136579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.924 [2024-07-21 11:30:45.136700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.136719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:15.924 [2024-07-21 11:30:45.136848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4268070197446523707 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.136869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:15.924 #32 NEW cov: 11821 ft: 15071 corp: 27/1322b lim: 105 exec/s: 32 rss: 70Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:07:15.924 [2024-07-21 11:30:45.175892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18391358628071211007 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.175919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.924 #33 NEW cov: 11821 ft: 15078 corp: 28/1363b lim: 105 exec/s: 33 rss: 70Mb L: 41/91 MS: 1 EraseBytes- 00:07:15.924 [2024-07-21 11:30:45.216074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.216105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.924 [2024-07-21 11:30:45.216230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:2816 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.216250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.924 #34 NEW cov: 11821 ft: 15100 corp: 29/1407b lim: 105 exec/s: 34 rss: 70Mb L: 44/91 MS: 1 CrossOver- 00:07:15.924 [2024-07-21 11:30:45.256006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.256039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.924 [2024-07-21 11:30:45.256158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.256180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:15.924 #35 NEW cov: 11821 ft: 15114 corp: 30/1456b lim: 105 exec/s: 35 rss: 70Mb L: 49/91 MS: 1 ShuffleBytes- 00:07:15.924 [2024-07-21 11:30:45.296298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069615910666 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.296324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:15.924 #36 NEW cov: 11821 ft: 15119 corp: 31/1495b lim: 105 exec/s: 36 rss: 70Mb L: 39/91 MS: 1 CrossOver- 00:07:15.924 [2024-07-21 11:30:45.346544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15132 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.924 [2024-07-21 11:30:45.346577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.183 #37 NEW cov: 11821 ft: 15146 corp: 32/1533b lim: 105 exec/s: 37 rss: 70Mb L: 38/91 MS: 1 CMP- DE: "\377.\217}\331\226\0246"- 00:07:16.183 [2024-07-21 11:30:45.386857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.386890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.183 [2024-07-21 11:30:45.387027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:2816 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.387052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.183 #38 NEW cov: 11821 ft: 15162 corp: 33/1577b lim: 105 exec/s: 38 rss: 70Mb L: 44/91 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\024"- 00:07:16.183 [2024-07-21 11:30:45.427078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.427124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.183 [2024-07-21 11:30:45.427255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16128 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.427276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.183 [2024-07-21 11:30:45.427396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4264973972702706491 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.427415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:16.183 #39 NEW cov: 11821 ft: 15218 corp: 34/1659b lim: 105 exec/s: 39 rss: 70Mb L: 82/91 MS: 1 InsertByte- 00:07:16.183 [2024-07-21 11:30:45.476892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.476918] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.183 #40 NEW cov: 11821 ft: 15246 corp: 35/1697b lim: 105 exec/s: 40 rss: 70Mb L: 38/91 MS: 1 CrossOver- 00:07:16.183 [2024-07-21 11:30:45.517135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.517166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.183 [2024-07-21 11:30:45.517279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4308196387977101627 len:51658 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.517298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.183 #41 NEW cov: 11821 ft: 15256 corp: 36/1743b lim: 105 exec/s: 41 rss: 70Mb L: 46/91 MS: 1 InsertRepeatedBytes- 00:07:16.183 [2024-07-21 11:30:45.556824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4268071038438030139 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.556850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.183 #42 NEW cov: 11821 ft: 15278 corp: 37/1766b lim: 105 exec/s: 42 rss: 70Mb L: 23/91 MS: 1 CrossOver- 00:07:16.183 [2024-07-21 11:30:45.597634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14323354218788800198 len:50887 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.597669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.183 [2024-07-21 11:30:45.597737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14323354221939181254 len:50887 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.597754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.183 [2024-07-21 11:30:45.597878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:14323354221939181254 len:50887 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.597900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:16.183 [2024-07-21 11:30:45.598018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4268070153490217787 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.183 [2024-07-21 11:30:45.598039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:16.442 #43 NEW cov: 11821 ft: 15288 corp: 38/1867b lim: 105 exec/s: 43 rss: 70Mb L: 101/101 MS: 1 InsertRepeatedBytes- 00:07:16.442 [2024-07-21 11:30:45.637527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4268017163190352699 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.442 [2024-07-21 11:30:45.637560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.442 [2024-07-21 11:30:45.637677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446851387 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.442 [2024-07-21 11:30:45.637703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.442 #44 NEW cov: 11821 ft: 15329 corp: 39/1921b lim: 105 exec/s: 44 rss: 70Mb L: 54/101 MS: 1 ChangeBinInt- 00:07:16.442 [2024-07-21 11:30:45.677699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743227785543679 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.442 [2024-07-21 11:30:45.677732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.442 [2024-07-21 11:30:45.677848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.442 [2024-07-21 11:30:45.677867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.442 #45 NEW cov: 11821 ft: 15339 corp: 40/1969b lim: 105 exec/s: 45 rss: 70Mb L: 48/101 MS: 1 ChangeBit- 00:07:16.442 [2024-07-21 11:30:45.717853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.442 [2024-07-21 11:30:45.717885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.442 [2024-07-21 11:30:45.717999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.442 [2024-07-21 11:30:45.718019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.442 #46 NEW cov: 11821 ft: 15354 corp: 41/2031b lim: 105 exec/s: 46 rss: 70Mb L: 62/101 MS: 1 CrossOver- 00:07:16.442 [2024-07-21 11:30:45.757888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446527723506499509 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.442 [2024-07-21 11:30:45.757919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.442 [2024-07-21 11:30:45.758025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4268070150201883451 len:15164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.442 [2024-07-21 11:30:45.758043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.442 #47 NEW cov: 11821 ft: 15357 corp: 42/2073b lim: 105 exec/s: 23 rss: 71Mb L: 42/101 MS: 1 InsertByte- 00:07:16.442 #47 DONE cov: 11821 ft: 15357 corp: 42/2073b lim: 105 exec/s: 23 rss: 71Mb 00:07:16.442 ###### Recommended dictionary. ###### 00:07:16.442 "\377\377\377\377\377\377\377\024" # Uses: 1 00:07:16.442 "\001\000" # Uses: 1 00:07:16.442 "\001\000\000\000\000\000\000\000" # Uses: 0 00:07:16.442 "\377.\217}\331\226\0246" # Uses: 0 00:07:16.442 ###### End of recommended dictionary. 
###### 00:07:16.442 Done 47 runs in 2 second(s) 00:07:16.701 11:30:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:07:16.701 11:30:45 -- ../common.sh@72 -- # (( i++ )) 00:07:16.701 11:30:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:16.701 11:30:45 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:07:16.701 11:30:45 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:07:16.701 11:30:45 -- nvmf/run.sh@24 -- # local timen=1 00:07:16.701 11:30:45 -- nvmf/run.sh@25 -- # local core=0x1 00:07:16.701 11:30:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:16.701 11:30:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:07:16.701 11:30:45 -- nvmf/run.sh@29 -- # printf %02d 17 00:07:16.701 11:30:45 -- nvmf/run.sh@29 -- # port=4417 00:07:16.701 11:30:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:16.701 11:30:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:07:16.701 11:30:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:16.701 11:30:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:07:16.701 [2024-07-21 11:30:45.944735] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:16.701 [2024-07-21 11:30:45.944828] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2066757 ] 00:07:16.701 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.701 [2024-07-21 11:30:46.123804] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.959 [2024-07-21 11:30:46.144228] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:16.959 [2024-07-21 11:30:46.144351] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.959 [2024-07-21 11:30:46.195881] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:16.959 [2024-07-21 11:30:46.212225] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:07:16.959 INFO: Running with entropic power schedule (0xFF, 100). 00:07:16.959 INFO: Seed: 1008131267 00:07:16.959 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:16.959 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:16.959 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:16.959 INFO: A corpus is not provided, starting from an empty corpus 00:07:16.959 #2 INITED exec/s: 0 rss: 60Mb 00:07:16.959 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
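The xtrace lines above spell out the per-round recipe this job repeats for every fuzzer: the round index is zero-padded and appended to 44 to pick the NVMe/TCP listener port (17 -> 4417), a per-round JSON config is produced by rewriting the template's trsvcid, a fresh corpus directory llvm_nvmf_<i> is created, and llvm_nvme_fuzz is launched against that trid, config, corpus, and fuzzer type. A minimal sketch of the loop as reconstructed from the trace follows; the variable names and the redirection into the per-round config are assumptions (xtrace does not show redirections), and the long workspace paths visible above are abbreviated with "...":

  for i in 17 18; do                          # round index, as in start_llvm_fuzz $i 1 0x1
    port="44$(printf %02d "$i")"              # 17 -> 4417, 18 -> 4418, matching the trace
    conf="/tmp/fuzz_json_${i}.conf"           # nvmf_cfg in the trace
    corpus=".../spdk/../corpus/llvm_nvmf_${i}"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
    mkdir -p "$corpus"
    # swap the listener port into the JSON template; output redirection is assumed
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" .../test/fuzz/llvm/nvmf/fuzz_json.conf > "$conf"
    .../test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P .../output/llvm/ \
        -F "$trid" -c "$conf" -t 1 -D "$corpus" -Z "$i" -r "/var/tmp/spdk${i}.sock"
    rm -rf "$conf"                            # each round removes its config afterwards
  done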
00:07:16.959 This may also happen if the target rejected all inputs we tried so far 00:07:16.959 [2024-07-21 11:30:46.278455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.959 [2024-07-21 11:30:46.278490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.959 [2024-07-21 11:30:46.278611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.959 [2024-07-21 11:30:46.278633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.959 [2024-07-21 11:30:46.278751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.959 [2024-07-21 11:30:46.278774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.218 NEW_FUNC[1/672]: 0x4b8420 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:17.218 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:17.218 #16 NEW cov: 11613 ft: 11614 corp: 2/76b lim: 120 exec/s: 0 rss: 68Mb L: 75/75 MS: 4 ChangeByte-ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:07:17.218 [2024-07-21 11:30:46.609398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.218 [2024-07-21 11:30:46.609462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.218 [2024-07-21 11:30:46.609595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069431361543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.218 [2024-07-21 11:30:46.609634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.218 [2024-07-21 11:30:46.609767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.218 [2024-07-21 11:30:46.609800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.218 #22 NEW cov: 11728 ft: 12228 corp: 3/155b lim: 120 exec/s: 0 rss: 68Mb L: 79/79 MS: 1 CMP- DE: "\001\000\000\007"- 00:07:17.477 [2024-07-21 11:30:46.659339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.477 [2024-07-21 11:30:46.659374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.477 [2024-07-21 11:30:46.659504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.477 [2024-07-21 11:30:46.659525] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.477 [2024-07-21 11:30:46.659654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.659673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.478 #23 NEW cov: 11734 ft: 12676 corp: 4/230b lim: 120 exec/s: 0 rss: 68Mb L: 75/79 MS: 1 ChangeBit- 00:07:17.478 [2024-07-21 11:30:46.699433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.699472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.699599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18302628885633695488 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.699632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.699753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.699776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.478 #24 NEW cov: 11819 ft: 13006 corp: 5/305b lim: 120 exec/s: 0 rss: 68Mb L: 75/79 MS: 1 ChangeBinInt- 00:07:17.478 [2024-07-21 11:30:46.739640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.739675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.739803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069431361543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.739822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.739944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.739965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.478 #25 NEW cov: 11819 ft: 13053 corp: 6/384b lim: 120 exec/s: 0 rss: 68Mb L: 79/79 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\002"- 00:07:17.478 [2024-07-21 11:30:46.779808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.779843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.779957] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.779980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.780098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.780117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.478 #26 NEW cov: 11819 ft: 13126 corp: 7/473b lim: 120 exec/s: 0 rss: 68Mb L: 89/89 MS: 1 CrossOver- 00:07:17.478 [2024-07-21 11:30:46.819867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.819898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.820022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069431361543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.820047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.820166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.820188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.478 #27 NEW cov: 11819 ft: 13182 corp: 8/552b lim: 120 exec/s: 0 rss: 68Mb L: 79/89 MS: 1 ChangeBinInt- 00:07:17.478 [2024-07-21 11:30:46.860022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.860053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.860183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18302628885633695488 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.860203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.860340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.860363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.478 #28 NEW cov: 11819 ft: 13278 corp: 9/627b lim: 120 exec/s: 0 rss: 69Mb L: 75/89 MS: 1 ChangeByte- 00:07:17.478 [2024-07-21 11:30:46.900124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.900155] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.900248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069431361543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.900271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.478 [2024-07-21 11:30:46.900395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.478 [2024-07-21 11:30:46.900416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.737 #29 NEW cov: 11819 ft: 13296 corp: 10/706b lim: 120 exec/s: 0 rss: 69Mb L: 79/89 MS: 1 ChangeBit- 00:07:17.737 [2024-07-21 11:30:46.940434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:46.940472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:46.940540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:46.940564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:46.940695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:46.940716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:46.940840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:46.940866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:17.737 #30 NEW cov: 11819 ft: 13667 corp: 11/822b lim: 120 exec/s: 0 rss: 69Mb L: 116/116 MS: 1 InsertRepeatedBytes- 00:07:17.737 [2024-07-21 11:30:46.980360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:46.980391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:46.980513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:46.980536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:46.980669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:46.980689] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.737 #31 NEW cov: 11819 ft: 13736 corp: 12/897b lim: 120 exec/s: 0 rss: 69Mb L: 75/116 MS: 1 ChangeBinInt- 00:07:17.737 [2024-07-21 11:30:47.020437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:47.020475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:47.020567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18374686479671623679 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:47.020587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:47.020716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:47.020739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.737 #32 NEW cov: 11819 ft: 13756 corp: 13/980b lim: 120 exec/s: 0 rss: 69Mb L: 83/116 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:07:17.737 [2024-07-21 11:30:47.060035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:47.060059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.737 #33 NEW cov: 11819 ft: 14628 corp: 14/1021b lim: 120 exec/s: 0 rss: 69Mb L: 41/116 MS: 1 CrossOver- 00:07:17.737 [2024-07-21 11:30:47.100729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:47.100760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:47.100838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:47.100861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.737 [2024-07-21 11:30:47.100986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1099511627530 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.737 [2024-07-21 11:30:47.101003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.738 #34 NEW cov: 11819 ft: 14672 corp: 15/1110b lim: 120 exec/s: 0 rss: 69Mb L: 89/116 MS: 1 ChangeBinInt- 00:07:17.738 [2024-07-21 11:30:47.150870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743751587004415 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.738 [2024-07-21 11:30:47.150906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.738 [2024-07-21 11:30:47.151025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18302628885633695488 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.738 [2024-07-21 11:30:47.151045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.738 [2024-07-21 11:30:47.151167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.738 [2024-07-21 11:30:47.151190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.997 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:17.997 #35 NEW cov: 11842 ft: 14771 corp: 16/1185b lim: 120 exec/s: 0 rss: 69Mb L: 75/116 MS: 1 ChangeByte- 00:07:17.997 [2024-07-21 11:30:47.191102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.191129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.191262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446463698261246207 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.191284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.191411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.191434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.997 #36 NEW cov: 11842 ft: 14810 corp: 17/1264b lim: 120 exec/s: 0 rss: 69Mb L: 79/116 MS: 1 CopyPart- 00:07:17.997 [2024-07-21 11:30:47.231209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.231240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.231369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18374686479671623679 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.231396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.231525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.231544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.997 #37 NEW cov: 11842 ft: 14816 corp: 18/1347b lim: 120 exec/s: 37 rss: 69Mb L: 83/116 MS: 1 CopyPart- 00:07:17.997 [2024-07-21 11:30:47.281570] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.281603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.281679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.281698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.281812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:281474976710401 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.281834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.281960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.281981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:17.997 #38 NEW cov: 11842 ft: 14838 corp: 19/1453b lim: 120 exec/s: 38 rss: 69Mb L: 106/116 MS: 1 CrossOver- 00:07:17.997 [2024-07-21 11:30:47.321407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.321440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.321547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446463698261246207 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.321571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.321699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.321719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.997 #39 NEW cov: 11842 ft: 14841 corp: 20/1532b lim: 120 exec/s: 39 rss: 69Mb L: 79/116 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\002"- 00:07:17.997 [2024-07-21 11:30:47.361273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.361303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.361424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.361447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:07:17.997 #40 NEW cov: 11842 ft: 15152 corp: 21/1598b lim: 120 exec/s: 40 rss: 69Mb L: 66/116 MS: 1 EraseBytes- 00:07:17.997 [2024-07-21 11:30:47.411426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.411462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.997 [2024-07-21 11:30:47.411554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744071897612287 len:65024 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.997 [2024-07-21 11:30:47.411573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.256 #41 NEW cov: 11842 ft: 15168 corp: 22/1664b lim: 120 exec/s: 41 rss: 69Mb L: 66/116 MS: 1 CopyPart- 00:07:18.256 [2024-07-21 11:30:47.461542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.461577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.256 [2024-07-21 11:30:47.461683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446743012852629503 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.461706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.256 #42 NEW cov: 11842 ft: 15193 corp: 23/1730b lim: 120 exec/s: 42 rss: 70Mb L: 66/116 MS: 1 ChangeBinInt- 00:07:18.256 [2024-07-21 11:30:47.501639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.501670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.256 [2024-07-21 11:30:47.501787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.501813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.256 #43 NEW cov: 11842 ft: 15213 corp: 24/1796b lim: 120 exec/s: 43 rss: 70Mb L: 66/116 MS: 1 CMP- DE: "\001\000\000\000\000\000\004\000"- 00:07:18.256 [2024-07-21 11:30:47.541818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.541853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.256 [2024-07-21 11:30:47.541982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446742974197923839 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.542004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.256 #44 NEW cov: 11842 ft: 15220 corp: 
25/1846b lim: 120 exec/s: 44 rss: 70Mb L: 50/116 MS: 1 InsertRepeatedBytes- 00:07:18.256 [2024-07-21 11:30:47.592233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743176061386751 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.592270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.256 [2024-07-21 11:30:47.592377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18302628885633695488 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.592399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.256 [2024-07-21 11:30:47.592509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.592532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.256 #45 NEW cov: 11842 ft: 15221 corp: 26/1921b lim: 120 exec/s: 45 rss: 70Mb L: 75/116 MS: 1 ChangeByte- 00:07:18.256 [2024-07-21 11:30:47.642350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.256 [2024-07-21 11:30:47.642382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.257 [2024-07-21 11:30:47.642446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18302628885633695488 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.257 [2024-07-21 11:30:47.642466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.257 [2024-07-21 11:30:47.642582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744069610885375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.257 [2024-07-21 11:30:47.642605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.257 #46 NEW cov: 11842 ft: 15266 corp: 27/1996b lim: 120 exec/s: 46 rss: 70Mb L: 75/116 MS: 1 CMP- DE: "\001/\217\177/\013\263P"- 00:07:18.516 [2024-07-21 11:30:47.682476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.682525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.682610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069431361543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.682633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.682756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 
11:30:47.682779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.516 #47 NEW cov: 11842 ft: 15273 corp: 28/2075b lim: 120 exec/s: 47 rss: 70Mb L: 79/116 MS: 1 ShuffleBytes- 00:07:18.516 [2024-07-21 11:30:47.722891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.722924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.722978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.722995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.723107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:281474976710401 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.723132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.723246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.723267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:18.516 #48 NEW cov: 11842 ft: 15344 corp: 29/2181b lim: 120 exec/s: 48 rss: 70Mb L: 106/116 MS: 1 PersAutoDict- DE: "\001\000\000\007"- 00:07:18.516 [2024-07-21 11:30:47.772757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.772793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.772905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446742978492891135 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.772930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.773055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.773077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.516 #49 NEW cov: 11842 ft: 15358 corp: 30/2260b lim: 120 exec/s: 49 rss: 70Mb L: 79/116 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:18.516 [2024-07-21 11:30:47.812484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.812515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 
11:30:47.812611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446743012852629503 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.812637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.516 #50 NEW cov: 11842 ft: 15383 corp: 31/2326b lim: 120 exec/s: 50 rss: 70Mb L: 66/116 MS: 1 ShuffleBytes- 00:07:18.516 [2024-07-21 11:30:47.862912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.862945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.863053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069431361543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.863073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.863195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.863217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.516 #51 NEW cov: 11842 ft: 15387 corp: 32/2405b lim: 120 exec/s: 51 rss: 70Mb L: 79/116 MS: 1 ChangeByte- 00:07:18.516 [2024-07-21 11:30:47.913202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.913234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.913339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.913364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.516 [2024-07-21 11:30:47.913486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:3617008641903833650 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.516 [2024-07-21 11:30:47.913510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.516 #52 NEW cov: 11842 ft: 15408 corp: 33/2499b lim: 120 exec/s: 52 rss: 70Mb L: 94/116 MS: 1 InsertRepeatedBytes- 00:07:18.776 [2024-07-21 11:30:47.953523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:47.953557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:47.953642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:47.953663] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:47.953781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:281474976710401 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:47.953797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:47.953921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:47.953940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:18.776 #53 NEW cov: 11842 ft: 15415 corp: 34/2605b lim: 120 exec/s: 53 rss: 70Mb L: 106/116 MS: 1 ShuffleBytes- 00:07:18.776 [2024-07-21 11:30:47.993372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:47.993404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:47.993479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069431361543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:47.993497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:47.993614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744070085672959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:47.993637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.776 #54 NEW cov: 11842 ft: 15439 corp: 35/2685b lim: 120 exec/s: 54 rss: 70Mb L: 80/116 MS: 1 InsertByte- 00:07:18.776 [2024-07-21 11:30:48.033705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.033738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.033793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.033818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.033935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.033957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.034080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:18.776 [2024-07-21 11:30:48.034100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:18.776 #56 NEW cov: 11842 ft: 15452 corp: 36/2781b lim: 120 exec/s: 56 rss: 70Mb L: 96/116 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:18.776 [2024-07-21 11:30:48.073557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.073586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.073656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069431361543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.073675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.073797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.073819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.776 #57 NEW cov: 11842 ft: 15455 corp: 37/2860b lim: 120 exec/s: 57 rss: 70Mb L: 79/116 MS: 1 ChangeBinInt- 00:07:18.776 [2024-07-21 11:30:48.113751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:71494644084441088 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.113783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.113845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18302628885633695488 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.113866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.113993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.114015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.776 #58 NEW cov: 11842 ft: 15472 corp: 38/2935b lim: 120 exec/s: 58 rss: 70Mb L: 75/116 MS: 1 ChangeBinInt- 00:07:18.776 [2024-07-21 11:30:48.153828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.153859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.153917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:72057624102762239 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.153935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.776 
[2024-07-21 11:30:48.154058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.154083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.776 #59 NEW cov: 11842 ft: 15480 corp: 39/3018b lim: 120 exec/s: 59 rss: 70Mb L: 83/116 MS: 1 CMP- DE: "\000\000\000\366"- 00:07:18.776 [2024-07-21 11:30:48.193745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.193777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.776 [2024-07-21 11:30:48.193887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744071897612287 len:65024 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.776 [2024-07-21 11:30:48.193906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.036 #60 NEW cov: 11842 ft: 15481 corp: 40/3084b lim: 120 exec/s: 60 rss: 70Mb L: 66/116 MS: 1 CMP- DE: "0\3149\201\177\217/\000"- 00:07:19.036 [2024-07-21 11:30:48.233838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.036 [2024-07-21 11:30:48.233872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.036 [2024-07-21 11:30:48.234017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446742974197923839 len:65448 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.036 [2024-07-21 11:30:48.234042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.036 #61 NEW cov: 11842 ft: 15491 corp: 41/3135b lim: 120 exec/s: 30 rss: 70Mb L: 51/116 MS: 1 InsertByte- 00:07:19.036 #61 DONE cov: 11842 ft: 15491 corp: 41/3135b lim: 120 exec/s: 30 rss: 70Mb 00:07:19.036 ###### Recommended dictionary. ###### 00:07:19.036 "\001\000\000\007" # Uses: 1 00:07:19.036 "\001\000\000\000\000\000\000\002" # Uses: 2 00:07:19.036 "\001\000\000\000\000\000\004\000" # Uses: 0 00:07:19.036 "\001/\217\177/\013\263P" # Uses: 0 00:07:19.036 "\000\000\000\000" # Uses: 0 00:07:19.036 "\000\000\000\366" # Uses: 0 00:07:19.036 "0\3149\201\177\217/\000" # Uses: 0 00:07:19.036 ###### End of recommended dictionary. 
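The "Recommended dictionary" block that closes each round is libFuzzer's exit report of persistent-auto-dictionary entries that actually produced new coverage during the run; the same entries appear inline as "MS: ... PersAutoDict- DE: ..." annotations on individual inputs, and "Uses" counts how often each fired. To carry them into a later run they could be written out in libFuzzer's AFL-style dictionary format, one quoted token per line with hex rather than the octal escapes printed here. A hypothetical file built from the run-17 entries above (the file name and the octal-to-hex transcription are this sketch's own, and whether the llvm_nvme_fuzz wrapper forwards extra libFuzzer flags such as -dict= is not shown in this log):

  # nvmf_17.dict -- hypothetical file transcribed from the report above
  "\x01\x00\x00\x07"
  "\x01\x00\x00\x00\x00\x00\x00\x02"
  "\x01\x00\x00\x00\x00\x00\x04\x00"
  "\x01\x2F\x8F\x7F\x2F\x0B\xB3\x50"
  "\x00\x00\x00\x00"
  "\x00\x00\x00\xF6"
  "\x30\xCC\x39\x81\x7F\x8F\x2F\x00"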
###### 00:07:19.036 Done 61 runs in 2 second(s) 00:07:19.036 11:30:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:07:19.036 11:30:48 -- ../common.sh@72 -- # (( i++ )) 00:07:19.036 11:30:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:19.036 11:30:48 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:19.036 11:30:48 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:19.036 11:30:48 -- nvmf/run.sh@24 -- # local timen=1 00:07:19.036 11:30:48 -- nvmf/run.sh@25 -- # local core=0x1 00:07:19.036 11:30:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:19.036 11:30:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:19.036 11:30:48 -- nvmf/run.sh@29 -- # printf %02d 18 00:07:19.036 11:30:48 -- nvmf/run.sh@29 -- # port=4418 00:07:19.036 11:30:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:19.036 11:30:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:19.036 11:30:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:19.036 11:30:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:07:19.036 [2024-07-21 11:30:48.418652] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:19.036 [2024-07-21 11:30:48.418736] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2067267 ] 00:07:19.036 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.296 [2024-07-21 11:30:48.595447] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.296 [2024-07-21 11:30:48.615232] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:19.296 [2024-07-21 11:30:48.615361] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.296 [2024-07-21 11:30:48.666912] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:19.296 [2024-07-21 11:30:48.683223] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:19.296 INFO: Running with entropic power schedule (0xFF, 100). 00:07:19.296 INFO: Seed: 3478128123 00:07:19.296 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:19.296 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:19.296 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:19.296 INFO: A corpus is not provided, starting from an empty corpus 00:07:19.296 #2 INITED exec/s: 0 rss: 60Mb 00:07:19.296 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:19.296 This may also happen if the target rejected all inputs we tried so far 00:07:19.554 [2024-07-21 11:30:48.728612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:19.554 [2024-07-21 11:30:48.728642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.554 [2024-07-21 11:30:48.728673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:19.554 [2024-07-21 11:30:48.728685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.554 [2024-07-21 11:30:48.728736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:19.554 [2024-07-21 11:30:48.728751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.554 [2024-07-21 11:30:48.728804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:19.554 [2024-07-21 11:30:48.728818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.813 NEW_FUNC[1/670]: 0x4bbc80 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:19.813 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:19.813 #21 NEW cov: 11559 ft: 11554 corp: 2/82b lim: 100 exec/s: 0 rss: 68Mb L: 81/81 MS: 4 ShuffleBytes-InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:19.813 [2024-07-21 11:30:49.039453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:19.813 [2024-07-21 11:30:49.039490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.039542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:19.813 [2024-07-21 11:30:49.039558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.039615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:19.813 [2024-07-21 11:30:49.039632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.039691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:19.813 [2024-07-21 11:30:49.039705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.813 #22 NEW cov: 11672 ft: 12165 corp: 3/163b lim: 100 exec/s: 0 rss: 68Mb L: 81/81 MS: 1 ChangeBinInt- 00:07:19.813 [2024-07-21 11:30:49.089535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:19.813 [2024-07-21 11:30:49.089564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.089601] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:19.813 [2024-07-21 11:30:49.089616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.089669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:19.813 [2024-07-21 11:30:49.089683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.089738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:19.813 [2024-07-21 11:30:49.089751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.813 #23 NEW cov: 11678 ft: 12396 corp: 4/248b lim: 100 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 CMP- DE: "\001\000\000\032"- 00:07:19.813 [2024-07-21 11:30:49.129670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:19.813 [2024-07-21 11:30:49.129698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.129733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:19.813 [2024-07-21 11:30:49.129747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.129802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:19.813 [2024-07-21 11:30:49.129817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.129871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:19.813 [2024-07-21 11:30:49.129884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.813 #24 NEW cov: 11763 ft: 12681 corp: 5/333b lim: 100 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 ChangeBinInt- 00:07:19.813 [2024-07-21 11:30:49.169840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:19.813 [2024-07-21 11:30:49.169868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.169918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:19.813 [2024-07-21 11:30:49.169932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.169988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:19.813 [2024-07-21 11:30:49.170004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.170059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:19.813 [2024-07-21 11:30:49.170073] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.813 #25 NEW cov: 11763 ft: 12823 corp: 6/427b lim: 100 exec/s: 0 rss: 68Mb L: 94/94 MS: 1 InsertRepeatedBytes- 00:07:19.813 [2024-07-21 11:30:49.209816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:19.813 [2024-07-21 11:30:49.209845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.209894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:19.813 [2024-07-21 11:30:49.209909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.813 [2024-07-21 11:30:49.209965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:19.813 [2024-07-21 11:30:49.209981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.813 #27 NEW cov: 11763 ft: 13155 corp: 7/498b lim: 100 exec/s: 0 rss: 68Mb L: 71/94 MS: 2 ChangeBinInt-CrossOver- 00:07:20.072 [2024-07-21 11:30:49.249990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.072 [2024-07-21 11:30:49.250017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.250052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.072 [2024-07-21 11:30:49.250067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.250121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.072 [2024-07-21 11:30:49.250136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.250190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.072 [2024-07-21 11:30:49.250204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.072 #28 NEW cov: 11763 ft: 13279 corp: 8/579b lim: 100 exec/s: 0 rss: 68Mb L: 81/94 MS: 1 ChangeByte- 00:07:20.072 [2024-07-21 11:30:49.290126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.072 [2024-07-21 11:30:49.290155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.290194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.072 [2024-07-21 11:30:49.290206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.290258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.072 [2024-07-21 11:30:49.290273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.290344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.072 [2024-07-21 11:30:49.290359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.072 #33 NEW cov: 11763 ft: 13297 corp: 9/671b lim: 100 exec/s: 0 rss: 68Mb L: 92/94 MS: 5 CopyPart-ChangeByte-InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:20.072 [2024-07-21 11:30:49.330094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.072 [2024-07-21 11:30:49.330121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.330156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.072 [2024-07-21 11:30:49.330171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.330225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.072 [2024-07-21 11:30:49.330243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.072 #34 NEW cov: 11763 ft: 13317 corp: 10/749b lim: 100 exec/s: 0 rss: 69Mb L: 78/94 MS: 1 EraseBytes- 00:07:20.072 [2024-07-21 11:30:49.370375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.072 [2024-07-21 11:30:49.370403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.370448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.072 [2024-07-21 11:30:49.370464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.370522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.072 [2024-07-21 11:30:49.370538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.072 [2024-07-21 11:30:49.370593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.072 [2024-07-21 11:30:49.370608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.072 #35 NEW cov: 11763 ft: 13349 corp: 11/834b lim: 100 exec/s: 0 rss: 69Mb L: 85/94 MS: 1 ChangeBit- 00:07:20.073 [2024-07-21 11:30:49.410530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.073 [2024-07-21 11:30:49.410558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.410599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.073 [2024-07-21 11:30:49.410613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.410670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.073 [2024-07-21 11:30:49.410686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.410742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.073 [2024-07-21 11:30:49.410756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.073 #36 NEW cov: 11763 ft: 13376 corp: 12/928b lim: 100 exec/s: 0 rss: 69Mb L: 94/94 MS: 1 ChangeBinInt- 00:07:20.073 [2024-07-21 11:30:49.450582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.073 [2024-07-21 11:30:49.450610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.450644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.073 [2024-07-21 11:30:49.450659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.450712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.073 [2024-07-21 11:30:49.450727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.450780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.073 [2024-07-21 11:30:49.450793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.073 #37 NEW cov: 11763 ft: 13443 corp: 13/1009b lim: 100 exec/s: 0 rss: 69Mb L: 81/94 MS: 1 ChangeBit- 00:07:20.073 [2024-07-21 11:30:49.490690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.073 [2024-07-21 11:30:49.490720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.490755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.073 [2024-07-21 11:30:49.490770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.490826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.073 [2024-07-21 11:30:49.490842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.073 [2024-07-21 11:30:49.490897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.073 [2024-07-21 11:30:49.490911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.331 #38 NEW cov: 11763 ft: 13506 corp: 14/1094b lim: 100 exec/s: 0 rss: 69Mb L: 85/94 MS: 1 ShuffleBytes- 00:07:20.331 [2024-07-21 11:30:49.530830] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.331 [2024-07-21 11:30:49.530858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.530893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.331 [2024-07-21 11:30:49.530909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.530967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.331 [2024-07-21 11:30:49.530982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.531038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.331 [2024-07-21 11:30:49.531053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.331 #39 NEW cov: 11763 ft: 13523 corp: 15/1179b lim: 100 exec/s: 0 rss: 69Mb L: 85/94 MS: 1 PersAutoDict- DE: "\001\000\000\032"- 00:07:20.331 [2024-07-21 11:30:49.570977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.331 [2024-07-21 11:30:49.571005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.571044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.331 [2024-07-21 11:30:49.571056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.571111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.331 [2024-07-21 11:30:49.571126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.571181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.331 [2024-07-21 11:30:49.571195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.331 #40 NEW cov: 11763 ft: 13560 corp: 16/1268b lim: 100 exec/s: 0 rss: 69Mb L: 89/94 MS: 1 PersAutoDict- DE: "\001\000\000\032"- 00:07:20.331 [2024-07-21 11:30:49.600997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.331 [2024-07-21 11:30:49.601024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.601068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.331 [2024-07-21 11:30:49.601086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.601141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.331 [2024-07-21 11:30:49.601156] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.601210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.331 [2024-07-21 11:30:49.601225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.331 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:20.331 #41 NEW cov: 11786 ft: 13570 corp: 17/1353b lim: 100 exec/s: 0 rss: 69Mb L: 85/94 MS: 1 ShuffleBytes- 00:07:20.331 [2024-07-21 11:30:49.641161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.331 [2024-07-21 11:30:49.641189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.641221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.331 [2024-07-21 11:30:49.641236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.641292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.331 [2024-07-21 11:30:49.641304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.641360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.331 [2024-07-21 11:30:49.641373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.331 #42 NEW cov: 11786 ft: 13638 corp: 18/1439b lim: 100 exec/s: 0 rss: 69Mb L: 86/94 MS: 1 InsertByte- 00:07:20.331 [2024-07-21 11:30:49.681244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.331 [2024-07-21 11:30:49.681272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.681310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.331 [2024-07-21 11:30:49.681324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.681381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.331 [2024-07-21 11:30:49.681397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.681454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.331 [2024-07-21 11:30:49.681470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.331 #43 NEW cov: 11786 ft: 13685 corp: 19/1520b lim: 100 exec/s: 0 rss: 69Mb L: 81/94 MS: 1 ChangeBinInt- 00:07:20.331 [2024-07-21 11:30:49.721340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:0 nsid:0 00:07:20.331 [2024-07-21 11:30:49.721367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.721416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.331 [2024-07-21 11:30:49.721430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.721494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.331 [2024-07-21 11:30:49.721509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.331 [2024-07-21 11:30:49.721562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.331 [2024-07-21 11:30:49.721577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.331 #44 NEW cov: 11786 ft: 13696 corp: 20/1605b lim: 100 exec/s: 44 rss: 69Mb L: 85/94 MS: 1 ChangeBit- 00:07:20.590 [2024-07-21 11:30:49.761491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.590 [2024-07-21 11:30:49.761518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.761570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.590 [2024-07-21 11:30:49.761583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.761650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.590 [2024-07-21 11:30:49.761665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.761718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.590 [2024-07-21 11:30:49.761732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.590 #45 NEW cov: 11786 ft: 13703 corp: 21/1690b lim: 100 exec/s: 45 rss: 69Mb L: 85/94 MS: 1 ChangeBit- 00:07:20.590 [2024-07-21 11:30:49.791583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.590 [2024-07-21 11:30:49.791610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.791650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.590 [2024-07-21 11:30:49.791664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.791718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.590 [2024-07-21 11:30:49.791733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 
m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.791788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.590 [2024-07-21 11:30:49.791800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.590 #46 NEW cov: 11786 ft: 13730 corp: 22/1782b lim: 100 exec/s: 46 rss: 69Mb L: 92/94 MS: 1 ChangeBinInt- 00:07:20.590 [2024-07-21 11:30:49.831688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.590 [2024-07-21 11:30:49.831715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.831758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.590 [2024-07-21 11:30:49.831772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.831826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.590 [2024-07-21 11:30:49.831841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.831894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.590 [2024-07-21 11:30:49.831912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.590 #47 NEW cov: 11786 ft: 13768 corp: 23/1868b lim: 100 exec/s: 47 rss: 69Mb L: 86/94 MS: 1 ChangeBit- 00:07:20.590 [2024-07-21 11:30:49.871824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.590 [2024-07-21 11:30:49.871850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.871894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.590 [2024-07-21 11:30:49.871907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.871962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.590 [2024-07-21 11:30:49.871977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.872030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.590 [2024-07-21 11:30:49.872043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.590 #48 NEW cov: 11786 ft: 13780 corp: 24/1953b lim: 100 exec/s: 48 rss: 70Mb L: 85/94 MS: 1 PersAutoDict- DE: "\001\000\000\032"- 00:07:20.590 [2024-07-21 11:30:49.911910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.590 [2024-07-21 11:30:49.911937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 
11:30:49.911982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.590 [2024-07-21 11:30:49.911996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.912050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.590 [2024-07-21 11:30:49.912065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.912117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.590 [2024-07-21 11:30:49.912132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.590 #49 NEW cov: 11786 ft: 13876 corp: 25/2038b lim: 100 exec/s: 49 rss: 70Mb L: 85/94 MS: 1 ChangeBit- 00:07:20.590 [2024-07-21 11:30:49.952054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.590 [2024-07-21 11:30:49.952081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.952128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.590 [2024-07-21 11:30:49.952142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.952193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.590 [2024-07-21 11:30:49.952208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.952260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.590 [2024-07-21 11:30:49.952274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.590 #50 NEW cov: 11786 ft: 13893 corp: 26/2122b lim: 100 exec/s: 50 rss: 70Mb L: 84/94 MS: 1 EraseBytes- 00:07:20.590 [2024-07-21 11:30:49.992141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.590 [2024-07-21 11:30:49.992168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.992203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.590 [2024-07-21 11:30:49.992217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.992274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.590 [2024-07-21 11:30:49.992289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.590 [2024-07-21 11:30:49.992343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.590 [2024-07-21 11:30:49.992357] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.848 #51 NEW cov: 11786 ft: 13909 corp: 27/2203b lim: 100 exec/s: 51 rss: 70Mb L: 81/94 MS: 1 ChangeBit- 00:07:20.848 [2024-07-21 11:30:50.032290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.848 [2024-07-21 11:30:50.032318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.032358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.848 [2024-07-21 11:30:50.032373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.032429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.848 [2024-07-21 11:30:50.032446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.032503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.848 [2024-07-21 11:30:50.032518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.848 #52 NEW cov: 11786 ft: 13930 corp: 28/2299b lim: 100 exec/s: 52 rss: 70Mb L: 96/96 MS: 1 PersAutoDict- DE: "\001\000\000\032"- 00:07:20.848 [2024-07-21 11:30:50.072326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.848 [2024-07-21 11:30:50.072356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.072395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.848 [2024-07-21 11:30:50.072411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.072470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.848 [2024-07-21 11:30:50.072485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.848 #53 NEW cov: 11786 ft: 13974 corp: 29/2373b lim: 100 exec/s: 53 rss: 70Mb L: 74/96 MS: 1 EraseBytes- 00:07:20.848 [2024-07-21 11:30:50.112524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.848 [2024-07-21 11:30:50.112554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.112591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.848 [2024-07-21 11:30:50.112606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.112664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.848 [2024-07-21 11:30:50.112678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.112734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.848 [2024-07-21 11:30:50.112749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.848 #54 NEW cov: 11786 ft: 13997 corp: 30/2458b lim: 100 exec/s: 54 rss: 70Mb L: 85/96 MS: 1 ChangeBit- 00:07:20.848 [2024-07-21 11:30:50.152686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.848 [2024-07-21 11:30:50.152713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.152758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.848 [2024-07-21 11:30:50.152772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.152828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.848 [2024-07-21 11:30:50.152843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.152900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.848 [2024-07-21 11:30:50.152914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.848 #55 NEW cov: 11786 ft: 14056 corp: 31/2539b lim: 100 exec/s: 55 rss: 70Mb L: 81/96 MS: 1 ChangeBit- 00:07:20.848 [2024-07-21 11:30:50.192608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.848 [2024-07-21 11:30:50.192635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.192670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.848 [2024-07-21 11:30:50.192685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.192740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.848 [2024-07-21 11:30:50.192755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.848 #56 NEW cov: 11786 ft: 14062 corp: 32/2617b lim: 100 exec/s: 56 rss: 70Mb L: 78/96 MS: 1 ShuffleBytes- 00:07:20.848 [2024-07-21 11:30:50.232927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:20.848 [2024-07-21 11:30:50.232955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.232998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:20.848 [2024-07-21 11:30:50.233012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:07:20.848 [2024-07-21 11:30:50.233066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:20.848 [2024-07-21 11:30:50.233081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.848 [2024-07-21 11:30:50.233138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:20.848 [2024-07-21 11:30:50.233152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.848 #57 NEW cov: 11786 ft: 14084 corp: 33/2709b lim: 100 exec/s: 57 rss: 70Mb L: 92/96 MS: 1 PersAutoDict- DE: "\001\000\000\032"- 00:07:21.106 [2024-07-21 11:30:50.272999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.106 [2024-07-21 11:30:50.273027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.273067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.107 [2024-07-21 11:30:50.273082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.273139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.107 [2024-07-21 11:30:50.273154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.273209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.107 [2024-07-21 11:30:50.273223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.313113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.107 [2024-07-21 11:30:50.313140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.313181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.107 [2024-07-21 11:30:50.313194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.313249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.107 [2024-07-21 11:30:50.313264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.313319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.107 [2024-07-21 11:30:50.313331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.107 #59 NEW cov: 11786 ft: 14090 corp: 34/2790b lim: 100 exec/s: 59 rss: 70Mb L: 81/96 MS: 2 PersAutoDict-CrossOver- DE: "\001\000\000\032"- 00:07:21.107 [2024-07-21 11:30:50.353308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:0 nsid:0 00:07:21.107 [2024-07-21 11:30:50.353335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.353382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.107 [2024-07-21 11:30:50.353396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.353454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.107 [2024-07-21 11:30:50.353469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.353519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.107 [2024-07-21 11:30:50.353534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.107 #60 NEW cov: 11786 ft: 14131 corp: 35/2876b lim: 100 exec/s: 60 rss: 70Mb L: 86/96 MS: 1 ShuffleBytes- 00:07:21.107 [2024-07-21 11:30:50.393106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.107 [2024-07-21 11:30:50.393133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.393182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.107 [2024-07-21 11:30:50.393197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.107 #61 NEW cov: 11786 ft: 14433 corp: 36/2916b lim: 100 exec/s: 61 rss: 70Mb L: 40/96 MS: 1 InsertRepeatedBytes- 00:07:21.107 [2024-07-21 11:30:50.433455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.107 [2024-07-21 11:30:50.433484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.433530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.107 [2024-07-21 11:30:50.433544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.433600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.107 [2024-07-21 11:30:50.433615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.433672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.107 [2024-07-21 11:30:50.433685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.107 #62 NEW cov: 11786 ft: 14439 corp: 37/3008b lim: 100 exec/s: 62 rss: 70Mb L: 92/96 MS: 1 CopyPart- 00:07:21.107 [2024-07-21 11:30:50.473451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.107 [2024-07-21 11:30:50.473481] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.473522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.107 [2024-07-21 11:30:50.473535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.473590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.107 [2024-07-21 11:30:50.473616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.107 #63 NEW cov: 11786 ft: 14460 corp: 38/3075b lim: 100 exec/s: 63 rss: 70Mb L: 67/96 MS: 1 EraseBytes- 00:07:21.107 [2024-07-21 11:30:50.513750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.107 [2024-07-21 11:30:50.513778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.513820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.107 [2024-07-21 11:30:50.513832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.513887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.107 [2024-07-21 11:30:50.513902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.107 [2024-07-21 11:30:50.513957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.107 [2024-07-21 11:30:50.513970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.365 #64 NEW cov: 11786 ft: 14469 corp: 39/3156b lim: 100 exec/s: 64 rss: 70Mb L: 81/96 MS: 1 ShuffleBytes- 00:07:21.365 [2024-07-21 11:30:50.553872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.365 [2024-07-21 11:30:50.553900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.365 [2024-07-21 11:30:50.553940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.365 [2024-07-21 11:30:50.553955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.365 [2024-07-21 11:30:50.554011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.365 [2024-07-21 11:30:50.554026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.554081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.366 [2024-07-21 11:30:50.554097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.366 #65 NEW cov: 11786 ft: 14476 corp: 
40/3248b lim: 100 exec/s: 65 rss: 70Mb L: 92/96 MS: 1 ChangeBit- 00:07:21.366 [2024-07-21 11:30:50.593941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.366 [2024-07-21 11:30:50.593969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.594027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.366 [2024-07-21 11:30:50.594042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.594096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.366 [2024-07-21 11:30:50.594112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.594167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.366 [2024-07-21 11:30:50.594180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.366 #66 NEW cov: 11786 ft: 14485 corp: 41/3337b lim: 100 exec/s: 66 rss: 70Mb L: 89/96 MS: 1 InsertRepeatedBytes- 00:07:21.366 [2024-07-21 11:30:50.633856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.366 [2024-07-21 11:30:50.633885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.633940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.366 [2024-07-21 11:30:50.633956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.366 #67 NEW cov: 11786 ft: 14489 corp: 42/3382b lim: 100 exec/s: 67 rss: 70Mb L: 45/96 MS: 1 EraseBytes- 00:07:21.366 [2024-07-21 11:30:50.674215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.366 [2024-07-21 11:30:50.674243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.674278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.366 [2024-07-21 11:30:50.674294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.674349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.366 [2024-07-21 11:30:50.674365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.674421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.366 [2024-07-21 11:30:50.674436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.366 #68 NEW cov: 11786 ft: 14498 corp: 43/3463b lim: 100 exec/s: 68 rss: 70Mb L: 81/96 MS: 1 ChangeBit- 
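The MS: field names the mutators libFuzzer chained to derive each new input (ChangeBit, ShuffleBytes, EraseBytes, CrossOver, and so on); entries tagged PersAutoDict- DE: "\001\000\000\032" mean the persistent auto-dictionary replayed a byte sequence learned earlier in the run, which is why the dictionary summary that follows reports # Uses: 6 for that sequence. A rough tally of which mutators fired during a run, again assuming the output was saved to a hypothetical build.log, is:

  grep -oE 'MS: [0-9]+ [A-Za-z-]+' build.log | awk '{ print $3 }' | tr '-' '\n' | sed '/^$/d' | sort | uniq -c | sort -rn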
00:07:21.366 [2024-07-21 11:30:50.714200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.366 [2024-07-21 11:30:50.714230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.714259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.366 [2024-07-21 11:30:50.714275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.366 [2024-07-21 11:30:50.714331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.366 [2024-07-21 11:30:50.714346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.366 #69 NEW cov: 11786 ft: 14510 corp: 44/3542b lim: 100 exec/s: 34 rss: 70Mb L: 79/96 MS: 1 InsertByte- 00:07:21.366 #69 DONE cov: 11786 ft: 14510 corp: 44/3542b lim: 100 exec/s: 34 rss: 70Mb 00:07:21.366 ###### Recommended dictionary. ###### 00:07:21.366 "\001\000\000\032" # Uses: 6 00:07:21.366 ###### End of recommended dictionary. ###### 00:07:21.366 Done 69 runs in 2 second(s) 00:07:21.623 11:30:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:07:21.623 11:30:50 -- ../common.sh@72 -- # (( i++ )) 00:07:21.623 11:30:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:21.623 11:30:50 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:07:21.623 11:30:50 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:07:21.623 11:30:50 -- nvmf/run.sh@24 -- # local timen=1 00:07:21.623 11:30:50 -- nvmf/run.sh@25 -- # local core=0x1 00:07:21.623 11:30:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:21.623 11:30:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:07:21.623 11:30:50 -- nvmf/run.sh@29 -- # printf %02d 19 00:07:21.623 11:30:50 -- nvmf/run.sh@29 -- # port=4419 00:07:21.623 11:30:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:21.623 11:30:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:07:21.623 11:30:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:21.623 11:30:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:07:21.623 [2024-07-21 11:30:50.898438] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:21.623 [2024-07-21 11:30:50.898538] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2067772 ] 00:07:21.623 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.881 [2024-07-21 11:30:51.077812] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.881 [2024-07-21 11:30:51.097943] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:21.881 [2024-07-21 11:30:51.098069] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.881 [2024-07-21 11:30:51.149556] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:21.881 [2024-07-21 11:30:51.165901] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:07:21.881 INFO: Running with entropic power schedule (0xFF, 100). 00:07:21.881 INFO: Seed: 1666169657 00:07:21.881 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:21.881 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:21.881 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:21.881 INFO: A corpus is not provided, starting from an empty corpus 00:07:21.881 #2 INITED exec/s: 0 rss: 61Mb 00:07:21.881 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:21.881 This may also happen if the target rejected all inputs we tried so far 00:07:21.881 [2024-07-21 11:30:51.211112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:21.881 [2024-07-21 11:30:51.211142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.881 [2024-07-21 11:30:51.211193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:21.881 [2024-07-21 11:30:51.211209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.881 [2024-07-21 11:30:51.211258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:21.882 [2024-07-21 11:30:51.211273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.139 NEW_FUNC[1/670]: 0x4bec40 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:07:22.139 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:22.139 #6 NEW cov: 11537 ft: 11538 corp: 2/37b lim: 50 exec/s: 0 rss: 68Mb L: 36/36 MS: 4 CrossOver-InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:22.139 [2024-07-21 11:30:51.521919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:22.139 [2024-07-21 11:30:51.521955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.139 
[2024-07-21 11:30:51.521999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.139 [2024-07-21 11:30:51.522015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.139 [2024-07-21 11:30:51.522065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.139 [2024-07-21 11:30:51.522081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.140 [2024-07-21 11:30:51.522130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28643 00:07:22.140 [2024-07-21 11:30:51.522145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.140 #7 NEW cov: 11650 ft: 12363 corp: 3/81b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 CMP- DE: "\377.\217\201o\3428^"- 00:07:22.398 [2024-07-21 11:30:51.571981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021212200156 len:56541 00:07:22.398 [2024-07-21 11:30:51.572012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.572039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.398 [2024-07-21 11:30:51.572055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.572103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.398 [2024-07-21 11:30:51.572119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.572168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28643 00:07:22.398 [2024-07-21 11:30:51.572184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.398 #8 NEW cov: 11656 ft: 12504 corp: 4/125b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 ChangeByte- 00:07:22.398 [2024-07-21 11:30:51.612110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319395036 len:1 00:07:22.398 [2024-07-21 11:30:51.612139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.612168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:22.398 [2024-07-21 11:30:51.612182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.612231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 
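Each finding above is printed as a pair: nvme_io_qpair_print_command dumps the submitted command (opcode name, sqid/cid, nsid, LBA, length), and spdk_nvme_print_completion dumps the completion the target returned. The status "INVALID NAMESPACE OR FORMAT (00/0b)" decodes as status code type 0x0 (generic command status) with status code 0x0b (Invalid Namespace or Format), the expected rejection for the random nsid and LBA values the fuzzer submits; dnr:1 marks the completion Do Not Retry. To see which opcodes a run exercised, a tally over the command prints works; again an illustrative build.log path is assumed:

  # Count command prints per opcode name; the opcode's hex value, when
  # present, appears in parentheses before "sqid:".
  grep -oE '\*NOTICE\*: [A-Z ]+( \([0-9a-f]+\))? sqid:' build.log | sort | uniq -c | sort -rn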
00:07:22.398 [2024-07-21 11:30:51.612247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.612297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:65327 00:07:22.398 [2024-07-21 11:30:51.612312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.398 #9 NEW cov: 11741 ft: 12824 corp: 5/173b lim: 50 exec/s: 0 rss: 69Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:07:22.398 [2024-07-21 11:30:51.652220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:22.398 [2024-07-21 11:30:51.652247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.652281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.398 [2024-07-21 11:30:51.652295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.652343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024368479452 len:56541 00:07:22.398 [2024-07-21 11:30:51.652358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.652405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28643 00:07:22.398 [2024-07-21 11:30:51.652418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.398 #10 NEW cov: 11741 ft: 12879 corp: 6/217b lim: 50 exec/s: 0 rss: 69Mb L: 44/48 MS: 1 ChangeBit- 00:07:22.398 [2024-07-21 11:30:51.692354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319395036 len:1 00:07:22.398 [2024-07-21 11:30:51.692382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.692414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:22.398 [2024-07-21 11:30:51.692430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.692498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376867898 len:56541 00:07:22.398 [2024-07-21 11:30:51.692518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.692567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56576 00:07:22.398 [2024-07-21 11:30:51.692584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 
dnr:1 00:07:22.398 #11 NEW cov: 11741 ft: 13066 corp: 7/266b lim: 50 exec/s: 0 rss: 69Mb L: 49/49 MS: 1 InsertByte- 00:07:22.398 [2024-07-21 11:30:51.732472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319395036 len:1 00:07:22.398 [2024-07-21 11:30:51.732500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.732534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:22.398 [2024-07-21 11:30:51.732549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.732598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.398 [2024-07-21 11:30:51.732614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.732661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:65327 00:07:22.398 [2024-07-21 11:30:51.732677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.398 #12 NEW cov: 11741 ft: 13118 corp: 8/314b lim: 50 exec/s: 0 rss: 69Mb L: 48/49 MS: 1 ShuffleBytes- 00:07:22.398 [2024-07-21 11:30:51.772554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:22.398 [2024-07-21 11:30:51.772583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.772617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.398 [2024-07-21 11:30:51.772632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.772680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.398 [2024-07-21 11:30:51.772695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.772760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28643 00:07:22.398 [2024-07-21 11:30:51.772776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.398 #13 NEW cov: 11741 ft: 13151 corp: 9/358b lim: 50 exec/s: 0 rss: 69Mb L: 44/49 MS: 1 ChangeASCIIInt- 00:07:22.398 [2024-07-21 11:30:51.812812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:22.398 [2024-07-21 11:30:51.812840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.812880] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9404222470465838210 len:33501 00:07:22.398 [2024-07-21 11:30:51.812895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.812943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914837474621054172 len:56541 00:07:22.398 [2024-07-21 11:30:51.812961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.813010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56541 00:07:22.398 [2024-07-21 11:30:51.813025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.398 [2024-07-21 11:30:51.813073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:8062068263605931905 len:2571 00:07:22.398 [2024-07-21 11:30:51.813088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:22.656 #14 NEW cov: 11741 ft: 13241 corp: 10/408b lim: 50 exec/s: 0 rss: 69Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:07:22.656 [2024-07-21 11:30:51.852789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021212200156 len:56541 00:07:22.656 [2024-07-21 11:30:51.852817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:51.852849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.656 [2024-07-21 11:30:51.852863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:51.852912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.656 [2024-07-21 11:30:51.852926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:51.852976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28643 00:07:22.656 [2024-07-21 11:30:51.852991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.656 #15 NEW cov: 11741 ft: 13275 corp: 11/452b lim: 50 exec/s: 0 rss: 69Mb L: 44/50 MS: 1 ChangeASCIIInt- 00:07:22.656 [2024-07-21 11:30:51.892930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319395036 len:1 00:07:22.656 [2024-07-21 11:30:51.892959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:51.892993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:22.656 [2024-07-21 11:30:51.893008] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:51.893056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376867904 len:56541 00:07:22.656 [2024-07-21 11:30:51.893071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:51.893121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56576 00:07:22.656 [2024-07-21 11:30:51.893136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.656 #16 NEW cov: 11741 ft: 13297 corp: 12/501b lim: 50 exec/s: 0 rss: 69Mb L: 49/50 MS: 1 ChangeBinInt- 00:07:22.656 [2024-07-21 11:30:51.932724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:07:22.656 [2024-07-21 11:30:51.932752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.656 #17 NEW cov: 11741 ft: 13697 corp: 13/516b lim: 50 exec/s: 0 rss: 69Mb L: 15/50 MS: 1 InsertRepeatedBytes- 00:07:22.656 [2024-07-21 11:30:51.972953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:22.656 [2024-07-21 11:30:51.972982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:51.973032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56547 00:07:22.656 [2024-07-21 11:30:51.973049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.656 #18 NEW cov: 11741 ft: 13939 corp: 14/540b lim: 50 exec/s: 0 rss: 69Mb L: 24/50 MS: 1 EraseBytes- 00:07:22.656 [2024-07-21 11:30:52.013168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021256305884 len:56541 00:07:22.656 [2024-07-21 11:30:52.013197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:52.013235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.656 [2024-07-21 11:30:52.013249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:52.013301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.656 [2024-07-21 11:30:52.013316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.656 #19 NEW cov: 11741 ft: 13966 corp: 15/576b lim: 50 exec/s: 0 rss: 69Mb L: 36/50 MS: 1 ChangeBit- 00:07:22.656 [2024-07-21 11:30:52.053357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 
len:56541 00:07:22.656 [2024-07-21 11:30:52.053386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:52.053414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.656 [2024-07-21 11:30:52.053428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.656 [2024-07-21 11:30:52.053454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.657 [2024-07-21 11:30:52.053466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.657 [2024-07-21 11:30:52.053482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28643 00:07:22.657 [2024-07-21 11:30:52.053497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.657 #20 NEW cov: 11741 ft: 13983 corp: 16/620b lim: 50 exec/s: 0 rss: 69Mb L: 44/50 MS: 1 ShuffleBytes- 00:07:22.914 [2024-07-21 11:30:52.093493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021212200156 len:56541 00:07:22.914 [2024-07-21 11:30:52.093522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.093556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.914 [2024-07-21 11:30:52.093572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.093625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.914 [2024-07-21 11:30:52.093642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.093692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28643 00:07:22.914 [2024-07-21 11:30:52.093708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.914 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:22.914 #21 NEW cov: 11764 ft: 14023 corp: 17/664b lim: 50 exec/s: 0 rss: 70Mb L: 44/50 MS: 1 ChangeASCIIInt- 00:07:22.914 [2024-07-21 11:30:52.133591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319395036 len:1 00:07:22.914 [2024-07-21 11:30:52.133619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.133652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:22.914 [2024-07-21 
11:30:52.133667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.133716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:257 00:07:22.914 [2024-07-21 11:30:52.133733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.133783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:242837450915840 len:65327 00:07:22.914 [2024-07-21 11:30:52.133797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.914 #22 NEW cov: 11764 ft: 14050 corp: 18/712b lim: 50 exec/s: 0 rss: 70Mb L: 48/50 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:22.914 [2024-07-21 11:30:52.173695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:22.914 [2024-07-21 11:30:52.173723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.173756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9404222470465838210 len:33501 00:07:22.914 [2024-07-21 11:30:52.173771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.173821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376867872 len:23773 00:07:22.914 [2024-07-21 11:30:52.173837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.914 #23 NEW cov: 11764 ft: 14056 corp: 19/751b lim: 50 exec/s: 23 rss: 70Mb L: 39/50 MS: 1 CrossOver- 00:07:22.914 [2024-07-21 11:30:52.213844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319350492 len:1 00:07:22.914 [2024-07-21 11:30:52.213873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.213905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:22.914 [2024-07-21 11:30:52.213919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.213969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.914 [2024-07-21 11:30:52.213987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.214037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:65327 00:07:22.914 [2024-07-21 11:30:52.214051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:07:22.914 #24 NEW cov: 11764 ft: 14066 corp: 20/799b lim: 50 exec/s: 24 rss: 70Mb L: 48/50 MS: 1 ChangeByte- 00:07:22.914 [2024-07-21 11:30:52.253974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319350492 len:1 00:07:22.914 [2024-07-21 11:30:52.254001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.254035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:22.914 [2024-07-21 11:30:52.254049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.254097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:22.914 [2024-07-21 11:30:52.254113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.254162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:65327 00:07:22.914 [2024-07-21 11:30:52.254177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.914 #25 NEW cov: 11764 ft: 14093 corp: 21/848b lim: 50 exec/s: 25 rss: 70Mb L: 49/50 MS: 1 CrossOver- 00:07:22.914 [2024-07-21 11:30:52.294077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:22.914 [2024-07-21 11:30:52.294105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.294135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:22.914 [2024-07-21 11:30:52.294149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.294198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914598330833624284 len:56541 00:07:22.914 [2024-07-21 11:30:52.294213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.914 [2024-07-21 11:30:52.294262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15924498002826026204 len:33136 00:07:22.914 [2024-07-21 11:30:52.294277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.914 #26 NEW cov: 11764 ft: 14115 corp: 22/893b lim: 50 exec/s: 26 rss: 70Mb L: 45/50 MS: 1 InsertByte- 00:07:22.914 [2024-07-21 11:30:52.333871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:16777216 len:2828 00:07:22.914 [2024-07-21 11:30:52.333899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.172 #29 NEW cov: 11764 ft: 14182 corp: 23/903b lim: 50 exec/s: 29 
rss: 70Mb L: 10/50 MS: 3 ChangeBit-CopyPart-PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:23.172 [2024-07-21 11:30:52.374336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319350492 len:1 00:07:23.172 [2024-07-21 11:30:52.374367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.374394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:23.172 [2024-07-21 11:30:52.374407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.374457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:23.172 [2024-07-21 11:30:52.374473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.374523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:28473 00:07:23.172 [2024-07-21 11:30:52.374537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.172 #30 NEW cov: 11764 ft: 14208 corp: 24/951b lim: 50 exec/s: 30 rss: 70Mb L: 48/50 MS: 1 ShuffleBytes- 00:07:23.172 [2024-07-21 11:30:52.414421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021212200156 len:56541 00:07:23.172 [2024-07-21 11:30:52.414455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.414482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:23.172 [2024-07-21 11:30:52.414498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.414547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:23.172 [2024-07-21 11:30:52.414562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.414611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28643 00:07:23.172 [2024-07-21 11:30:52.414626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.172 #31 NEW cov: 11764 ft: 14214 corp: 25/995b lim: 50 exec/s: 31 rss: 70Mb L: 44/50 MS: 1 CopyPart- 00:07:23.172 [2024-07-21 11:30:52.454585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:23.172 [2024-07-21 11:30:52.454623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.454656] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:23.172 [2024-07-21 11:30:52.454671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.454719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024368479452 len:56541 00:07:23.172 [2024-07-21 11:30:52.454735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.172 [2024-07-21 11:30:52.454784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18387792116519328988 len:28659 00:07:23.172 [2024-07-21 11:30:52.454800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.173 #32 NEW cov: 11764 ft: 14235 corp: 26/1039b lim: 50 exec/s: 32 rss: 70Mb L: 44/50 MS: 1 ChangeBit- 00:07:23.173 [2024-07-21 11:30:52.494660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319395036 len:1 00:07:23.173 [2024-07-21 11:30:52.494691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.173 [2024-07-21 11:30:52.494718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:23.173 [2024-07-21 11:30:52.494733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.173 [2024-07-21 11:30:52.494782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:23.173 [2024-07-21 11:30:52.494798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.173 [2024-07-21 11:30:52.494848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:65327 00:07:23.173 [2024-07-21 11:30:52.494861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.173 #33 NEW cov: 11764 ft: 14236 corp: 27/1087b lim: 50 exec/s: 33 rss: 70Mb L: 48/50 MS: 1 ShuffleBytes- 00:07:23.173 [2024-07-21 11:30:52.524766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021212200156 len:56541 00:07:23.173 [2024-07-21 11:30:52.524793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.173 [2024-07-21 11:30:52.524827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:23.173 [2024-07-21 11:30:52.524842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.173 [2024-07-21 11:30:52.524892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15924728281792568540 len:56541 00:07:23.173 [2024-07-21 11:30:52.524906] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.173 [2024-07-21 11:30:52.524956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838174700723420 len:11920 00:07:23.173 [2024-07-21 11:30:52.524970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.173 #34 NEW cov: 11764 ft: 14307 corp: 28/1134b lim: 50 exec/s: 34 rss: 70Mb L: 47/50 MS: 1 InsertRepeatedBytes- 00:07:23.173 [2024-07-21 11:30:52.564685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:23.173 [2024-07-21 11:30:52.564713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.173 [2024-07-21 11:30:52.564751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56547 00:07:23.173 [2024-07-21 11:30:52.564766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.173 #35 NEW cov: 11764 ft: 14344 corp: 29/1158b lim: 50 exec/s: 35 rss: 70Mb L: 24/50 MS: 1 ChangeByte- 00:07:23.430 [2024-07-21 11:30:52.604875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021256305884 len:56541 00:07:23.430 [2024-07-21 11:30:52.604902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.604940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:23.430 [2024-07-21 11:30:52.604955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.605008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8062068263605931905 len:56541 00:07:23.430 [2024-07-21 11:30:52.605023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.430 #36 NEW cov: 11764 ft: 14353 corp: 30/1194b lim: 50 exec/s: 36 rss: 70Mb L: 36/50 MS: 1 PersAutoDict- DE: "\377.\217\201o\3428^"- 00:07:23.430 [2024-07-21 11:30:52.644797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:16777216 len:2833 00:07:23.430 [2024-07-21 11:30:52.644826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.430 #37 NEW cov: 11764 ft: 14371 corp: 31/1205b lim: 50 exec/s: 37 rss: 70Mb L: 11/50 MS: 1 InsertByte- 00:07:23.430 [2024-07-21 11:30:52.685130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021256305884 len:56541 00:07:23.430 [2024-07-21 11:30:52.685158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.685190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 
nsid:0 lba:15914838021507964124 len:56541 00:07:23.430 [2024-07-21 11:30:52.685205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.685254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8062068263605931905 len:56541 00:07:23.430 [2024-07-21 11:30:52.685269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.430 #38 NEW cov: 11764 ft: 14415 corp: 32/1241b lim: 50 exec/s: 38 rss: 70Mb L: 36/50 MS: 1 ChangeByte- 00:07:23.430 [2024-07-21 11:30:52.725247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021256305884 len:56541 00:07:23.430 [2024-07-21 11:30:52.725276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.725307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838021507964124 len:56541 00:07:23.430 [2024-07-21 11:30:52.725321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.725370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3355042565436903423 len:57913 00:07:23.430 [2024-07-21 11:30:52.725386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.430 #39 NEW cov: 11764 ft: 14450 corp: 33/1277b lim: 50 exec/s: 39 rss: 70Mb L: 36/50 MS: 1 PersAutoDict- DE: "\377.\217\201o\3428^"- 00:07:23.430 [2024-07-21 11:30:52.765403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319395036 len:1 00:07:23.430 [2024-07-21 11:30:52.765431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.765479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:23.430 [2024-07-21 11:30:52.765495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.765544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15866142853405924572 len:56322 00:07:23.430 [2024-07-21 11:30:52.765559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.765610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:944892805120 len:56576 00:07:23.430 [2024-07-21 11:30:52.765624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.430 #40 NEW cov: 11764 ft: 14457 corp: 34/1326b lim: 50 exec/s: 40 rss: 70Mb L: 49/50 MS: 1 InsertByte- 00:07:23.430 [2024-07-21 11:30:52.805271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4251398048405520384 len:1 
00:07:23.430 [2024-07-21 11:30:52.805299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.430 #41 NEW cov: 11764 ft: 14469 corp: 35/1341b lim: 50 exec/s: 41 rss: 70Mb L: 15/50 MS: 1 ChangeByte- 00:07:23.430 [2024-07-21 11:30:52.845693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:23.430 [2024-07-21 11:30:52.845721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.430 [2024-07-21 11:30:52.845754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:07:23.431 [2024-07-21 11:30:52.845769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.431 [2024-07-21 11:30:52.845816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:07:23.431 [2024-07-21 11:30:52.845833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.431 [2024-07-21 11:30:52.845880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15924497999856458972 len:33136 00:07:23.431 [2024-07-21 11:30:52.845895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.688 #42 NEW cov: 11764 ft: 14473 corp: 36/1386b lim: 50 exec/s: 42 rss: 70Mb L: 45/50 MS: 1 InsertByte- 00:07:23.688 [2024-07-21 11:30:52.885509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:16777216 len:11025 00:07:23.688 [2024-07-21 11:30:52.885537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.688 #43 NEW cov: 11764 ft: 14481 corp: 37/1397b lim: 50 exec/s: 43 rss: 70Mb L: 11/50 MS: 1 ChangeBit- 00:07:23.688 [2024-07-21 11:30:52.925702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:23.688 [2024-07-21 11:30:52.925730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:52.925767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15862847617057479900 len:56547 00:07:23.688 [2024-07-21 11:30:52.925782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.688 #44 NEW cov: 11764 ft: 14485 corp: 38/1421b lim: 50 exec/s: 44 rss: 70Mb L: 24/50 MS: 1 ChangeBinInt- 00:07:23.688 [2024-07-21 11:30:52.965756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:16777216 len:2827 00:07:23.688 [2024-07-21 11:30:52.965784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:52.965831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 
lba:184549376 len:1 00:07:23.688 [2024-07-21 11:30:52.965847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.688 #45 NEW cov: 11764 ft: 14496 corp: 39/1446b lim: 50 exec/s: 45 rss: 70Mb L: 25/50 MS: 1 CrossOver- 00:07:23.688 [2024-07-21 11:30:53.005919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:23.688 [2024-07-21 11:30:53.005947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:53.005980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914843917071998172 len:24075 00:07:23.688 [2024-07-21 11:30:53.005996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.688 #46 NEW cov: 11764 ft: 14518 corp: 40/1467b lim: 50 exec/s: 46 rss: 71Mb L: 21/50 MS: 1 EraseBytes- 00:07:23.688 [2024-07-21 11:30:53.046235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914837076319350492 len:1 00:07:23.688 [2024-07-21 11:30:53.046264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:53.046299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:07:23.688 [2024-07-21 11:30:53.046314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:53.046363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56029 00:07:23.688 [2024-07-21 11:30:53.046379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:53.046426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:65327 00:07:23.688 [2024-07-21 11:30:53.046440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.688 #47 NEW cov: 11764 ft: 14532 corp: 41/1516b lim: 50 exec/s: 47 rss: 71Mb L: 49/50 MS: 1 ChangeBinInt- 00:07:23.688 [2024-07-21 11:30:53.086354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7306357455619567205 len:25958 00:07:23.688 [2024-07-21 11:30:53.086382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:53.086413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7306357456645743973 len:25958 00:07:23.688 [2024-07-21 11:30:53.086428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:53.086483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:7306357456645743973 len:25958 00:07:23.688 [2024-07-21 11:30:53.086498] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.688 [2024-07-21 11:30:53.086549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:7306357456645743973 len:25958 00:07:23.688 [2024-07-21 11:30:53.086564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.947 #50 NEW cov: 11764 ft: 14540 corp: 42/1565b lim: 50 exec/s: 50 rss: 71Mb L: 49/50 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:07:23.947 [2024-07-21 11:30:53.126169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:07:23.947 [2024-07-21 11:30:53.126198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.947 #51 NEW cov: 11764 ft: 14542 corp: 43/1581b lim: 50 exec/s: 51 rss: 71Mb L: 16/50 MS: 1 InsertByte- 00:07:23.947 [2024-07-21 11:30:53.166295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167787264 len:1 00:07:23.947 [2024-07-21 11:30:53.166324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.947 #52 NEW cov: 11764 ft: 14544 corp: 44/1596b lim: 50 exec/s: 52 rss: 71Mb L: 15/50 MS: 1 CopyPart- 00:07:23.947 [2024-07-21 11:30:53.206591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021222751452 len:56541 00:07:23.947 [2024-07-21 11:30:53.206619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.947 [2024-07-21 11:30:53.206650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9404222470465838210 len:33501 00:07:23.947 [2024-07-21 11:30:53.206664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.947 [2024-07-21 11:30:53.206713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15853195004477234208 len:23773 00:07:23.947 [2024-07-21 11:30:53.206729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.947 #53 NEW cov: 11764 ft: 14545 corp: 45/1635b lim: 50 exec/s: 26 rss: 71Mb L: 39/50 MS: 1 ChangeByte- 00:07:23.947 #53 DONE cov: 11764 ft: 14545 corp: 45/1635b lim: 50 exec/s: 26 rss: 71Mb 00:07:23.947 ###### Recommended dictionary. ###### 00:07:23.947 "\377.\217\201o\3428^" # Uses: 2 00:07:23.947 "\001\000\000\000\000\000\000\000" # Uses: 1 00:07:23.947 ###### End of recommended dictionary. 
###### 00:07:23.947 Done 53 runs in 2 second(s) 00:07:23.947 11:30:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:07:23.947 11:30:53 -- ../common.sh@72 -- # (( i++ )) 00:07:23.947 11:30:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:23.947 11:30:53 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:07:23.947 11:30:53 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:07:23.947 11:30:53 -- nvmf/run.sh@24 -- # local timen=1 00:07:23.947 11:30:53 -- nvmf/run.sh@25 -- # local core=0x1 00:07:23.947 11:30:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:23.947 11:30:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:07:23.947 11:30:53 -- nvmf/run.sh@29 -- # printf %02d 20 00:07:23.947 11:30:53 -- nvmf/run.sh@29 -- # port=4420 00:07:23.947 11:30:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:23.947 11:30:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:07:23.947 11:30:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:23.947 11:30:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:07:24.205 [2024-07-21 11:30:53.388709] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:24.205 [2024-07-21 11:30:53.388777] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2068097 ] 00:07:24.205 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.205 [2024-07-21 11:30:53.571416] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.205 [2024-07-21 11:30:53.592185] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:24.205 [2024-07-21 11:30:53.592310] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.462 [2024-07-21 11:30:53.644000] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:24.462 [2024-07-21 11:30:53.660349] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:07:24.462 INFO: Running with entropic power schedule (0xFF, 100). 00:07:24.462 INFO: Seed: 4161171181 00:07:24.462 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:24.462 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:24.462 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:24.462 INFO: A corpus is not provided, starting from an empty corpus 00:07:24.462 #2 INITED exec/s: 0 rss: 59Mb 00:07:24.462 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:24.462 This may also happen if the target rejected all inputs we tried so far 00:07:24.462 [2024-07-21 11:30:53.726409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.462 [2024-07-21 11:30:53.726449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.462 [2024-07-21 11:30:53.726575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.462 [2024-07-21 11:30:53.726596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.720 NEW_FUNC[1/672]: 0x4c0800 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:07:24.720 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:24.720 #4 NEW cov: 11595 ft: 11589 corp: 2/43b lim: 90 exec/s: 0 rss: 67Mb L: 42/42 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:24.720 [2024-07-21 11:30:54.067420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.720 [2024-07-21 11:30:54.067483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.720 [2024-07-21 11:30:54.067631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.720 [2024-07-21 11:30:54.067658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.720 #5 NEW cov: 11708 ft: 12206 corp: 3/85b lim: 90 exec/s: 0 rss: 67Mb L: 42/42 MS: 1 CopyPart- 00:07:24.720 [2024-07-21 11:30:54.117140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.720 [2024-07-21 11:30:54.117169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.720 [2024-07-21 11:30:54.117295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.720 [2024-07-21 11:30:54.117313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.720 #6 NEW cov: 11714 ft: 12476 corp: 4/128b lim: 90 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 InsertByte- 00:07:24.979 [2024-07-21 11:30:54.157983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.979 [2024-07-21 11:30:54.158016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.979 [2024-07-21 11:30:54.158082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.979 [2024-07-21 11:30:54.158101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.979 [2024-07-21 11:30:54.158219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:24.979 [2024-07-21 11:30:54.158241] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.979 [2024-07-21 11:30:54.158370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:24.979 [2024-07-21 11:30:54.158395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.979 #7 NEW cov: 11799 ft: 13109 corp: 5/213b lim: 90 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 InsertRepeatedBytes- 00:07:24.979 [2024-07-21 11:30:54.207703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.979 [2024-07-21 11:30:54.207734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.979 [2024-07-21 11:30:54.207805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.979 [2024-07-21 11:30:54.207825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.979 [2024-07-21 11:30:54.207946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:24.979 [2024-07-21 11:30:54.207968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.979 [2024-07-21 11:30:54.208085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:24.979 [2024-07-21 11:30:54.208105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.979 #8 NEW cov: 11799 ft: 13193 corp: 6/298b lim: 90 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 ChangeByte- 00:07:24.979 [2024-07-21 11:30:54.247730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.979 [2024-07-21 11:30:54.247760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.979 [2024-07-21 11:30:54.247872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.979 [2024-07-21 11:30:54.247895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.979 #9 NEW cov: 11799 ft: 13307 corp: 7/339b lim: 90 exec/s: 0 rss: 68Mb L: 41/85 MS: 1 InsertRepeatedBytes- 00:07:24.979 [2024-07-21 11:30:54.288393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.979 [2024-07-21 11:30:54.288424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.979 [2024-07-21 11:30:54.288478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.980 [2024-07-21 11:30:54.288501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.980 [2024-07-21 11:30:54.288625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:24.980 [2024-07-21 11:30:54.288648] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.980 [2024-07-21 11:30:54.288767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:24.980 [2024-07-21 11:30:54.288785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.980 #15 NEW cov: 11799 ft: 13350 corp: 8/424b lim: 90 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 ShuffleBytes- 00:07:24.980 [2024-07-21 11:30:54.328404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.980 [2024-07-21 11:30:54.328435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.980 [2024-07-21 11:30:54.328495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.980 [2024-07-21 11:30:54.328518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.980 [2024-07-21 11:30:54.328646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:24.980 [2024-07-21 11:30:54.328667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.980 [2024-07-21 11:30:54.328783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:24.980 [2024-07-21 11:30:54.328806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.980 #16 NEW cov: 11799 ft: 13400 corp: 9/509b lim: 90 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 ChangeByte- 00:07:24.980 [2024-07-21 11:30:54.368166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:24.980 [2024-07-21 11:30:54.368195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.980 [2024-07-21 11:30:54.368246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:24.980 [2024-07-21 11:30:54.368266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.980 [2024-07-21 11:30:54.368383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:24.980 [2024-07-21 11:30:54.368406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.980 [2024-07-21 11:30:54.368530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:24.980 [2024-07-21 11:30:54.368551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.980 #17 NEW cov: 11799 ft: 13449 corp: 10/594b lim: 90 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 ChangeByte- 00:07:25.238 [2024-07-21 11:30:54.408181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.238 [2024-07-21 11:30:54.408214] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.238 [2024-07-21 11:30:54.408324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.238 [2024-07-21 11:30:54.408344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.238 #18 NEW cov: 11799 ft: 13543 corp: 11/636b lim: 90 exec/s: 0 rss: 68Mb L: 42/85 MS: 1 ChangeBinInt- 00:07:25.238 [2024-07-21 11:30:54.448852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.238 [2024-07-21 11:30:54.448885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.238 [2024-07-21 11:30:54.448979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.238 [2024-07-21 11:30:54.448999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.238 [2024-07-21 11:30:54.449124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:25.238 [2024-07-21 11:30:54.449144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.239 [2024-07-21 11:30:54.449265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:25.239 [2024-07-21 11:30:54.449284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.239 #19 NEW cov: 11799 ft: 13608 corp: 12/710b lim: 90 exec/s: 0 rss: 68Mb L: 74/85 MS: 1 EraseBytes- 00:07:25.239 [2024-07-21 11:30:54.498447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.239 [2024-07-21 11:30:54.498494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.239 [2024-07-21 11:30:54.498597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.239 [2024-07-21 11:30:54.498616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.239 #20 NEW cov: 11799 ft: 13629 corp: 13/753b lim: 90 exec/s: 0 rss: 68Mb L: 43/85 MS: 1 InsertByte- 00:07:25.239 [2024-07-21 11:30:54.538139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.239 [2024-07-21 11:30:54.538171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.239 [2024-07-21 11:30:54.538286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.239 [2024-07-21 11:30:54.538308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.239 #21 NEW cov: 11799 ft: 13648 corp: 14/796b lim: 90 exec/s: 0 rss: 68Mb L: 43/85 MS: 1 ShuffleBytes- 00:07:25.239 [2024-07-21 11:30:54.578499] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.239 [2024-07-21 11:30:54.578526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.239 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:25.239 #22 NEW cov: 11822 ft: 14468 corp: 15/823b lim: 90 exec/s: 0 rss: 68Mb L: 27/85 MS: 1 EraseBytes- 00:07:25.239 [2024-07-21 11:30:54.618823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.239 [2024-07-21 11:30:54.618849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.239 [2024-07-21 11:30:54.618989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.239 [2024-07-21 11:30:54.619013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.239 #23 NEW cov: 11822 ft: 14502 corp: 16/866b lim: 90 exec/s: 0 rss: 68Mb L: 43/85 MS: 1 ChangeBit- 00:07:25.239 [2024-07-21 11:30:54.658864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.239 [2024-07-21 11:30:54.658899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.239 [2024-07-21 11:30:54.659033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.239 [2024-07-21 11:30:54.659059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.497 #24 NEW cov: 11822 ft: 14614 corp: 17/909b lim: 90 exec/s: 0 rss: 68Mb L: 43/85 MS: 1 CopyPart- 00:07:25.497 [2024-07-21 11:30:54.699033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.497 [2024-07-21 11:30:54.699066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.699164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.497 [2024-07-21 11:30:54.699187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.699309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:25.497 [2024-07-21 11:30:54.699342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.699471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:25.497 [2024-07-21 11:30:54.699491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.497 #25 NEW cov: 11822 ft: 14634 corp: 18/998b lim: 90 exec/s: 25 rss: 68Mb L: 89/89 MS: 1 InsertRepeatedBytes- 00:07:25.497 [2024-07-21 11:30:54.739017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.497 [2024-07-21 
11:30:54.739049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.739183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.497 [2024-07-21 11:30:54.739207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.497 #26 NEW cov: 11822 ft: 14660 corp: 19/1041b lim: 90 exec/s: 26 rss: 68Mb L: 43/89 MS: 1 CopyPart- 00:07:25.497 [2024-07-21 11:30:54.779358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.497 [2024-07-21 11:30:54.779389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.779449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.497 [2024-07-21 11:30:54.779475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.779591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:25.497 [2024-07-21 11:30:54.779612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.779740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:25.497 [2024-07-21 11:30:54.779762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.497 #27 NEW cov: 11822 ft: 14696 corp: 20/1120b lim: 90 exec/s: 27 rss: 69Mb L: 79/89 MS: 1 EraseBytes- 00:07:25.497 [2024-07-21 11:30:54.819779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.497 [2024-07-21 11:30:54.819811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.819877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.497 [2024-07-21 11:30:54.819895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.820019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:25.497 [2024-07-21 11:30:54.820037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.820157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:25.497 [2024-07-21 11:30:54.820182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.497 #28 NEW cov: 11822 ft: 14728 corp: 21/1194b lim: 90 exec/s: 28 rss: 69Mb L: 74/89 MS: 1 ChangeBit- 00:07:25.497 [2024-07-21 11:30:54.859206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.497 [2024-07-21 
11:30:54.859237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.497 #29 NEW cov: 11822 ft: 14765 corp: 22/1215b lim: 90 exec/s: 29 rss: 69Mb L: 21/89 MS: 1 EraseBytes- 00:07:25.497 [2024-07-21 11:30:54.900146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.497 [2024-07-21 11:30:54.900179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.900231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.497 [2024-07-21 11:30:54.900252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.900380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:25.497 [2024-07-21 11:30:54.900404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.497 [2024-07-21 11:30:54.900521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:25.497 [2024-07-21 11:30:54.900545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.498 #30 NEW cov: 11822 ft: 14777 corp: 23/1304b lim: 90 exec/s: 30 rss: 69Mb L: 89/89 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:25.756 [2024-07-21 11:30:54.939721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.756 [2024-07-21 11:30:54.939754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.756 [2024-07-21 11:30:54.939864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.756 [2024-07-21 11:30:54.939882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.756 #31 NEW cov: 11822 ft: 14798 corp: 24/1346b lim: 90 exec/s: 31 rss: 69Mb L: 42/89 MS: 1 CopyPart- 00:07:25.756 [2024-07-21 11:30:54.979835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.756 [2024-07-21 11:30:54.979864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.756 [2024-07-21 11:30:54.979922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.756 [2024-07-21 11:30:54.979944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.756 #32 NEW cov: 11822 ft: 14805 corp: 25/1389b lim: 90 exec/s: 32 rss: 69Mb L: 43/89 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:25.756 [2024-07-21 11:30:55.019350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.756 [2024-07-21 11:30:55.019375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.756 #33 NEW 
cov: 11822 ft: 14827 corp: 26/1416b lim: 90 exec/s: 33 rss: 69Mb L: 27/89 MS: 1 ChangeBit- 00:07:25.756 [2024-07-21 11:30:55.060325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.756 [2024-07-21 11:30:55.060359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.756 [2024-07-21 11:30:55.060445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.756 [2024-07-21 11:30:55.060466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.756 [2024-07-21 11:30:55.060587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:25.756 [2024-07-21 11:30:55.060607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.756 #34 NEW cov: 11822 ft: 15148 corp: 27/1481b lim: 90 exec/s: 34 rss: 69Mb L: 65/89 MS: 1 InsertRepeatedBytes- 00:07:25.756 [2024-07-21 11:30:55.100256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.756 [2024-07-21 11:30:55.100291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.756 [2024-07-21 11:30:55.100382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.756 [2024-07-21 11:30:55.100403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.756 [2024-07-21 11:30:55.100520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:25.756 [2024-07-21 11:30:55.100540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.756 [2024-07-21 11:30:55.100658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:25.756 [2024-07-21 11:30:55.100677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.756 #35 NEW cov: 11822 ft: 15160 corp: 28/1570b lim: 90 exec/s: 35 rss: 69Mb L: 89/89 MS: 1 ChangeBinInt- 00:07:25.756 [2024-07-21 11:30:55.139845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:25.756 [2024-07-21 11:30:55.139880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.756 [2024-07-21 11:30:55.140022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:25.756 [2024-07-21 11:30:55.140043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.756 #36 NEW cov: 11822 ft: 15172 corp: 29/1618b lim: 90 exec/s: 36 rss: 69Mb L: 48/89 MS: 1 EraseBytes- 00:07:26.015 [2024-07-21 11:30:55.181016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.015 [2024-07-21 11:30:55.181049] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.181101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.015 [2024-07-21 11:30:55.181122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.181250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:26.015 [2024-07-21 11:30:55.181274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.181394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:26.015 [2024-07-21 11:30:55.181415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.015 #37 NEW cov: 11822 ft: 15176 corp: 30/1704b lim: 90 exec/s: 37 rss: 69Mb L: 86/89 MS: 1 InsertByte- 00:07:26.015 [2024-07-21 11:30:55.230691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.015 [2024-07-21 11:30:55.230721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.230791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.015 [2024-07-21 11:30:55.230813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.230927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:26.015 [2024-07-21 11:30:55.230948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.231072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:26.015 [2024-07-21 11:30:55.231095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.015 #38 NEW cov: 11822 ft: 15186 corp: 31/1783b lim: 90 exec/s: 38 rss: 70Mb L: 79/89 MS: 1 CrossOver- 00:07:26.015 [2024-07-21 11:30:55.291024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.015 [2024-07-21 11:30:55.291056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.291171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.015 [2024-07-21 11:30:55.291192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.291313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:26.015 [2024-07-21 11:30:55.291331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:07:26.015 #39 NEW cov: 11822 ft: 15374 corp: 32/1837b lim: 90 exec/s: 39 rss: 70Mb L: 54/89 MS: 1 CrossOver- 00:07:26.015 [2024-07-21 11:30:55.330522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.015 [2024-07-21 11:30:55.330555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.330668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.015 [2024-07-21 11:30:55.330689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.371285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.015 [2024-07-21 11:30:55.371317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.371407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.015 [2024-07-21 11:30:55.371426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.371553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:26.015 [2024-07-21 11:30:55.371573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.015 #41 NEW cov: 11822 ft: 15380 corp: 33/1903b lim: 90 exec/s: 41 rss: 70Mb L: 66/89 MS: 2 PersAutoDict-CopyPart- DE: "\000\000\000\000"- 00:07:26.015 [2024-07-21 11:30:55.411083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.015 [2024-07-21 11:30:55.411114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.411229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.015 [2024-07-21 11:30:55.411253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.015 [2024-07-21 11:30:55.411379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:26.015 [2024-07-21 11:30:55.411396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.274 #42 NEW cov: 11822 ft: 15413 corp: 34/1970b lim: 90 exec/s: 42 rss: 70Mb L: 67/89 MS: 1 InsertByte- 00:07:26.274 [2024-07-21 11:30:55.471330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.274 [2024-07-21 11:30:55.471360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.274 [2024-07-21 11:30:55.471494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.274 [2024-07-21 11:30:55.471517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.274 #43 NEW cov: 11822 ft: 15448 corp: 35/2013b lim: 90 exec/s: 43 rss: 70Mb L: 43/89 MS: 1 InsertByte- 00:07:26.274 [2024-07-21 11:30:55.511051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.274 [2024-07-21 11:30:55.511077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.274 [2024-07-21 11:30:55.511203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.274 [2024-07-21 11:30:55.511224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.274 #44 NEW cov: 11822 ft: 15458 corp: 36/2056b lim: 90 exec/s: 44 rss: 70Mb L: 43/89 MS: 1 CopyPart- 00:07:26.274 [2024-07-21 11:30:55.551107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.274 [2024-07-21 11:30:55.551133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.274 [2024-07-21 11:30:55.551261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.274 [2024-07-21 11:30:55.551285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.274 #45 NEW cov: 11822 ft: 15463 corp: 37/2099b lim: 90 exec/s: 45 rss: 70Mb L: 43/89 MS: 1 InsertByte- 00:07:26.274 [2024-07-21 11:30:55.591220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.274 [2024-07-21 11:30:55.591245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.274 #46 NEW cov: 11822 ft: 15469 corp: 38/2126b lim: 90 exec/s: 46 rss: 70Mb L: 27/89 MS: 1 EraseBytes- 00:07:26.274 [2024-07-21 11:30:55.632207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.274 [2024-07-21 11:30:55.632242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.274 [2024-07-21 11:30:55.632319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.274 [2024-07-21 11:30:55.632340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.274 [2024-07-21 11:30:55.632458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:26.274 [2024-07-21 11:30:55.632480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.274 [2024-07-21 11:30:55.632604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:26.274 [2024-07-21 11:30:55.632625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.274 #47 NEW cov: 11822 ft: 15522 corp: 39/2215b lim: 90 exec/s: 47 rss: 70Mb L: 89/89 MS: 1 ChangeBit- 00:07:26.274 [2024-07-21 11:30:55.671858] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.274 [2024-07-21 11:30:55.671892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.274 [2024-07-21 11:30:55.672003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.274 [2024-07-21 11:30:55.672027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.274 #48 NEW cov: 11822 ft: 15529 corp: 40/2258b lim: 90 exec/s: 48 rss: 70Mb L: 43/89 MS: 1 ChangeByte- 00:07:26.534 [2024-07-21 11:30:55.712520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:26.534 [2024-07-21 11:30:55.712553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.534 [2024-07-21 11:30:55.712608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:26.534 [2024-07-21 11:30:55.712630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.534 [2024-07-21 11:30:55.712748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:26.534 [2024-07-21 11:30:55.712766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.534 [2024-07-21 11:30:55.712887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:26.534 [2024-07-21 11:30:55.712909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.534 #49 NEW cov: 11822 ft: 15536 corp: 41/2338b lim: 90 exec/s: 24 rss: 70Mb L: 80/89 MS: 1 InsertByte- 00:07:26.534 #49 DONE cov: 11822 ft: 15536 corp: 41/2338b lim: 90 exec/s: 24 rss: 70Mb 00:07:26.534 ###### Recommended dictionary. ###### 00:07:26.534 "\000\000\000\000" # Uses: 2 00:07:26.534 ###### End of recommended dictionary. 
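Run 20 is complete; the harness next removes /tmp/fuzz_json_20.conf and runs start_llvm_fuzz 21, repeating the per-run setup visible earlier in the trace with the TCP listener port bumped from 4420 to 4421. A hedged reconstruction of that setup from the xtrace lines (here $rootdir stands for the spdk checkout path shown in the log; the "44" port prefix and the redirection of sed's output into the per-run config are inferred from the trace, not shown verbatim):
fuzzer_type=21
timen=1        # -t: run time in seconds
core=0x1       # -m: core mask
corpus_dir=$rootdir/../corpus/llvm_nvmf_${fuzzer_type}
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
# port 4420 for fuzzer type 20, 4421 for type 21, and so on
port=44$(printf %02d $fuzzer_type)
mkdir -p "$corpus_dir"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
# rewrite the template config so the target listens on this run's port
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"
$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
    -P $rootdir/../output/llvm/ -F "$trid" -c "$nvmf_cfg" -t $timen \
    -D "$corpus_dir" -Z $fuzzer_type -r /var/tmp/spdk${fuzzer_type}.sock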
###### 00:07:26.534 Done 49 runs in 2 second(s) 00:07:26.534 11:30:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:07:26.534 11:30:55 -- ../common.sh@72 -- # (( i++ )) 00:07:26.534 11:30:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:26.534 11:30:55 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:07:26.534 11:30:55 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:07:26.534 11:30:55 -- nvmf/run.sh@24 -- # local timen=1 00:07:26.534 11:30:55 -- nvmf/run.sh@25 -- # local core=0x1 00:07:26.534 11:30:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:26.534 11:30:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:07:26.534 11:30:55 -- nvmf/run.sh@29 -- # printf %02d 21 00:07:26.534 11:30:55 -- nvmf/run.sh@29 -- # port=4421 00:07:26.534 11:30:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:26.534 11:30:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:07:26.534 11:30:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:26.534 11:30:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:07:26.534 [2024-07-21 11:30:55.897302] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:26.534 [2024-07-21 11:30:55.897368] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2068641 ] 00:07:26.534 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.793 [2024-07-21 11:30:56.070192] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.793 [2024-07-21 11:30:56.089941] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:26.793 [2024-07-21 11:30:56.090064] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.793 [2024-07-21 11:30:56.141558] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:26.793 [2024-07-21 11:30:56.157876] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:07:26.793 INFO: Running with entropic power schedule (0xFF, 100). 00:07:26.793 INFO: Seed: 2362220672 00:07:26.793 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:26.793 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:26.793 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:26.793 INFO: A corpus is not provided, starting from an empty corpus 00:07:26.793 #2 INITED exec/s: 0 rss: 60Mb 00:07:26.793 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:26.793 This may also happen if the target rejected all inputs we tried so far 00:07:26.793 [2024-07-21 11:30:56.206861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:26.793 [2024-07-21 11:30:56.206892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.793 [2024-07-21 11:30:56.206947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:26.793 [2024-07-21 11:30:56.206979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.312 NEW_FUNC[1/672]: 0x4c3a20 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:07:27.312 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:27.312 #3 NEW cov: 11569 ft: 11570 corp: 2/26b lim: 50 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:27.312 [2024-07-21 11:30:56.517685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.312 [2024-07-21 11:30:56.517718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.312 [2024-07-21 11:30:56.517778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.312 [2024-07-21 11:30:56.517795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.312 #6 NEW cov: 11683 ft: 12114 corp: 3/53b lim: 50 exec/s: 0 rss: 68Mb L: 27/27 MS: 3 CMP-ShuffleBytes-InsertRepeatedBytes- DE: "\001\000\000\000"- 00:07:27.312 [2024-07-21 11:30:56.557837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.312 [2024-07-21 11:30:56.557865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.312 [2024-07-21 11:30:56.557899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.312 [2024-07-21 11:30:56.557914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.312 [2024-07-21 11:30:56.557970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:27.312 [2024-07-21 11:30:56.557987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.312 #7 NEW cov: 11689 ft: 12637 corp: 4/84b lim: 50 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:27.312 [2024-07-21 11:30:56.597788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.312 [2024-07-21 11:30:56.597817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.312 [2024-07-21 11:30:56.597864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 
00:07:27.312 [2024-07-21 11:30:56.597880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.312 #8 NEW cov: 11774 ft: 12940 corp: 5/109b lim: 50 exec/s: 0 rss: 68Mb L: 25/31 MS: 1 ShuffleBytes- 00:07:27.312 [2024-07-21 11:30:56.637910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.312 [2024-07-21 11:30:56.637940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.312 [2024-07-21 11:30:56.637986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.312 [2024-07-21 11:30:56.638001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.312 #9 NEW cov: 11774 ft: 12983 corp: 6/134b lim: 50 exec/s: 0 rss: 68Mb L: 25/31 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:27.312 [2024-07-21 11:30:56.678074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.312 [2024-07-21 11:30:56.678104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.312 [2024-07-21 11:30:56.678153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.312 [2024-07-21 11:30:56.678169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.312 #10 NEW cov: 11774 ft: 13068 corp: 7/158b lim: 50 exec/s: 0 rss: 68Mb L: 24/31 MS: 1 EraseBytes- 00:07:27.312 [2024-07-21 11:30:56.718289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.312 [2024-07-21 11:30:56.718319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.312 [2024-07-21 11:30:56.718348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.312 [2024-07-21 11:30:56.718364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.312 [2024-07-21 11:30:56.718423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:27.312 [2024-07-21 11:30:56.718436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.571 #11 NEW cov: 11774 ft: 13118 corp: 8/189b lim: 50 exec/s: 0 rss: 69Mb L: 31/31 MS: 1 ShuffleBytes- 00:07:27.571 [2024-07-21 11:30:56.758290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.571 [2024-07-21 11:30:56.758319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.758365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.571 [2024-07-21 11:30:56.758380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.571 #12 NEW 
cov: 11774 ft: 13206 corp: 9/215b lim: 50 exec/s: 0 rss: 69Mb L: 26/31 MS: 1 InsertByte- 00:07:27.571 [2024-07-21 11:30:56.798421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.571 [2024-07-21 11:30:56.798454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.798487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.571 [2024-07-21 11:30:56.798503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.571 #13 NEW cov: 11774 ft: 13251 corp: 10/240b lim: 50 exec/s: 0 rss: 69Mb L: 25/31 MS: 1 ShuffleBytes- 00:07:27.571 [2024-07-21 11:30:56.838691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.571 [2024-07-21 11:30:56.838721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.838781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.571 [2024-07-21 11:30:56.838799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.838855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:27.571 [2024-07-21 11:30:56.838871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.571 #14 NEW cov: 11774 ft: 13308 corp: 11/270b lim: 50 exec/s: 0 rss: 69Mb L: 30/31 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:27.571 [2024-07-21 11:30:56.878874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.571 [2024-07-21 11:30:56.878904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.878933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.571 [2024-07-21 11:30:56.878948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.879005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:27.571 [2024-07-21 11:30:56.879022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.571 #15 NEW cov: 11774 ft: 13352 corp: 12/301b lim: 50 exec/s: 0 rss: 69Mb L: 31/31 MS: 1 ChangeBit- 00:07:27.571 [2024-07-21 11:30:56.918766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.571 [2024-07-21 11:30:56.918794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.918839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.571 [2024-07-21 11:30:56.918853] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.571 #16 NEW cov: 11774 ft: 13433 corp: 13/328b lim: 50 exec/s: 0 rss: 69Mb L: 27/31 MS: 1 ChangeBit- 00:07:27.571 [2024-07-21 11:30:56.949048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.571 [2024-07-21 11:30:56.949077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.949110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.571 [2024-07-21 11:30:56.949125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.949181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:27.571 [2024-07-21 11:30:56.949197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.571 #17 NEW cov: 11774 ft: 13445 corp: 14/359b lim: 50 exec/s: 0 rss: 69Mb L: 31/31 MS: 1 ChangeBit- 00:07:27.571 [2024-07-21 11:30:56.989260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.571 [2024-07-21 11:30:56.989289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.989326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.571 [2024-07-21 11:30:56.989341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.989399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:27.571 [2024-07-21 11:30:56.989420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.571 [2024-07-21 11:30:56.989476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:27.571 [2024-07-21 11:30:56.989492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.829 #18 NEW cov: 11774 ft: 13798 corp: 15/402b lim: 50 exec/s: 0 rss: 69Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:07:27.829 [2024-07-21 11:30:57.029081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.829 [2024-07-21 11:30:57.029109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.029153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.829 [2024-07-21 11:30:57.029169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.829 #19 NEW cov: 11774 ft: 13868 corp: 16/427b lim: 50 exec/s: 0 rss: 69Mb L: 25/43 MS: 1 ShuffleBytes- 00:07:27.829 [2024-07-21 11:30:57.069183] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.829 [2024-07-21 11:30:57.069212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.069258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.829 [2024-07-21 11:30:57.069272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.829 #20 NEW cov: 11774 ft: 13979 corp: 17/454b lim: 50 exec/s: 0 rss: 69Mb L: 27/43 MS: 1 ChangeByte- 00:07:27.829 [2024-07-21 11:30:57.109455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.829 [2024-07-21 11:30:57.109480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.109500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.829 [2024-07-21 11:30:57.109511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.109528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:27.829 [2024-07-21 11:30:57.109539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.829 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:27.829 #21 NEW cov: 11807 ft: 14120 corp: 18/485b lim: 50 exec/s: 0 rss: 70Mb L: 31/43 MS: 1 ChangeBinInt- 00:07:27.829 [2024-07-21 11:30:57.149425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.829 [2024-07-21 11:30:57.149462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.149513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.829 [2024-07-21 11:30:57.149528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.829 #22 NEW cov: 11807 ft: 14126 corp: 19/510b lim: 50 exec/s: 0 rss: 70Mb L: 25/43 MS: 1 CrossOver- 00:07:27.829 [2024-07-21 11:30:57.189868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.829 [2024-07-21 11:30:57.189897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.189933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.829 [2024-07-21 11:30:57.189948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.190005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:27.829 [2024-07-21 11:30:57.190022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.190079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:27.829 [2024-07-21 11:30:57.190094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.829 #23 NEW cov: 11807 ft: 14141 corp: 20/553b lim: 50 exec/s: 23 rss: 70Mb L: 43/43 MS: 1 ChangeBinInt- 00:07:27.829 [2024-07-21 11:30:57.229652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:27.829 [2024-07-21 11:30:57.229680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.829 [2024-07-21 11:30:57.229725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:27.829 [2024-07-21 11:30:57.229740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.829 #24 NEW cov: 11807 ft: 14191 corp: 21/574b lim: 50 exec/s: 24 rss: 70Mb L: 21/43 MS: 1 CrossOver- 00:07:28.096 [2024-07-21 11:30:57.269715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.096 [2024-07-21 11:30:57.269743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.096 [2024-07-21 11:30:57.269780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.096 [2024-07-21 11:30:57.269796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.096 #25 NEW cov: 11807 ft: 14246 corp: 22/598b lim: 50 exec/s: 25 rss: 70Mb L: 24/43 MS: 1 CopyPart- 00:07:28.096 [2024-07-21 11:30:57.309885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.096 [2024-07-21 11:30:57.309913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.096 [2024-07-21 11:30:57.309954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.096 [2024-07-21 11:30:57.309969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.096 #26 NEW cov: 11807 ft: 14260 corp: 23/619b lim: 50 exec/s: 26 rss: 70Mb L: 21/43 MS: 1 CopyPart- 00:07:28.096 [2024-07-21 11:30:57.350003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.096 [2024-07-21 11:30:57.350032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.096 [2024-07-21 11:30:57.350087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.096 [2024-07-21 11:30:57.350103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.096 #27 NEW cov: 11807 ft: 14284 corp: 24/648b lim: 50 exec/s: 27 rss: 70Mb L: 29/43 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:28.096 [2024-07-21 11:30:57.390268] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.096 [2024-07-21 11:30:57.390296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.096 [2024-07-21 11:30:57.390339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.096 [2024-07-21 11:30:57.390359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.096 [2024-07-21 11:30:57.390416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.096 [2024-07-21 11:30:57.390432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.096 #28 NEW cov: 11807 ft: 14364 corp: 25/679b lim: 50 exec/s: 28 rss: 70Mb L: 31/43 MS: 1 InsertRepeatedBytes- 00:07:28.096 [2024-07-21 11:30:57.430251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.096 [2024-07-21 11:30:57.430278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.096 [2024-07-21 11:30:57.430323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.096 [2024-07-21 11:30:57.430339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.096 #29 NEW cov: 11807 ft: 14369 corp: 26/706b lim: 50 exec/s: 29 rss: 70Mb L: 27/43 MS: 1 ShuffleBytes- 00:07:28.096 [2024-07-21 11:30:57.470372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.096 [2024-07-21 11:30:57.470401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.096 [2024-07-21 11:30:57.470449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.096 [2024-07-21 11:30:57.470466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.096 #30 NEW cov: 11807 ft: 14441 corp: 27/732b lim: 50 exec/s: 30 rss: 70Mb L: 26/43 MS: 1 ChangeBit- 00:07:28.096 [2024-07-21 11:30:57.510513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.097 [2024-07-21 11:30:57.510543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.097 [2024-07-21 11:30:57.510596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.097 [2024-07-21 11:30:57.510613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.428 #31 NEW cov: 11807 ft: 14448 corp: 28/759b lim: 50 exec/s: 31 rss: 70Mb L: 27/43 MS: 1 ChangeByte- 00:07:28.428 [2024-07-21 11:30:57.550602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.428 [2024-07-21 11:30:57.550630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.550686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.428 [2024-07-21 11:30:57.550702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.428 #32 NEW cov: 11807 ft: 14452 corp: 29/786b lim: 50 exec/s: 32 rss: 70Mb L: 27/43 MS: 1 ChangeBit- 00:07:28.428 [2024-07-21 11:30:57.591002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.428 [2024-07-21 11:30:57.591030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.591075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.428 [2024-07-21 11:30:57.591090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.591147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.428 [2024-07-21 11:30:57.591167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.591222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:28.428 [2024-07-21 11:30:57.591238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.428 #33 NEW cov: 11807 ft: 14467 corp: 30/826b lim: 50 exec/s: 33 rss: 70Mb L: 40/43 MS: 1 CrossOver- 00:07:28.428 [2024-07-21 11:30:57.630661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.428 [2024-07-21 11:30:57.630689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.428 #34 NEW cov: 11807 ft: 15229 corp: 31/836b lim: 50 exec/s: 34 rss: 70Mb L: 10/43 MS: 1 CrossOver- 00:07:28.428 [2024-07-21 11:30:57.681284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.428 [2024-07-21 11:30:57.681313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.681353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.428 [2024-07-21 11:30:57.681369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.681426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.428 [2024-07-21 11:30:57.681446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.681503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:28.428 [2024-07-21 11:30:57.681519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.428 #35 NEW cov: 11807 ft: 15237 corp: 32/884b lim: 50 exec/s: 35 rss: 70Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:07:28.428 [2024-07-21 11:30:57.721110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.428 [2024-07-21 11:30:57.721138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.721181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.428 [2024-07-21 11:30:57.721196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.428 #36 NEW cov: 11807 ft: 15251 corp: 33/909b lim: 50 exec/s: 36 rss: 70Mb L: 25/48 MS: 1 CopyPart- 00:07:28.428 [2024-07-21 11:30:57.761180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.428 [2024-07-21 11:30:57.761208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.761239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.428 [2024-07-21 11:30:57.761254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.428 #37 NEW cov: 11807 ft: 15257 corp: 34/934b lim: 50 exec/s: 37 rss: 70Mb L: 25/48 MS: 1 CrossOver- 00:07:28.428 [2024-07-21 11:30:57.791301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.428 [2024-07-21 11:30:57.791329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.791380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.428 [2024-07-21 11:30:57.791400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.428 #38 NEW cov: 11807 ft: 15269 corp: 35/961b lim: 50 exec/s: 38 rss: 70Mb L: 27/48 MS: 1 ChangeByte- 00:07:28.428 [2024-07-21 11:30:57.831599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.428 [2024-07-21 11:30:57.831629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.831670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.428 [2024-07-21 11:30:57.831686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.428 [2024-07-21 11:30:57.831748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.428 [2024-07-21 11:30:57.831765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.686 #39 NEW cov: 11807 ft: 15315 corp: 36/992b lim: 50 exec/s: 39 rss: 70Mb L: 31/48 MS: 1 InsertByte- 00:07:28.686 [2024-07-21 
11:30:57.871515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.686 [2024-07-21 11:30:57.871542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.686 [2024-07-21 11:30:57.871577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.686 [2024-07-21 11:30:57.871593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.686 #40 NEW cov: 11807 ft: 15328 corp: 37/1018b lim: 50 exec/s: 40 rss: 70Mb L: 26/48 MS: 1 CopyPart- 00:07:28.686 [2024-07-21 11:30:57.911806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.686 [2024-07-21 11:30:57.911834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.686 [2024-07-21 11:30:57.911869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.686 [2024-07-21 11:30:57.911884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.686 [2024-07-21 11:30:57.911941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.686 [2024-07-21 11:30:57.911957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.686 #41 NEW cov: 11807 ft: 15334 corp: 38/1049b lim: 50 exec/s: 41 rss: 70Mb L: 31/48 MS: 1 ChangeBit- 00:07:28.686 [2024-07-21 11:30:57.951785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.686 [2024-07-21 11:30:57.951812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.686 [2024-07-21 11:30:57.951863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.686 [2024-07-21 11:30:57.951880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.686 #42 NEW cov: 11807 ft: 15343 corp: 39/1070b lim: 50 exec/s: 42 rss: 70Mb L: 21/48 MS: 1 ChangeByte- 00:07:28.686 [2024-07-21 11:30:57.992041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.686 [2024-07-21 11:30:57.992068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.687 [2024-07-21 11:30:57.992104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.687 [2024-07-21 11:30:57.992123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.687 [2024-07-21 11:30:57.992180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.687 [2024-07-21 11:30:57.992195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.687 #43 NEW cov: 11807 
ft: 15350 corp: 40/1103b lim: 50 exec/s: 43 rss: 70Mb L: 33/48 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:28.687 [2024-07-21 11:30:58.032316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.687 [2024-07-21 11:30:58.032344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.687 [2024-07-21 11:30:58.032381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.687 [2024-07-21 11:30:58.032397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.687 [2024-07-21 11:30:58.032454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.687 [2024-07-21 11:30:58.032471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.687 [2024-07-21 11:30:58.032530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:28.687 [2024-07-21 11:30:58.032545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.687 #44 NEW cov: 11807 ft: 15356 corp: 41/1147b lim: 50 exec/s: 44 rss: 70Mb L: 44/48 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:28.687 [2024-07-21 11:30:58.072178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.687 [2024-07-21 11:30:58.072205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.687 [2024-07-21 11:30:58.072250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.687 [2024-07-21 11:30:58.072266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.687 #45 NEW cov: 11807 ft: 15366 corp: 42/1172b lim: 50 exec/s: 45 rss: 70Mb L: 25/48 MS: 1 EraseBytes- 00:07:28.945 [2024-07-21 11:30:58.112664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.945 [2024-07-21 11:30:58.112691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.945 [2024-07-21 11:30:58.112730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.945 [2024-07-21 11:30:58.112746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.945 [2024-07-21 11:30:58.112805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.945 [2024-07-21 11:30:58.112821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.945 [2024-07-21 11:30:58.112879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:28.945 [2024-07-21 11:30:58.112895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:07:28.945 #46 NEW cov: 11807 ft: 15381 corp: 43/1214b lim: 50 exec/s: 46 rss: 70Mb L: 42/48 MS: 1 CrossOver- 00:07:28.945 [2024-07-21 11:30:58.152360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.945 [2024-07-21 11:30:58.152389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.946 [2024-07-21 11:30:58.152454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.946 [2024-07-21 11:30:58.152470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.946 #47 NEW cov: 11807 ft: 15399 corp: 44/1241b lim: 50 exec/s: 47 rss: 70Mb L: 27/48 MS: 1 ChangeByte- 00:07:28.946 [2024-07-21 11:30:58.192830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:28.946 [2024-07-21 11:30:58.192860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.946 [2024-07-21 11:30:58.192898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:28.946 [2024-07-21 11:30:58.192913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.946 [2024-07-21 11:30:58.192971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:28.946 [2024-07-21 11:30:58.192988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.946 [2024-07-21 11:30:58.193046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:28.946 [2024-07-21 11:30:58.193062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.946 #48 NEW cov: 11807 ft: 15414 corp: 45/1284b lim: 50 exec/s: 24 rss: 70Mb L: 43/48 MS: 1 ChangeBit- 00:07:28.946 #48 DONE cov: 11807 ft: 15414 corp: 45/1284b lim: 50 exec/s: 24 rss: 70Mb 00:07:28.946 ###### Recommended dictionary. ###### 00:07:28.946 "\001\000\000\000" # Uses: 5 00:07:28.946 "\001\000\000\000\000\000\000\000" # Uses: 0 00:07:28.946 ###### End of recommended dictionary. 
###### 00:07:28.946 Done 48 runs in 2 second(s) 00:07:28.946 11:30:58 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:07:28.946 11:30:58 -- ../common.sh@72 -- # (( i++ )) 00:07:28.946 11:30:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:28.946 11:30:58 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:07:28.946 11:30:58 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:07:28.946 11:30:58 -- nvmf/run.sh@24 -- # local timen=1 00:07:28.946 11:30:58 -- nvmf/run.sh@25 -- # local core=0x1 00:07:28.946 11:30:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:28.946 11:30:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:07:28.946 11:30:58 -- nvmf/run.sh@29 -- # printf %02d 22 00:07:28.946 11:30:58 -- nvmf/run.sh@29 -- # port=4422 00:07:28.946 11:30:58 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:28.946 11:30:58 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:07:28.946 11:30:58 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:28.946 11:30:58 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:07:29.204 [2024-07-21 11:30:58.376879] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:29.204 [2024-07-21 11:30:58.376951] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2069026 ] 00:07:29.204 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.204 [2024-07-21 11:30:58.554447] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.204 [2024-07-21 11:30:58.574429] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:29.204 [2024-07-21 11:30:58.574559] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.204 [2024-07-21 11:30:58.626240] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:29.463 [2024-07-21 11:30:58.642591] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:07:29.463 INFO: Running with entropic power schedule (0xFF, 100). 00:07:29.463 INFO: Seed: 552244102 00:07:29.463 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:29.463 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:29.463 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:29.463 INFO: A corpus is not provided, starting from an empty corpus 00:07:29.463 #2 INITED exec/s: 0 rss: 60Mb 00:07:29.463 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:29.463 This may also happen if the target rejected all inputs we tried so far 00:07:29.463 [2024-07-21 11:30:58.719083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.463 [2024-07-21 11:30:58.719127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.463 [2024-07-21 11:30:58.719240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:29.463 [2024-07-21 11:30:58.719264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.463 [2024-07-21 11:30:58.719378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:29.463 [2024-07-21 11:30:58.719398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.463 [2024-07-21 11:30:58.719517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:29.463 [2024-07-21 11:30:58.719531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:29.720 NEW_FUNC[1/672]: 0x4c5ce0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:07:29.720 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:29.720 #4 NEW cov: 11596 ft: 11590 corp: 2/75b lim: 85 exec/s: 0 rss: 68Mb L: 74/74 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:29.720 [2024-07-21 11:30:59.049513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.720 [2024-07-21 11:30:59.049565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.720 [2024-07-21 11:30:59.049696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:29.720 [2024-07-21 11:30:59.049723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.720 #5 NEW cov: 11709 ft: 12709 corp: 3/125b lim: 85 exec/s: 0 rss: 68Mb L: 50/74 MS: 1 InsertRepeatedBytes- 00:07:29.720 [2024-07-21 11:30:59.089208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.720 [2024-07-21 11:30:59.089233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.720 #6 NEW cov: 11715 ft: 13751 corp: 4/148b lim: 85 exec/s: 0 rss: 68Mb L: 23/74 MS: 1 InsertRepeatedBytes- 00:07:29.720 [2024-07-21 11:30:59.140153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.720 [2024-07-21 11:30:59.140183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.720 [2024-07-21 11:30:59.140259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:29.720 [2024-07-21 11:30:59.140281] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.720 [2024-07-21 11:30:59.140399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:29.720 [2024-07-21 11:30:59.140419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.720 [2024-07-21 11:30:59.140547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:29.720 [2024-07-21 11:30:59.140578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:29.978 #7 NEW cov: 11800 ft: 14047 corp: 5/232b lim: 85 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 CrossOver- 00:07:29.978 [2024-07-21 11:30:59.199588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.978 [2024-07-21 11:30:59.199613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.978 #11 NEW cov: 11800 ft: 14246 corp: 6/258b lim: 85 exec/s: 0 rss: 68Mb L: 26/84 MS: 4 ShuffleBytes-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:29.978 [2024-07-21 11:30:59.250313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.978 [2024-07-21 11:30:59.250340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.978 [2024-07-21 11:30:59.250404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:29.978 [2024-07-21 11:30:59.250429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.978 [2024-07-21 11:30:59.250549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:29.978 [2024-07-21 11:30:59.250568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.978 #12 NEW cov: 11800 ft: 14585 corp: 7/319b lim: 85 exec/s: 0 rss: 68Mb L: 61/84 MS: 1 InsertRepeatedBytes- 00:07:29.978 [2024-07-21 11:30:59.300124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.978 [2024-07-21 11:30:59.300154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.978 [2024-07-21 11:30:59.300276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:29.978 [2024-07-21 11:30:59.300299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.978 #18 NEW cov: 11800 ft: 14700 corp: 8/364b lim: 85 exec/s: 0 rss: 69Mb L: 45/84 MS: 1 EraseBytes- 00:07:29.978 [2024-07-21 11:30:59.340533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.978 [2024-07-21 11:30:59.340565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.978 [2024-07-21 
11:30:59.340677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:29.978 [2024-07-21 11:30:59.340701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.978 [2024-07-21 11:30:59.340827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:29.978 [2024-07-21 11:30:59.340850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.978 [2024-07-21 11:30:59.340971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:29.978 [2024-07-21 11:30:59.340989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:29.978 #19 NEW cov: 11800 ft: 14765 corp: 9/448b lim: 85 exec/s: 0 rss: 69Mb L: 84/84 MS: 1 CrossOver- 00:07:29.978 [2024-07-21 11:30:59.380128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:29.978 [2024-07-21 11:30:59.380161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.978 [2024-07-21 11:30:59.380273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:29.978 [2024-07-21 11:30:59.380297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.235 #20 NEW cov: 11800 ft: 14803 corp: 10/492b lim: 85 exec/s: 0 rss: 69Mb L: 44/84 MS: 1 EraseBytes- 00:07:30.235 [2024-07-21 11:30:59.420474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.235 [2024-07-21 11:30:59.420507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.235 [2024-07-21 11:30:59.420568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.235 [2024-07-21 11:30:59.420585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.235 [2024-07-21 11:30:59.420705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.236 [2024-07-21 11:30:59.420726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.420844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:30.236 [2024-07-21 11:30:59.420862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.236 #21 NEW cov: 11800 ft: 14858 corp: 11/576b lim: 85 exec/s: 0 rss: 69Mb L: 84/84 MS: 1 CrossOver- 00:07:30.236 [2024-07-21 11:30:59.471253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.236 [2024-07-21 11:30:59.471284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 
11:30:59.471411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.236 [2024-07-21 11:30:59.471434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.471567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.236 [2024-07-21 11:30:59.471587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.471717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:30.236 [2024-07-21 11:30:59.471737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.236 #22 NEW cov: 11800 ft: 14877 corp: 12/658b lim: 85 exec/s: 0 rss: 69Mb L: 82/84 MS: 1 CrossOver- 00:07:30.236 [2024-07-21 11:30:59.511096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.236 [2024-07-21 11:30:59.511130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.511257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.236 [2024-07-21 11:30:59.511280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.511408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.236 [2024-07-21 11:30:59.511432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.236 #23 NEW cov: 11800 ft: 14956 corp: 13/717b lim: 85 exec/s: 0 rss: 69Mb L: 59/84 MS: 1 CrossOver- 00:07:30.236 [2024-07-21 11:30:59.551394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.236 [2024-07-21 11:30:59.551424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.551536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.236 [2024-07-21 11:30:59.551566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.551685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.236 [2024-07-21 11:30:59.551702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.551832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:30.236 [2024-07-21 11:30:59.551852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.236 #24 NEW cov: 11800 ft: 14976 corp: 14/796b lim: 85 exec/s: 0 rss: 69Mb L: 79/84 MS: 1 InsertRepeatedBytes- 00:07:30.236 
[2024-07-21 11:30:59.601528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.236 [2024-07-21 11:30:59.601560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.601627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.236 [2024-07-21 11:30:59.601651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.601764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.236 [2024-07-21 11:30:59.601785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.236 [2024-07-21 11:30:59.601915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:30.236 [2024-07-21 11:30:59.601934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.236 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:30.236 #25 NEW cov: 11823 ft: 15035 corp: 15/880b lim: 85 exec/s: 0 rss: 69Mb L: 84/84 MS: 1 ChangeBinInt- 00:07:30.493 [2024-07-21 11:30:59.661212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.493 [2024-07-21 11:30:59.661247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.661358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.493 [2024-07-21 11:30:59.661384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.493 #26 NEW cov: 11823 ft: 15128 corp: 16/930b lim: 85 exec/s: 26 rss: 69Mb L: 50/84 MS: 1 ChangeBinInt- 00:07:30.493 [2024-07-21 11:30:59.711466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.493 [2024-07-21 11:30:59.711500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.711618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.493 [2024-07-21 11:30:59.711646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.493 #27 NEW cov: 11823 ft: 15145 corp: 17/975b lim: 85 exec/s: 27 rss: 69Mb L: 45/84 MS: 1 InsertByte- 00:07:30.493 [2024-07-21 11:30:59.772192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.493 [2024-07-21 11:30:59.772231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.772305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.493 [2024-07-21 11:30:59.772329] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.772458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.493 [2024-07-21 11:30:59.772484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.772615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:30.493 [2024-07-21 11:30:59.772635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.493 #28 NEW cov: 11823 ft: 15152 corp: 18/1054b lim: 85 exec/s: 28 rss: 70Mb L: 79/84 MS: 1 ChangeBinInt- 00:07:30.493 [2024-07-21 11:30:59.832103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.493 [2024-07-21 11:30:59.832138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.832271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.493 [2024-07-21 11:30:59.832297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.832425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.493 [2024-07-21 11:30:59.832450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.493 #29 NEW cov: 11823 ft: 15169 corp: 19/1105b lim: 85 exec/s: 29 rss: 70Mb L: 51/84 MS: 1 CrossOver- 00:07:30.493 [2024-07-21 11:30:59.892297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.493 [2024-07-21 11:30:59.892325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.892454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.493 [2024-07-21 11:30:59.892475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.493 [2024-07-21 11:30:59.892609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.493 [2024-07-21 11:30:59.892633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.751 #30 NEW cov: 11823 ft: 15205 corp: 20/1156b lim: 85 exec/s: 30 rss: 70Mb L: 51/84 MS: 1 ChangeBit- 00:07:30.751 [2024-07-21 11:30:59.952403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.751 [2024-07-21 11:30:59.952440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:30:59.952570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.751 [2024-07-21 11:30:59.952590] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:30:59.952716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.751 [2024-07-21 11:30:59.952736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.751 #31 NEW cov: 11823 ft: 15211 corp: 21/1217b lim: 85 exec/s: 31 rss: 70Mb L: 61/84 MS: 1 CopyPart- 00:07:30.751 [2024-07-21 11:31:00.002793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.751 [2024-07-21 11:31:00.002828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.002923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.751 [2024-07-21 11:31:00.002947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.003076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.751 [2024-07-21 11:31:00.003098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.003223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:30.751 [2024-07-21 11:31:00.003246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.751 #32 NEW cov: 11823 ft: 15219 corp: 22/1301b lim: 85 exec/s: 32 rss: 70Mb L: 84/84 MS: 1 ChangeBinInt- 00:07:30.751 [2024-07-21 11:31:00.053238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.751 [2024-07-21 11:31:00.053272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.053337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.751 [2024-07-21 11:31:00.053363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.053453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.751 [2024-07-21 11:31:00.053464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.053482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:30.751 [2024-07-21 11:31:00.053492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.053617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:30.751 [2024-07-21 11:31:00.053641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 
cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:30.751 #33 NEW cov: 11823 ft: 15382 corp: 23/1386b lim: 85 exec/s: 33 rss: 70Mb L: 85/85 MS: 1 InsertByte- 00:07:30.751 [2024-07-21 11:31:00.112649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.751 [2024-07-21 11:31:00.112681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.112794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.751 [2024-07-21 11:31:00.112815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.751 #34 NEW cov: 11823 ft: 15452 corp: 24/1422b lim: 85 exec/s: 34 rss: 70Mb L: 36/85 MS: 1 CrossOver- 00:07:30.751 [2024-07-21 11:31:00.163128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:30.751 [2024-07-21 11:31:00.163165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.163265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:30.751 [2024-07-21 11:31:00.163288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.751 [2024-07-21 11:31:00.163408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:30.751 [2024-07-21 11:31:00.163430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.009 #35 NEW cov: 11823 ft: 15519 corp: 25/1481b lim: 85 exec/s: 35 rss: 70Mb L: 59/85 MS: 1 ChangeBinInt- 00:07:31.009 [2024-07-21 11:31:00.222751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.009 [2024-07-21 11:31:00.222783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.009 #36 NEW cov: 11823 ft: 15534 corp: 26/1511b lim: 85 exec/s: 36 rss: 70Mb L: 30/85 MS: 1 EraseBytes- 00:07:31.009 [2024-07-21 11:31:00.273390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.009 [2024-07-21 11:31:00.273421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.273548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.009 [2024-07-21 11:31:00.273571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.273690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.009 [2024-07-21 11:31:00.273713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.009 #37 NEW cov: 11823 ft: 15608 corp: 27/1571b lim: 85 exec/s: 37 rss: 70Mb L: 60/85 MS: 1 InsertByte- 00:07:31.009 [2024-07-21 11:31:00.323860] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.009 [2024-07-21 11:31:00.323893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.323978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.009 [2024-07-21 11:31:00.324001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.324131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.009 [2024-07-21 11:31:00.324151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.324275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:31.009 [2024-07-21 11:31:00.324299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.009 #38 NEW cov: 11823 ft: 15616 corp: 28/1645b lim: 85 exec/s: 38 rss: 70Mb L: 74/85 MS: 1 CrossOver- 00:07:31.009 [2024-07-21 11:31:00.373749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.009 [2024-07-21 11:31:00.373780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.373875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.009 [2024-07-21 11:31:00.373895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.374021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.009 [2024-07-21 11:31:00.374045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.009 #39 NEW cov: 11823 ft: 15626 corp: 29/1707b lim: 85 exec/s: 39 rss: 70Mb L: 62/85 MS: 1 InsertByte- 00:07:31.009 [2024-07-21 11:31:00.413645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.009 [2024-07-21 11:31:00.413674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.413741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.009 [2024-07-21 11:31:00.413764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.413878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.009 [2024-07-21 11:31:00.413901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.009 [2024-07-21 11:31:00.414018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:31.009 
[2024-07-21 11:31:00.414036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.009 #40 NEW cov: 11823 ft: 15658 corp: 30/1791b lim: 85 exec/s: 40 rss: 70Mb L: 84/85 MS: 1 CopyPart- 00:07:31.267 [2024-07-21 11:31:00.454234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.267 [2024-07-21 11:31:00.454267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.454351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.267 [2024-07-21 11:31:00.454372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.454492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.267 [2024-07-21 11:31:00.454514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.454630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:31.267 [2024-07-21 11:31:00.454649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.267 #41 NEW cov: 11823 ft: 15664 corp: 31/1875b lim: 85 exec/s: 41 rss: 70Mb L: 84/85 MS: 1 ShuffleBytes- 00:07:31.267 [2024-07-21 11:31:00.504366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.267 [2024-07-21 11:31:00.504398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.504471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.267 [2024-07-21 11:31:00.504491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.504621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.267 [2024-07-21 11:31:00.504642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.504767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:31.267 [2024-07-21 11:31:00.504790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.267 #42 NEW cov: 11823 ft: 15694 corp: 32/1955b lim: 85 exec/s: 42 rss: 70Mb L: 80/85 MS: 1 InsertByte- 00:07:31.267 [2024-07-21 11:31:00.544076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.267 [2024-07-21 11:31:00.544109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.544173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 
00:07:31.267 [2024-07-21 11:31:00.544192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.544323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.267 [2024-07-21 11:31:00.544344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.544464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:31.267 [2024-07-21 11:31:00.544488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.267 #43 NEW cov: 11823 ft: 15709 corp: 33/2037b lim: 85 exec/s: 43 rss: 70Mb L: 82/85 MS: 1 CMP- DE: "\360W0\255\206\217/\000"- 00:07:31.267 [2024-07-21 11:31:00.584063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.267 [2024-07-21 11:31:00.584095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.584211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.267 [2024-07-21 11:31:00.584231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.267 #44 NEW cov: 11823 ft: 15710 corp: 34/2071b lim: 85 exec/s: 44 rss: 70Mb L: 34/85 MS: 1 CrossOver- 00:07:31.267 [2024-07-21 11:31:00.634479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.267 [2024-07-21 11:31:00.634515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.634637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.267 [2024-07-21 11:31:00.634659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.634805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.267 [2024-07-21 11:31:00.634826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.267 #45 NEW cov: 11823 ft: 15723 corp: 35/2127b lim: 85 exec/s: 45 rss: 70Mb L: 56/85 MS: 1 CrossOver- 00:07:31.267 [2024-07-21 11:31:00.684747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:31.267 [2024-07-21 11:31:00.684779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.684894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:31.267 [2024-07-21 11:31:00.684917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.267 [2024-07-21 11:31:00.685052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:31.267 [2024-07-21 11:31:00.685075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.526 #46 NEW cov: 11823 ft: 15772 corp: 36/2188b lim: 85 exec/s: 23 rss: 70Mb L: 61/85 MS: 1 ChangeByte- 00:07:31.526 #46 DONE cov: 11823 ft: 15772 corp: 36/2188b lim: 85 exec/s: 23 rss: 70Mb 00:07:31.526 ###### Recommended dictionary. ###### 00:07:31.526 "\360W0\255\206\217/\000" # Uses: 0 00:07:31.526 ###### End of recommended dictionary. ###### 00:07:31.526 Done 46 runs in 2 second(s) 00:07:31.526 11:31:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:07:31.526 11:31:00 -- ../common.sh@72 -- # (( i++ )) 00:07:31.526 11:31:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:31.526 11:31:00 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:07:31.526 11:31:00 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:07:31.526 11:31:00 -- nvmf/run.sh@24 -- # local timen=1 00:07:31.526 11:31:00 -- nvmf/run.sh@25 -- # local core=0x1 00:07:31.526 11:31:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:31.526 11:31:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:07:31.526 11:31:00 -- nvmf/run.sh@29 -- # printf %02d 23 00:07:31.526 11:31:00 -- nvmf/run.sh@29 -- # port=4423 00:07:31.526 11:31:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:31.526 11:31:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:07:31.526 11:31:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:31.526 11:31:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:07:31.526 [2024-07-21 11:31:00.873045] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:31.526 [2024-07-21 11:31:00.873141] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2069469 ] 00:07:31.526 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.785 [2024-07-21 11:31:01.049279] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.785 [2024-07-21 11:31:01.068718] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:31.785 [2024-07-21 11:31:01.068841] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.785 [2024-07-21 11:31:01.120430] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:31.785 [2024-07-21 11:31:01.136737] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:07:31.785 INFO: Running with entropic power schedule (0xFF, 100). 
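[Editor's note] The "###### Recommended dictionary ######" block printed in the run-22 summary above is standard libFuzzer output: a comparison operand (the 8-byte value "\360W0\255\206\217/\000", which first appears in the CMP mutation of entry #43) that helped reach new coverage and is suggested for reuse as a dictionary entry. Below is a minimal sketch of feeding such an entry back into a later run. The dictionary path is illustrative, the octal escapes from the log are re-written as the \xNN hex escapes the AFL/libFuzzer dictionary format accepts, and whether the llvm_nvme_fuzz wrapper forwards unrecognized options through to libFuzzer is an assumption not confirmed by this log.

# save the suggested entry in libFuzzer dictionary syntax
# (one quoted token per line; '#' lines are comments)
dict=/tmp/llvm_nvmf_22.dict
cat > "$dict" <<'EOF'
# "\360W0\255\206\217/\000" from the run-22 DONE summary, re-escaped as hex
"\xF0W0\xAD\x86\x8F/\x00"
EOF
# a plain libFuzzer target would then consume it via the standard flag:
#   ./fuzz_target -dict="$dict" <corpus_dir>
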
00:07:31.785 INFO: Seed: 3047225042 00:07:31.785 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:31.785 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:31.785 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:31.785 INFO: A corpus is not provided, starting from an empty corpus 00:07:31.785 #2 INITED exec/s: 0 rss: 60Mb 00:07:31.785 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:31.785 This may also happen if the target rejected all inputs we tried so far 00:07:31.785 [2024-07-21 11:31:01.181856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:31.785 [2024-07-21 11:31:01.181887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.785 [2024-07-21 11:31:01.181942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:31.785 [2024-07-21 11:31:01.181958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.302 NEW_FUNC[1/666]: 0x4c8f10 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:07:32.302 NEW_FUNC[2/666]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:32.302 #24 NEW cov: 11485 ft: 11486 corp: 2/11b lim: 25 exec/s: 0 rss: 68Mb L: 10/10 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:32.302 [2024-07-21 11:31:01.492715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.302 [2024-07-21 11:31:01.492751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.302 [2024-07-21 11:31:01.492809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.302 [2024-07-21 11:31:01.492824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.302 NEW_FUNC[1/5]: 0x16decf0 in spdk_nvme_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:757 00:07:32.302 NEW_FUNC[2/5]: 0x17402a0 in nvme_transport_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:606 00:07:32.302 #33 NEW cov: 11642 ft: 12008 corp: 3/22b lim: 25 exec/s: 0 rss: 68Mb L: 11/11 MS: 4 CopyPart-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:07:32.302 [2024-07-21 11:31:01.532734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.302 [2024-07-21 11:31:01.532765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.302 [2024-07-21 11:31:01.532822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.302 [2024-07-21 11:31:01.532839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.302 #34 NEW cov: 11648 ft: 12317 corp: 4/33b lim: 25 
exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeBit- 00:07:32.302 [2024-07-21 11:31:01.572727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.302 [2024-07-21 11:31:01.572755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.302 #35 NEW cov: 11733 ft: 12975 corp: 5/42b lim: 25 exec/s: 0 rss: 68Mb L: 9/11 MS: 1 EraseBytes- 00:07:32.302 [2024-07-21 11:31:01.613003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.302 [2024-07-21 11:31:01.613031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.302 [2024-07-21 11:31:01.613085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.302 [2024-07-21 11:31:01.613100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.302 #36 NEW cov: 11733 ft: 13047 corp: 6/53b lim: 25 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeByte- 00:07:32.302 [2024-07-21 11:31:01.643074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.302 [2024-07-21 11:31:01.643101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.302 [2024-07-21 11:31:01.643153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.302 [2024-07-21 11:31:01.643169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.302 #37 NEW cov: 11733 ft: 13149 corp: 7/64b lim: 25 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 InsertByte- 00:07:32.302 [2024-07-21 11:31:01.683207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.302 [2024-07-21 11:31:01.683236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.302 [2024-07-21 11:31:01.683288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.302 [2024-07-21 11:31:01.683304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.302 #38 NEW cov: 11733 ft: 13237 corp: 8/74b lim: 25 exec/s: 0 rss: 69Mb L: 10/11 MS: 1 ShuffleBytes- 00:07:32.302 [2024-07-21 11:31:01.723353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.302 [2024-07-21 11:31:01.723381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.302 [2024-07-21 11:31:01.723423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.302 [2024-07-21 11:31:01.723439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.560 #39 NEW cov: 11733 ft: 13306 corp: 9/86b lim: 25 exec/s: 0 rss: 69Mb L: 12/12 MS: 1 InsertByte- 00:07:32.560 [2024-07-21 11:31:01.763423] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.560 [2024-07-21 11:31:01.763456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.560 [2024-07-21 11:31:01.763488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.560 [2024-07-21 11:31:01.763503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.560 #40 NEW cov: 11733 ft: 13391 corp: 10/96b lim: 25 exec/s: 0 rss: 69Mb L: 10/12 MS: 1 ChangeBit- 00:07:32.560 [2024-07-21 11:31:01.803593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.560 [2024-07-21 11:31:01.803621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.560 [2024-07-21 11:31:01.803668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.560 [2024-07-21 11:31:01.803684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.560 #41 NEW cov: 11733 ft: 13477 corp: 11/107b lim: 25 exec/s: 0 rss: 69Mb L: 11/12 MS: 1 ShuffleBytes- 00:07:32.560 [2024-07-21 11:31:01.843651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.560 [2024-07-21 11:31:01.843679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.560 [2024-07-21 11:31:01.843724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.560 [2024-07-21 11:31:01.843739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.560 #42 NEW cov: 11733 ft: 13495 corp: 12/121b lim: 25 exec/s: 0 rss: 69Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:07:32.560 [2024-07-21 11:31:01.883692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.560 [2024-07-21 11:31:01.883719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.560 #43 NEW cov: 11733 ft: 13583 corp: 13/128b lim: 25 exec/s: 0 rss: 69Mb L: 7/14 MS: 1 EraseBytes- 00:07:32.560 [2024-07-21 11:31:01.913876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.560 [2024-07-21 11:31:01.913904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.560 [2024-07-21 11:31:01.913948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.560 [2024-07-21 11:31:01.913963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.560 #44 NEW cov: 11733 ft: 13604 corp: 14/139b lim: 25 exec/s: 0 rss: 69Mb L: 11/14 MS: 1 ChangeByte- 00:07:32.560 [2024-07-21 11:31:01.953989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.560 [2024-07-21 
11:31:01.954017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.560 [2024-07-21 11:31:01.954065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.560 [2024-07-21 11:31:01.954081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.560 #45 NEW cov: 11733 ft: 13648 corp: 15/152b lim: 25 exec/s: 0 rss: 69Mb L: 13/14 MS: 1 InsertRepeatedBytes- 00:07:32.818 [2024-07-21 11:31:01.994164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.818 [2024-07-21 11:31:01.994193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.818 [2024-07-21 11:31:01.994242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.818 [2024-07-21 11:31:01.994258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.818 #46 NEW cov: 11733 ft: 13675 corp: 16/162b lim: 25 exec/s: 0 rss: 69Mb L: 10/14 MS: 1 ShuffleBytes- 00:07:32.818 [2024-07-21 11:31:02.034363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.818 [2024-07-21 11:31:02.034391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.818 [2024-07-21 11:31:02.034429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.818 [2024-07-21 11:31:02.034448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.818 [2024-07-21 11:31:02.034506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:32.818 [2024-07-21 11:31:02.034522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.818 #47 NEW cov: 11733 ft: 13948 corp: 17/177b lim: 25 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 CrossOver- 00:07:32.818 [2024-07-21 11:31:02.074349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.818 [2024-07-21 11:31:02.074378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.818 [2024-07-21 11:31:02.074435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.818 [2024-07-21 11:31:02.074455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.818 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:32.818 #48 NEW cov: 11756 ft: 14108 corp: 18/187b lim: 25 exec/s: 0 rss: 70Mb L: 10/15 MS: 1 EraseBytes- 00:07:32.818 [2024-07-21 11:31:02.114357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.818 [2024-07-21 11:31:02.114385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.818 #49 NEW cov: 11756 ft: 14149 corp: 19/193b lim: 25 exec/s: 0 rss: 70Mb L: 6/15 MS: 1 EraseBytes- 00:07:32.818 [2024-07-21 11:31:02.154608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.818 [2024-07-21 11:31:02.154635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.818 [2024-07-21 11:31:02.154683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.818 [2024-07-21 11:31:02.154698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.818 #50 NEW cov: 11756 ft: 14167 corp: 20/207b lim: 25 exec/s: 50 rss: 70Mb L: 14/15 MS: 1 CopyPart- 00:07:32.818 [2024-07-21 11:31:02.194877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.818 [2024-07-21 11:31:02.194905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.818 [2024-07-21 11:31:02.194950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:32.818 [2024-07-21 11:31:02.194966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.818 [2024-07-21 11:31:02.195023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:32.818 [2024-07-21 11:31:02.195038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.818 [2024-07-21 11:31:02.195095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:32.818 [2024-07-21 11:31:02.195110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.818 #51 NEW cov: 11756 ft: 14628 corp: 21/229b lim: 25 exec/s: 51 rss: 70Mb L: 22/22 MS: 1 CopyPart- 00:07:32.818 [2024-07-21 11:31:02.234683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:32.818 [2024-07-21 11:31:02.234711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.076 #52 NEW cov: 11756 ft: 14639 corp: 22/238b lim: 25 exec/s: 52 rss: 70Mb L: 9/22 MS: 1 ShuffleBytes- 00:07:33.076 [2024-07-21 11:31:02.274895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.076 [2024-07-21 11:31:02.274923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.076 [2024-07-21 11:31:02.274952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.076 [2024-07-21 11:31:02.274969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.076 #53 NEW cov: 11756 ft: 14668 corp: 23/248b lim: 25 exec/s: 53 rss: 70Mb L: 10/22 MS: 1 ChangeBit- 00:07:33.076 [2024-07-21 11:31:02.315158] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.076 [2024-07-21 11:31:02.315186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.076 [2024-07-21 11:31:02.315214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.076 [2024-07-21 11:31:02.315230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.076 [2024-07-21 11:31:02.315285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:33.076 [2024-07-21 11:31:02.315302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.076 #54 NEW cov: 11756 ft: 14680 corp: 24/264b lim: 25 exec/s: 54 rss: 70Mb L: 16/22 MS: 1 CopyPart- 00:07:33.076 [2024-07-21 11:31:02.355029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.076 [2024-07-21 11:31:02.355057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.076 #55 NEW cov: 11756 ft: 14769 corp: 25/273b lim: 25 exec/s: 55 rss: 70Mb L: 9/22 MS: 1 ChangeByte- 00:07:33.076 [2024-07-21 11:31:02.385293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.076 [2024-07-21 11:31:02.385321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.076 [2024-07-21 11:31:02.385369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.076 [2024-07-21 11:31:02.385386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.076 #56 NEW cov: 11756 ft: 14806 corp: 26/283b lim: 25 exec/s: 56 rss: 70Mb L: 10/22 MS: 1 ShuffleBytes- 00:07:33.076 [2024-07-21 11:31:02.425375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.076 [2024-07-21 11:31:02.425404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.076 [2024-07-21 11:31:02.425464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.076 [2024-07-21 11:31:02.425481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.076 #57 NEW cov: 11756 ft: 14817 corp: 27/295b lim: 25 exec/s: 57 rss: 70Mb L: 12/22 MS: 1 InsertByte- 00:07:33.076 [2024-07-21 11:31:02.455338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.076 [2024-07-21 11:31:02.455366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.076 #58 NEW cov: 11756 ft: 14819 corp: 28/304b lim: 25 exec/s: 58 rss: 70Mb L: 9/22 MS: 1 ShuffleBytes- 00:07:33.076 [2024-07-21 11:31:02.495429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.076 [2024-07-21 
11:31:02.495462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.334 #59 NEW cov: 11756 ft: 14839 corp: 29/311b lim: 25 exec/s: 59 rss: 70Mb L: 7/22 MS: 1 EraseBytes- 00:07:33.334 [2024-07-21 11:31:02.535944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.334 [2024-07-21 11:31:02.535972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.334 [2024-07-21 11:31:02.536018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.334 [2024-07-21 11:31:02.536035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.334 [2024-07-21 11:31:02.536093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:33.334 [2024-07-21 11:31:02.536110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.334 [2024-07-21 11:31:02.536165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:33.335 [2024-07-21 11:31:02.536178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.335 #60 NEW cov: 11756 ft: 14860 corp: 30/332b lim: 25 exec/s: 60 rss: 70Mb L: 21/22 MS: 1 CrossOver- 00:07:33.335 [2024-07-21 11:31:02.575888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.335 [2024-07-21 11:31:02.575914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.335 [2024-07-21 11:31:02.575949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.335 [2024-07-21 11:31:02.575965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.335 [2024-07-21 11:31:02.576020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:33.335 [2024-07-21 11:31:02.576039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.335 #61 NEW cov: 11756 ft: 14872 corp: 31/348b lim: 25 exec/s: 61 rss: 70Mb L: 16/22 MS: 1 InsertRepeatedBytes- 00:07:33.335 [2024-07-21 11:31:02.615884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.335 [2024-07-21 11:31:02.615913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.335 [2024-07-21 11:31:02.615942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.335 [2024-07-21 11:31:02.615958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.335 #62 NEW cov: 11756 ft: 14926 corp: 32/360b lim: 25 exec/s: 62 rss: 70Mb L: 12/22 MS: 1 ChangeByte- 00:07:33.335 [2024-07-21 11:31:02.655930] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.335 [2024-07-21 11:31:02.655958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.335 #63 NEW cov: 11756 ft: 14963 corp: 33/369b lim: 25 exec/s: 63 rss: 70Mb L: 9/22 MS: 1 EraseBytes- 00:07:33.335 [2024-07-21 11:31:02.696194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.335 [2024-07-21 11:31:02.696222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.335 [2024-07-21 11:31:02.696251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.335 [2024-07-21 11:31:02.696267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.335 #64 NEW cov: 11756 ft: 15000 corp: 34/383b lim: 25 exec/s: 64 rss: 70Mb L: 14/22 MS: 1 CrossOver- 00:07:33.335 [2024-07-21 11:31:02.736266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.335 [2024-07-21 11:31:02.736293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.335 [2024-07-21 11:31:02.736337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.335 [2024-07-21 11:31:02.736352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.335 #65 NEW cov: 11756 ft: 15009 corp: 35/394b lim: 25 exec/s: 65 rss: 70Mb L: 11/22 MS: 1 InsertByte- 00:07:33.593 [2024-07-21 11:31:02.776522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.593 [2024-07-21 11:31:02.776550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.593 [2024-07-21 11:31:02.776585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.593 [2024-07-21 11:31:02.776601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.593 [2024-07-21 11:31:02.776659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:33.593 [2024-07-21 11:31:02.776675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.593 #66 NEW cov: 11756 ft: 15041 corp: 36/412b lim: 25 exec/s: 66 rss: 70Mb L: 18/22 MS: 1 CrossOver- 00:07:33.593 [2024-07-21 11:31:02.816723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.593 [2024-07-21 11:31:02.816752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.593 [2024-07-21 11:31:02.816790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.593 [2024-07-21 11:31:02.816806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.593 [2024-07-21 11:31:02.816863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:33.593 [2024-07-21 11:31:02.816880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.593 [2024-07-21 11:31:02.816936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:33.593 [2024-07-21 11:31:02.816950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.593 #67 NEW cov: 11756 ft: 15070 corp: 37/436b lim: 25 exec/s: 67 rss: 70Mb L: 24/24 MS: 1 CrossOver- 00:07:33.593 [2024-07-21 11:31:02.856619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.593 [2024-07-21 11:31:02.856647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.593 [2024-07-21 11:31:02.856690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.593 [2024-07-21 11:31:02.856705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.593 #68 NEW cov: 11756 ft: 15082 corp: 38/446b lim: 25 exec/s: 68 rss: 70Mb L: 10/24 MS: 1 ChangeBit- 00:07:33.593 [2024-07-21 11:31:02.896686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.593 [2024-07-21 11:31:02.896714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.593 [2024-07-21 11:31:02.896763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.593 [2024-07-21 11:31:02.896780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.593 #69 NEW cov: 11756 ft: 15087 corp: 39/459b lim: 25 exec/s: 69 rss: 70Mb L: 13/24 MS: 1 ChangeBinInt- 00:07:33.593 [2024-07-21 11:31:02.936704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.593 [2024-07-21 11:31:02.936732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.593 #70 NEW cov: 11756 ft: 15113 corp: 40/468b lim: 25 exec/s: 70 rss: 70Mb L: 9/24 MS: 1 ChangeBit- 00:07:33.593 [2024-07-21 11:31:02.976948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.593 [2024-07-21 11:31:02.976976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.593 [2024-07-21 11:31:02.977021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.593 [2024-07-21 11:31:02.977036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.593 #71 NEW cov: 11756 ft: 15121 corp: 41/479b lim: 25 exec/s: 71 rss: 70Mb L: 11/24 MS: 1 ShuffleBytes- 00:07:33.852 [2024-07-21 11:31:03.017087] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.852 [2024-07-21 11:31:03.017116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.852 [2024-07-21 11:31:03.017161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.852 [2024-07-21 11:31:03.017176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.852 #72 NEW cov: 11756 ft: 15130 corp: 42/492b lim: 25 exec/s: 72 rss: 70Mb L: 13/24 MS: 1 InsertRepeatedBytes- 00:07:33.852 [2024-07-21 11:31:03.057354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.852 [2024-07-21 11:31:03.057383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.852 [2024-07-21 11:31:03.057410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.852 [2024-07-21 11:31:03.057426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.852 [2024-07-21 11:31:03.057486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:33.852 [2024-07-21 11:31:03.057503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.852 #73 NEW cov: 11756 ft: 15152 corp: 43/507b lim: 25 exec/s: 73 rss: 71Mb L: 15/24 MS: 1 ChangeByte- 00:07:33.852 [2024-07-21 11:31:03.097312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.852 [2024-07-21 11:31:03.097342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.852 [2024-07-21 11:31:03.097389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.852 [2024-07-21 11:31:03.097403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.852 #74 NEW cov: 11756 ft: 15222 corp: 44/520b lim: 25 exec/s: 74 rss: 71Mb L: 13/24 MS: 1 ChangeBit- 00:07:33.852 [2024-07-21 11:31:03.137663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.852 [2024-07-21 11:31:03.137692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.852 [2024-07-21 11:31:03.137729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.852 [2024-07-21 11:31:03.137744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.852 [2024-07-21 11:31:03.137800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:33.852 [2024-07-21 11:31:03.137816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.852 [2024-07-21 11:31:03.137870] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:33.852 [2024-07-21 11:31:03.137885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.852 #75 NEW cov: 11756 ft: 15227 corp: 45/544b lim: 25 exec/s: 75 rss: 71Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:07:33.852 [2024-07-21 11:31:03.177556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:33.852 [2024-07-21 11:31:03.177584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.852 [2024-07-21 11:31:03.177631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:33.852 [2024-07-21 11:31:03.177646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.852 #76 NEW cov: 11756 ft: 15297 corp: 46/555b lim: 25 exec/s: 38 rss: 71Mb L: 11/24 MS: 1 CrossOver- 00:07:33.852 #76 DONE cov: 11756 ft: 15297 corp: 46/555b lim: 25 exec/s: 38 rss: 71Mb 00:07:33.852 Done 76 runs in 2 second(s) 00:07:34.110 11:31:03 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:07:34.110 11:31:03 -- ../common.sh@72 -- # (( i++ )) 00:07:34.110 11:31:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:34.110 11:31:03 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:07:34.110 11:31:03 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:07:34.110 11:31:03 -- nvmf/run.sh@24 -- # local timen=1 00:07:34.110 11:31:03 -- nvmf/run.sh@25 -- # local core=0x1 00:07:34.110 11:31:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:34.110 11:31:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:07:34.110 11:31:03 -- nvmf/run.sh@29 -- # printf %02d 24 00:07:34.110 11:31:03 -- nvmf/run.sh@29 -- # port=4424 00:07:34.110 11:31:03 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:34.110 11:31:03 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:07:34.110 11:31:03 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.110 11:31:03 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:07:34.110 [2024-07-21 11:31:03.352847] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
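[Editor's note] The xtrace lines above (nvmf/run.sh@27 through nvmf/run.sh@36) show how each numbered fuzz run is given its own TCP listener port, JSON config, and corpus directory so concurrent runs do not collide on the default trsvcid 4420. A hedged shell reconstruction of that wiring for run 24 follows; the variable names are guesses, since only the expanded commands appear in the log, and the redirect into the per-run conf file is inferred (the trace records the sed command but not its output target).

# run 24 -> TCP port 4424, private conf and corpus dir (reconstructed)
fuzzer_type=24
port="44$(printf %02d "$fuzzer_type")"                      # printf %02d 24 -> 4424
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
corpus_dir="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"    # $rootdir: SPDK checkout, assumed
mkdir -p "$corpus_dir"
# rewrite the template's trsvcid so this run listens on its own port
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

Each run's conf is removed again once it finishes, as the "nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf" line above shows.
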
00:07:34.110 [2024-07-21 11:31:03.352918] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2070015 ] 00:07:34.110 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.110 [2024-07-21 11:31:03.531999] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.368 [2024-07-21 11:31:03.551804] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:34.368 [2024-07-21 11:31:03.551927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.368 [2024-07-21 11:31:03.603448] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:34.368 [2024-07-21 11:31:03.619801] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:07:34.368 INFO: Running with entropic power schedule (0xFF, 100). 00:07:34.368 INFO: Seed: 1235266088 00:07:34.368 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:34.368 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:34.368 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:34.368 INFO: A corpus is not provided, starting from an empty corpus 00:07:34.368 #2 INITED exec/s: 0 rss: 60Mb 00:07:34.368 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:34.368 This may also happen if the target rejected all inputs we tried so far 00:07:34.368 [2024-07-21 11:31:03.664339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940362859816943608 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.368 [2024-07-21 11:31:03.664376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.626 NEW_FUNC[1/672]: 0x4c9ff0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:07:34.626 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:34.626 #52 NEW cov: 11601 ft: 11602 corp: 2/39b lim: 100 exec/s: 0 rss: 68Mb L: 38/38 MS: 5 CMP-ChangeBit-ShuffleBytes-CrossOver-InsertRepeatedBytes- DE: "\377\377\377\377\377\377\377\377"- 00:07:34.626 [2024-07-21 11:31:03.985111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940362859816943608 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.626 [2024-07-21 11:31:03.985153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.626 #58 NEW cov: 11714 ft: 11978 corp: 3/78b lim: 100 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertByte- 00:07:34.884 [2024-07-21 11:31:04.055160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940292491072765944 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.884 [2024-07-21 11:31:04.055193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.884 #59 NEW cov: 11720 ft: 12330 corp: 4/117b lim: 100 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 ChangeBit- 00:07:34.884 [2024-07-21 11:31:04.115341] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940292491072765944 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.884 [2024-07-21 11:31:04.115383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.884 #60 NEW cov: 11805 ft: 12784 corp: 5/156b lim: 100 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 CrossOver- 00:07:34.884 [2024-07-21 11:31:04.175543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940292491072765944 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.884 [2024-07-21 11:31:04.175574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.884 [2024-07-21 11:31:04.175608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9941969925633014008 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.884 [2024-07-21 11:31:04.175625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.884 #61 NEW cov: 11805 ft: 13679 corp: 6/203b lim: 100 exec/s: 0 rss: 68Mb L: 47/47 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:34.884 [2024-07-21 11:31:04.235706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940193977407897592 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.884 [2024-07-21 11:31:04.235737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.884 [2024-07-21 11:31:04.235771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17940362862769273080 len:35321 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.884 [2024-07-21 11:31:04.235788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.884 #62 NEW cov: 11805 ft: 13753 corp: 7/256b lim: 100 exec/s: 0 rss: 68Mb L: 53/53 MS: 1 InsertRepeatedBytes- 00:07:34.884 [2024-07-21 11:31:04.305867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940362859816943608 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.884 [2024-07-21 11:31:04.305901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.142 #63 NEW cov: 11805 ft: 13822 corp: 8/294b lim: 100 exec/s: 0 rss: 69Mb L: 38/53 MS: 1 ChangeBit- 00:07:35.142 [2024-07-21 11:31:04.356045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65529 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.142 [2024-07-21 11:31:04.356078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.143 [2024-07-21 11:31:04.356111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6872316419617283935 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.356129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.143 [2024-07-21 11:31:04.356158] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17940362863843014904 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.356174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.143 #64 NEW cov: 11805 ft: 14223 corp: 9/355b lim: 100 exec/s: 0 rss: 69Mb L: 61/61 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:35.143 [2024-07-21 11:31:04.426141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16855260267597654505 len:59882 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.426173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.143 #66 NEW cov: 11805 ft: 14239 corp: 10/389b lim: 100 exec/s: 0 rss: 69Mb L: 34/61 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:35.143 [2024-07-21 11:31:04.476458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940193977407897592 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.476489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.143 [2024-07-21 11:31:04.476522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.476538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.143 [2024-07-21 11:31:04.476567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.476582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.143 [2024-07-21 11:31:04.476609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17940362861266057464 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.476625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.143 #67 NEW cov: 11805 ft: 14679 corp: 11/484b lim: 100 exec/s: 0 rss: 69Mb L: 95/95 MS: 1 InsertRepeatedBytes- 00:07:35.143 [2024-07-21 11:31:04.536495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940193977407897592 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.536527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.143 [2024-07-21 11:31:04.536560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17940362862769273080 len:35321 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.143 [2024-07-21 11:31:04.536577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.401 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:35.401 #73 NEW cov: 11822 ft: 14718 corp: 12/537b lim: 100 exec/s: 0 rss: 69Mb L: 53/95 MS: 1 CopyPart- 00:07:35.401 [2024-07-21 
11:31:04.596610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940362859816943608 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.596641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.401 #74 NEW cov: 11822 ft: 14778 corp: 13/575b lim: 100 exec/s: 0 rss: 69Mb L: 38/95 MS: 1 ChangeBit- 00:07:35.401 [2024-07-21 11:31:04.646708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16855260267597654505 len:59864 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.646741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.401 #75 NEW cov: 11822 ft: 14794 corp: 14/609b lim: 100 exec/s: 75 rss: 69Mb L: 34/95 MS: 1 ChangeByte- 00:07:35.401 [2024-07-21 11:31:04.706978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940292491072765944 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.707010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.401 [2024-07-21 11:31:04.707047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17940292495098837240 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.707064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.401 [2024-07-21 11:31:04.707093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17940362863843014904 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.707109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.401 #76 NEW cov: 11822 ft: 14818 corp: 15/682b lim: 100 exec/s: 76 rss: 69Mb L: 73/95 MS: 1 CrossOver- 00:07:35.401 [2024-07-21 11:31:04.767195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940193977407897592 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.767226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.401 [2024-07-21 11:31:04.767259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.767276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.401 [2024-07-21 11:31:04.767304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.767320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.401 [2024-07-21 11:31:04.767347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6872414935960256512 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.401 [2024-07-21 11:31:04.767362] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.401 #77 NEW cov: 11822 ft: 14861 corp: 16/781b lim: 100 exec/s: 77 rss: 69Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:07:35.659 [2024-07-21 11:31:04.837235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940365058840199160 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:04.837266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.659 #78 NEW cov: 11822 ft: 14938 corp: 17/819b lim: 100 exec/s: 78 rss: 69Mb L: 38/99 MS: 1 ChangeBit- 00:07:35.659 [2024-07-21 11:31:04.887396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940362205573218303 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:04.887426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.659 [2024-07-21 11:31:04.887466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17940362861271906552 len:63626 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:04.887483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.659 #79 NEW cov: 11822 ft: 14993 corp: 18/873b lim: 100 exec/s: 79 rss: 69Mb L: 54/99 MS: 1 InsertByte- 00:07:35.659 [2024-07-21 11:31:04.947646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65529 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:04.947676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.659 [2024-07-21 11:31:04.947709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6872316419617283935 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:04.947732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.659 [2024-07-21 11:31:04.947762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17940362546015435000 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:04.947778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.659 #80 NEW cov: 11822 ft: 15029 corp: 19/934b lim: 100 exec/s: 80 rss: 69Mb L: 61/99 MS: 1 ChangeByte- 00:07:35.659 [2024-07-21 11:31:05.017883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940193977407897592 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:05.017915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.659 [2024-07-21 11:31:05.017947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:05.017964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:07:35.659 [2024-07-21 11:31:05.017992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:05.018008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.659 [2024-07-21 11:31:05.018035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6872414935960256512 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.659 [2024-07-21 11:31:05.018050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.659 #81 NEW cov: 11822 ft: 15069 corp: 20/1033b lim: 100 exec/s: 81 rss: 70Mb L: 99/99 MS: 1 CopyPart- 00:07:35.917 [2024-07-21 11:31:05.088009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940292491072765944 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.088040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.917 [2024-07-21 11:31:05.088072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17940292495098837240 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.088089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.917 [2024-07-21 11:31:05.088117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17940362863843014904 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.088133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.917 #82 NEW cov: 11822 ft: 15088 corp: 21/1106b lim: 100 exec/s: 82 rss: 70Mb L: 73/99 MS: 1 ShuffleBytes- 00:07:35.917 [2024-07-21 11:31:05.158238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940193977407897592 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.158270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.917 [2024-07-21 11:31:05.158302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.158319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.917 [2024-07-21 11:31:05.158347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.158367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.917 [2024-07-21 11:31:05.158394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17940362861266057464 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.158410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
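[The run above is a plain libFuzzer execution: each "#N NEW" record reports a mutation sequence (MS:) that reached new coverage, and the PersAutoDict entries reuse the all-0xFF dictionary item listed in the end-of-run summary. A minimal sketch of replaying such a run by hand, using only standard libFuzzer options together with the seed and corpus path printed in the log; the binary location is assumed from the source path shown in the NEW_FUNC lines, and the SPDK/EAL app arguments that the wrapper script normally supplies are omitted here:

  # Sketch: replay the fuzz run with the seed and corpus dir from the log.
  # FUZZ_BIN is an assumed in-tree build output for llvm_nvme_fuzz.c;
  # -seed and -max_total_time are standard libFuzzer flags, and the corpus
  # directory is passed as a positional argument.
  FUZZ_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz
  CORPUS=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
  "$FUZZ_BIN" "$CORPUS" -seed=1235266088 -max_total_time=2
]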
00:07:35.917 #83 NEW cov: 11822 ft: 15093 corp: 22/1201b lim: 100 exec/s: 83 rss: 70Mb L: 95/99 MS: 1 ChangeBit- 00:07:35.917 [2024-07-21 11:31:05.208324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940362859816943608 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.208354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.917 [2024-07-21 11:31:05.208386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:289360691352306692 len:1029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-07-21 11:31:05.208402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.918 [2024-07-21 11:31:05.208431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:289360691352306692 len:1029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.918 [2024-07-21 11:31:05.208455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.918 #84 NEW cov: 11822 ft: 15114 corp: 23/1278b lim: 100 exec/s: 84 rss: 70Mb L: 77/99 MS: 1 InsertRepeatedBytes- 00:07:35.918 [2024-07-21 11:31:05.268452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2531906049836000035 len:8996 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.918 [2024-07-21 11:31:05.268482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.918 [2024-07-21 11:31:05.268513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2531906049332683555 len:8996 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.918 [2024-07-21 11:31:05.268530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.918 [2024-07-21 11:31:05.268559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:2531906049332683555 len:8996 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.918 [2024-07-21 11:31:05.268574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.918 #89 NEW cov: 11822 ft: 15128 corp: 24/1341b lim: 100 exec/s: 89 rss: 70Mb L: 63/99 MS: 5 ChangeBit-ChangeByte-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:35.918 [2024-07-21 11:31:05.328524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940362859816943608 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.918 [2024-07-21 11:31:05.328555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.175 #90 NEW cov: 11822 ft: 15139 corp: 25/1362b lim: 100 exec/s: 90 rss: 70Mb L: 21/99 MS: 1 EraseBytes- 00:07:36.175 [2024-07-21 11:31:05.378628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16855260267597654505 len:59864 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.175 [2024-07-21 11:31:05.378658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.175 #91 NEW cov: 11822 ft: 15156 
corp: 26/1396b lim: 100 exec/s: 91 rss: 70Mb L: 34/99 MS: 1 ChangeByte- 00:07:36.175 [2024-07-21 11:31:05.438979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940193977407897592 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.175 [2024-07-21 11:31:05.439017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.175 [2024-07-21 11:31:05.439050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.175 [2024-07-21 11:31:05.439067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.175 [2024-07-21 11:31:05.439096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.175 [2024-07-21 11:31:05.439112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.175 [2024-07-21 11:31:05.439139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17940362861266057464 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.175 [2024-07-21 11:31:05.439156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.175 #92 NEW cov: 11822 ft: 15229 corp: 27/1491b lim: 100 exec/s: 92 rss: 70Mb L: 95/99 MS: 1 ShuffleBytes- 00:07:36.175 [2024-07-21 11:31:05.508987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940370586463109112 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.175 [2024-07-21 11:31:05.509018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.175 #95 NEW cov: 11822 ft: 15263 corp: 28/1515b lim: 100 exec/s: 95 rss: 70Mb L: 24/99 MS: 3 EraseBytes-PersAutoDict-CopyPart- DE: "\377\377\377\377\377\377\377\377"- 00:07:36.175 [2024-07-21 11:31:05.569113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940365058840199160 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.175 [2024-07-21 11:31:05.569145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.433 #96 NEW cov: 11829 ft: 15285 corp: 29/1553b lim: 100 exec/s: 96 rss: 70Mb L: 38/99 MS: 1 ShuffleBytes- 00:07:36.433 [2024-07-21 11:31:05.629479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17940193977407897592 len:24416 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.433 [2024-07-21 11:31:05.629509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.433 [2024-07-21 11:31:05.629541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.433 [2024-07-21 11:31:05.629558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.433 [2024-07-21 11:31:05.629586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 
len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.433 [2024-07-21 11:31:05.629602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.433 [2024-07-21 11:31:05.629629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17940362861266057464 len:63737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.433 [2024-07-21 11:31:05.629661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.433 #97 NEW cov: 11829 ft: 15377 corp: 30/1648b lim: 100 exec/s: 48 rss: 70Mb L: 95/99 MS: 1 ChangeBinInt- 00:07:36.433 #97 DONE cov: 11829 ft: 15377 corp: 30/1648b lim: 100 exec/s: 48 rss: 70Mb 00:07:36.433 ###### Recommended dictionary. ###### 00:07:36.433 "\377\377\377\377\377\377\377\377" # Uses: 4 00:07:36.433 ###### End of recommended dictionary. ###### 00:07:36.433 Done 97 runs in 2 second(s) 00:07:36.433 11:31:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:07:36.433 11:31:05 -- ../common.sh@72 -- # (( i++ )) 00:07:36.433 11:31:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:36.433 11:31:05 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:07:36.433 00:07:36.433 real 1m2.924s 00:07:36.433 user 1m38.814s 00:07:36.433 sys 0m7.590s 00:07:36.433 11:31:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.433 11:31:05 -- common/autotest_common.sh@10 -- # set +x 00:07:36.433 ************************************ 00:07:36.433 END TEST nvmf_fuzz 00:07:36.433 ************************************ 00:07:36.433 11:31:05 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:36.433 11:31:05 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:36.433 11:31:05 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:36.433 11:31:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:36.433 11:31:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:36.433 11:31:05 -- common/autotest_common.sh@10 -- # set +x 00:07:36.433 ************************************ 00:07:36.433 START TEST vfio_fuzz 00:07:36.433 ************************************ 00:07:36.433 11:31:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:36.693 * Looking for test storage... 
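[The vfio_fuzz harness now locates its test storage and sources autotest_common.sh, which in turn dumps the generated build_config.sh: one CONFIG_* switch per traced line, recording how this SPDK tree was configured. A minimal sketch of how a test script typically consumes such a config file; the source path and variable names are taken from the trace that follows, but the skip logic itself is illustrative, not the harness's actual check:

  # Illustrative only: source the generated config and bail out unless the
  # build enables the fuzzer and vfio-user support this test depends on
  # (both are set to y in the dump below).
  source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh
  if [[ "$CONFIG_FUZZER" != y || "$CONFIG_VFIO_USER" != y ]]; then
      echo "fuzzer/vfio-user not enabled in this build; skipping" >&2
      exit 0
  fi
]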
00:07:36.693 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:36.693 11:31:05 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:36.693 11:31:05 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:36.693 11:31:05 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:36.693 11:31:05 -- common/autotest_common.sh@34 -- # set -e 00:07:36.693 11:31:05 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:36.693 11:31:05 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:36.693 11:31:05 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:36.693 11:31:05 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:36.693 11:31:05 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:36.693 11:31:05 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:36.693 11:31:05 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:36.693 11:31:05 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:36.693 11:31:05 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:36.693 11:31:05 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:36.693 11:31:05 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:36.693 11:31:05 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:36.693 11:31:05 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:36.693 11:31:05 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:36.693 11:31:05 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:36.693 11:31:05 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:36.693 11:31:05 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:36.693 11:31:05 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:36.693 11:31:05 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:36.693 11:31:05 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:36.693 11:31:05 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:36.693 11:31:05 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:36.693 11:31:05 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:36.693 11:31:05 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:36.693 11:31:05 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:36.693 11:31:05 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:36.693 11:31:05 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:36.693 11:31:05 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:36.693 11:31:05 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:36.693 11:31:05 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:36.693 11:31:05 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:36.693 11:31:05 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:36.693 11:31:05 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:36.693 11:31:05 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:36.693 11:31:05 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:36.693 11:31:05 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:36.693 11:31:05 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:36.693 11:31:05 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:36.693 11:31:05 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:36.693 11:31:05 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:36.693 11:31:05 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:36.693 11:31:05 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:36.693 11:31:05 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:36.693 11:31:05 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:36.693 11:31:05 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:36.693 11:31:05 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:36.693 11:31:05 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:36.693 11:31:05 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:36.693 11:31:05 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:36.693 11:31:05 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:36.693 11:31:05 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:36.693 11:31:05 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:36.693 11:31:05 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:36.693 11:31:05 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:36.693 11:31:05 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:36.693 11:31:05 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:36.693 11:31:05 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:36.693 11:31:05 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:36.693 11:31:05 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:36.693 11:31:05 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:36.693 11:31:05 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:36.693 11:31:05 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:36.693 11:31:05 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:36.693 11:31:05 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:36.693 11:31:05 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:36.693 11:31:05 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:36.693 11:31:05 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:36.693 11:31:05 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:36.693 11:31:05 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:36.693 11:31:05 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:36.693 11:31:05 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:36.693 11:31:05 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:36.693 11:31:05 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:36.693 11:31:05 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:36.693 11:31:05 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:36.693 11:31:05 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:36.693 11:31:05 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:36.693 11:31:05 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:36.693 11:31:05 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:36.693 11:31:05 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:36.693 11:31:05 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:36.693 11:31:05 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:36.693 
11:31:05 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:36.693 11:31:05 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:36.693 11:31:05 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:36.693 11:31:05 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:36.693 11:31:05 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:36.693 11:31:05 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:36.693 11:31:05 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:36.693 11:31:05 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:36.693 11:31:05 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:36.693 11:31:05 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:36.693 11:31:05 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:36.693 11:31:05 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:36.693 11:31:05 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:36.693 11:31:05 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:36.693 11:31:05 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:36.693 11:31:05 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:36.693 11:31:05 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:36.693 #define SPDK_CONFIG_H 00:07:36.693 #define SPDK_CONFIG_APPS 1 00:07:36.693 #define SPDK_CONFIG_ARCH native 00:07:36.693 #undef SPDK_CONFIG_ASAN 00:07:36.693 #undef SPDK_CONFIG_AVAHI 00:07:36.693 #undef SPDK_CONFIG_CET 00:07:36.693 #define SPDK_CONFIG_COVERAGE 1 00:07:36.693 #define SPDK_CONFIG_CROSS_PREFIX 00:07:36.693 #undef SPDK_CONFIG_CRYPTO 00:07:36.693 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:36.693 #undef SPDK_CONFIG_CUSTOMOCF 00:07:36.693 #undef SPDK_CONFIG_DAOS 00:07:36.693 #define SPDK_CONFIG_DAOS_DIR 00:07:36.693 #define SPDK_CONFIG_DEBUG 1 00:07:36.693 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:36.693 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:36.693 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:36.693 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:36.693 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:36.693 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:36.693 #define SPDK_CONFIG_EXAMPLES 1 00:07:36.693 #undef SPDK_CONFIG_FC 00:07:36.693 #define SPDK_CONFIG_FC_PATH 00:07:36.693 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:36.693 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:36.693 #undef SPDK_CONFIG_FUSE 00:07:36.693 #define SPDK_CONFIG_FUZZER 1 00:07:36.693 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:36.693 #undef SPDK_CONFIG_GOLANG 00:07:36.693 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:36.693 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:36.693 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:36.693 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:36.693 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:36.693 #define SPDK_CONFIG_IDXD 1 00:07:36.693 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:36.693 #undef SPDK_CONFIG_IPSEC_MB 00:07:36.693 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:36.693 #define SPDK_CONFIG_ISAL 1 00:07:36.693 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:36.693 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:36.693 #define SPDK_CONFIG_LIBDIR 00:07:36.693 #undef SPDK_CONFIG_LTO 00:07:36.693 #define SPDK_CONFIG_MAX_LCORES 00:07:36.693 #define SPDK_CONFIG_NVME_CUSE 1 00:07:36.693 #undef SPDK_CONFIG_OCF 00:07:36.693 #define SPDK_CONFIG_OCF_PATH 00:07:36.693 #define SPDK_CONFIG_OPENSSL_PATH 00:07:36.693 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:36.693 #undef SPDK_CONFIG_PGO_USE 00:07:36.694 #define SPDK_CONFIG_PREFIX /usr/local 00:07:36.694 #undef SPDK_CONFIG_RAID5F 00:07:36.694 #undef SPDK_CONFIG_RBD 00:07:36.694 #define SPDK_CONFIG_RDMA 1 00:07:36.694 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:36.694 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:36.694 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:36.694 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:36.694 #undef SPDK_CONFIG_SHARED 00:07:36.694 #undef SPDK_CONFIG_SMA 00:07:36.694 #define SPDK_CONFIG_TESTS 1 00:07:36.694 #undef SPDK_CONFIG_TSAN 00:07:36.694 #define SPDK_CONFIG_UBLK 1 00:07:36.694 #define SPDK_CONFIG_UBSAN 1 00:07:36.694 #undef SPDK_CONFIG_UNIT_TESTS 00:07:36.694 #undef SPDK_CONFIG_URING 00:07:36.694 #define SPDK_CONFIG_URING_PATH 00:07:36.694 #undef SPDK_CONFIG_URING_ZNS 00:07:36.694 #undef SPDK_CONFIG_USDT 00:07:36.694 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:36.694 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:36.694 #define SPDK_CONFIG_VFIO_USER 1 00:07:36.694 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:36.694 #define SPDK_CONFIG_VHOST 1 00:07:36.694 #define SPDK_CONFIG_VIRTIO 1 00:07:36.694 #undef SPDK_CONFIG_VTUNE 00:07:36.694 #define SPDK_CONFIG_VTUNE_DIR 00:07:36.694 #define SPDK_CONFIG_WERROR 1 00:07:36.694 #define SPDK_CONFIG_WPDK_DIR 00:07:36.694 #undef SPDK_CONFIG_XNVME 00:07:36.694 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:36.694 11:31:05 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:36.694 11:31:05 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:36.694 11:31:05 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:36.694 11:31:05 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:36.694 11:31:05 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:36.694 11:31:05 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.694 11:31:05 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.694 11:31:05 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.694 11:31:05 -- paths/export.sh@5 -- # export PATH 00:07:36.694 11:31:05 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.694 11:31:05 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:36.694 11:31:05 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:36.694 11:31:05 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:36.694 11:31:06 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:36.694 11:31:06 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:36.694 11:31:06 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:36.694 11:31:06 -- pm/common@16 -- # TEST_TAG=N/A 00:07:36.694 11:31:06 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:36.694 11:31:06 -- common/autotest_common.sh@52 -- # : 1 00:07:36.694 11:31:06 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:36.694 11:31:06 -- common/autotest_common.sh@56 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:36.694 11:31:06 -- common/autotest_common.sh@58 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:36.694 11:31:06 -- common/autotest_common.sh@60 -- # : 1 00:07:36.694 11:31:06 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:36.694 11:31:06 -- common/autotest_common.sh@62 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:36.694 11:31:06 -- common/autotest_common.sh@64 -- # : 00:07:36.694 11:31:06 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:36.694 11:31:06 -- common/autotest_common.sh@66 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:36.694 11:31:06 
-- common/autotest_common.sh@68 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:36.694 11:31:06 -- common/autotest_common.sh@70 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:36.694 11:31:06 -- common/autotest_common.sh@72 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:36.694 11:31:06 -- common/autotest_common.sh@74 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:36.694 11:31:06 -- common/autotest_common.sh@76 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:36.694 11:31:06 -- common/autotest_common.sh@78 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:36.694 11:31:06 -- common/autotest_common.sh@80 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:36.694 11:31:06 -- common/autotest_common.sh@82 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:36.694 11:31:06 -- common/autotest_common.sh@84 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:36.694 11:31:06 -- common/autotest_common.sh@86 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:36.694 11:31:06 -- common/autotest_common.sh@88 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:36.694 11:31:06 -- common/autotest_common.sh@90 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:36.694 11:31:06 -- common/autotest_common.sh@92 -- # : 1 00:07:36.694 11:31:06 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:36.694 11:31:06 -- common/autotest_common.sh@94 -- # : 1 00:07:36.694 11:31:06 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:36.694 11:31:06 -- common/autotest_common.sh@96 -- # : rdma 00:07:36.694 11:31:06 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:36.694 11:31:06 -- common/autotest_common.sh@98 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:36.694 11:31:06 -- common/autotest_common.sh@100 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:36.694 11:31:06 -- common/autotest_common.sh@102 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:36.694 11:31:06 -- common/autotest_common.sh@104 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:36.694 11:31:06 -- common/autotest_common.sh@106 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:36.694 11:31:06 -- common/autotest_common.sh@108 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:36.694 11:31:06 -- common/autotest_common.sh@110 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:36.694 11:31:06 -- common/autotest_common.sh@112 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:36.694 11:31:06 -- common/autotest_common.sh@114 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:36.694 
11:31:06 -- common/autotest_common.sh@116 -- # : 1 00:07:36.694 11:31:06 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:36.694 11:31:06 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:36.694 11:31:06 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:36.694 11:31:06 -- common/autotest_common.sh@120 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:36.694 11:31:06 -- common/autotest_common.sh@122 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:36.694 11:31:06 -- common/autotest_common.sh@124 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:36.694 11:31:06 -- common/autotest_common.sh@126 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:36.694 11:31:06 -- common/autotest_common.sh@128 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:36.694 11:31:06 -- common/autotest_common.sh@130 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:36.694 11:31:06 -- common/autotest_common.sh@132 -- # : v23.11 00:07:36.694 11:31:06 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:36.694 11:31:06 -- common/autotest_common.sh@134 -- # : true 00:07:36.694 11:31:06 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:36.694 11:31:06 -- common/autotest_common.sh@136 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:36.694 11:31:06 -- common/autotest_common.sh@138 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:36.694 11:31:06 -- common/autotest_common.sh@140 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:36.694 11:31:06 -- common/autotest_common.sh@142 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:36.694 11:31:06 -- common/autotest_common.sh@144 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:36.694 11:31:06 -- common/autotest_common.sh@146 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:36.694 11:31:06 -- common/autotest_common.sh@148 -- # : 00:07:36.694 11:31:06 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:36.694 11:31:06 -- common/autotest_common.sh@150 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:36.694 11:31:06 -- common/autotest_common.sh@152 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:36.694 11:31:06 -- common/autotest_common.sh@154 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:36.694 11:31:06 -- common/autotest_common.sh@156 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:36.694 11:31:06 -- common/autotest_common.sh@158 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:36.694 11:31:06 -- common/autotest_common.sh@160 -- # : 0 00:07:36.694 11:31:06 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:36.694 11:31:06 -- common/autotest_common.sh@163 -- # : 00:07:36.694 11:31:06 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:36.694 11:31:06 -- common/autotest_common.sh@165 -- # : 0 00:07:36.695 11:31:06 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:36.695 11:31:06 -- common/autotest_common.sh@167 -- # : 0 00:07:36.695 11:31:06 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:36.695 11:31:06 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:36.695 11:31:06 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:36.695 11:31:06 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:36.695 11:31:06 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:36.695 11:31:06 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:36.695 11:31:06 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:36.695 11:31:06 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:36.695 11:31:06 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:36.695 11:31:06 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:36.695 11:31:06 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:36.695 11:31:06 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:36.695 11:31:06 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:36.695 11:31:06 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:36.695 11:31:06 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:36.695 11:31:06 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:36.695 11:31:06 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:36.695 11:31:06 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:36.695 11:31:06 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:36.695 11:31:06 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:36.695 11:31:06 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:36.695 11:31:06 -- common/autotest_common.sh@196 -- # cat 00:07:36.695 11:31:06 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:36.695 11:31:06 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:36.695 11:31:06 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:36.695 11:31:06 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:36.695 11:31:06 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:36.695 11:31:06 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:36.695 11:31:06 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:36.695 11:31:06 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:36.695 11:31:06 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:36.695 11:31:06 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:36.695 11:31:06 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:36.695 11:31:06 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:36.695 11:31:06 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:36.695 11:31:06 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:36.695 11:31:06 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:36.695 11:31:06 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:36.695 11:31:06 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:36.695 11:31:06 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:36.695 11:31:06 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:36.695 11:31:06 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:07:36.695 11:31:06 -- common/autotest_common.sh@249 -- # export valgrind= 00:07:36.695 11:31:06 -- common/autotest_common.sh@249 -- # valgrind= 00:07:36.695 11:31:06 -- common/autotest_common.sh@255 -- # uname -s 00:07:36.695 11:31:06 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:07:36.695 11:31:06 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:07:36.695 11:31:06 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:07:36.695 11:31:06 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:07:36.695 11:31:06 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:36.695 11:31:06 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:36.695 11:31:06 -- common/autotest_common.sh@265 -- # MAKE=make 00:07:36.695 11:31:06 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:07:36.695 11:31:06 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:07:36.695 11:31:06 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:07:36.695 11:31:06 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:36.695 11:31:06 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:07:36.695 11:31:06 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:07:36.695 11:31:06 -- common/autotest_common.sh@309 -- # [[ -z 2070347 ]] 00:07:36.695 11:31:06 -- common/autotest_common.sh@309 -- # kill -0 2070347 00:07:36.695 11:31:06 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:07:36.695 11:31:06 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:07:36.695 11:31:06 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:07:36.695 11:31:06 -- common/autotest_common.sh@322 -- # local mount target_dir 00:07:36.695 11:31:06 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:07:36.695 11:31:06 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:07:36.695 11:31:06 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:07:36.695 11:31:06 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:07:36.695 11:31:06 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.Jl7dGS 00:07:36.695 11:31:06 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:36.695 11:31:06 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:07:36.695 11:31:06 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:07:36.695 11:31:06 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.Jl7dGS/tests/vfio /tmp/spdk.Jl7dGS 00:07:36.695 11:31:06 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:07:36.695 11:31:06 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:36.695 11:31:06 -- common/autotest_common.sh@318 -- # df -T 00:07:36.695 11:31:06 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:07:36.695 11:31:06 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:07:36.695 11:31:06 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:07:36.695 11:31:06 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:07:36.695 11:31:06 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # avails["$mount"]=47407124480 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:07:36.695 11:31:06 -- common/autotest_common.sh@354 -- # uses["$mount"]=14335193088 00:07:36.695 11:31:06 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:36.695 11:31:06 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:07:36.695 11:31:06 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342489088 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:07:36.695 11:31:06 -- common/autotest_common.sh@354 -- # uses["$mount"]=5976064 00:07:36.695 11:31:06 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868680704 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:36.695 11:31:06 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=2478080 00:07:36.695 11:31:06 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:36.695 11:31:06 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:07:36.696 11:31:06 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:07:36.696 11:31:06 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:07:36.696 11:31:06 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:36.696 11:31:06 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:07:36.696 * Looking for test storage... 00:07:36.696 11:31:06 -- common/autotest_common.sh@359 -- # local target_space new_size 00:07:36.696 11:31:06 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:07:36.696 11:31:06 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:36.696 11:31:06 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:36.696 11:31:06 -- common/autotest_common.sh@363 -- # mount=/ 00:07:36.696 11:31:06 -- common/autotest_common.sh@365 -- # target_space=47407124480 00:07:36.696 11:31:06 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:07:36.696 11:31:06 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:07:36.696 11:31:06 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:07:36.696 11:31:06 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:07:36.696 11:31:06 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:07:36.696 11:31:06 -- common/autotest_common.sh@372 -- # new_size=16549785600 00:07:36.696 11:31:06 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:36.696 11:31:06 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:36.696 11:31:06 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:36.696 11:31:06 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:36.696 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:36.696 11:31:06 -- common/autotest_common.sh@380 -- # return 0 00:07:36.696 11:31:06 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:07:36.696 11:31:06 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:07:36.696 11:31:06 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:36.696 11:31:06 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:36.696 11:31:06 -- common/autotest_common.sh@1672 -- # true 00:07:36.696 11:31:06 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:07:36.696 11:31:06 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:36.696 11:31:06 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:36.696 11:31:06 -- common/autotest_common.sh@27 -- # exec 00:07:36.696 11:31:06 -- common/autotest_common.sh@29 -- # exec 00:07:36.696 11:31:06 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:07:36.696 11:31:06 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:36.696 11:31:06 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:36.696 11:31:06 -- common/autotest_common.sh@18 -- # set -x 00:07:36.696 11:31:06 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:07:36.696 11:31:06 -- ../common.sh@8 -- # pids=() 00:07:36.696 11:31:06 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:07:36.696 11:31:06 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:07:36.954 11:31:06 -- vfio/run.sh@59 -- # fuzz_num=7 00:07:36.954 11:31:06 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:07:36.954 11:31:06 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:07:36.954 11:31:06 -- vfio/run.sh@65 -- # mem_size=0 00:07:36.954 11:31:06 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:07:36.954 11:31:06 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:07:36.954 11:31:06 -- ../common.sh@69 -- # local fuzz_num=7 00:07:36.954 11:31:06 -- ../common.sh@70 -- # local time=1 00:07:36.954 11:31:06 -- ../common.sh@72 -- # (( i = 0 )) 00:07:36.954 11:31:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:36.954 11:31:06 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:36.954 11:31:06 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:07:36.954 11:31:06 -- vfio/run.sh@23 -- # local timen=1 00:07:36.954 11:31:06 -- vfio/run.sh@24 -- # local core=0x1 00:07:36.954 11:31:06 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:36.954 11:31:06 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:07:36.954 11:31:06 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:07:36.954 11:31:06 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:07:36.954 11:31:06 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:07:36.954 11:31:06 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:36.954 11:31:06 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:07:36.954 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:36.954 11:31:06 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:07:36.954 [2024-07-21 11:31:06.156407] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
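The xtrace above is common/autotest_common.sh choosing scratch space for the run: set_test_storage parses df -T into per-mount size/avail/use arrays, walks storage_candidates (the testdir, a /tmp/spdk.XXXXXX fallback, and the fallback root), and exports the first mount with enough headroom as SPDK_TEST_STORAGE, here the overlay root, with roughly 2 GiB requested plus a small margin. A condensed sketch of that traced control flow; it assumes the 1K-block to byte conversion implied by the traced values, and the real function in common/autotest_common.sh handles more cases (tmpfs growth, missing mounts) than shown:

    set_test_storage() {
        local requested_size=$1 target_space new_size mount target_dir
        local -A fss sizes avails uses
        # df -T prints 1K blocks; the traced sizes/avails/uses are in bytes
        while read -r _ fs size use avail _ mount; do
            fss["$mount"]=$fs
            sizes["$mount"]=$((size * 1024))
            avails["$mount"]=$((avail * 1024))
            uses["$mount"]=$((use * 1024))
        done < <(df -T | grep -v Filesystem)
        for target_dir in "${storage_candidates[@]}"; do   # set by the caller, as traced
            mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
            target_space=${avails[$mount]}
            ((target_space == 0 || target_space < requested_size)) && continue
            if ((target_space >= requested_size)); then
                # keep used+requested under 95% of the filesystem, as in the trace
                new_size=$((uses[$mount] + requested_size))
                ((new_size * 100 / sizes[$mount] > 95)) && continue
            fi
            export SPDK_TEST_STORAGE=$target_dir
            printf '* Found test storage at %s\n' "$target_dir"
            return 0
        done
    }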
00:07:36.954 [2024-07-21 11:31:06.156498] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2070508 ] 00:07:36.954 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.954 [2024-07-21 11:31:06.230696] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.954 [2024-07-21 11:31:06.268116] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:36.954 [2024-07-21 11:31:06.268264] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.212 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.212 INFO: Seed: 4052264714 00:07:37.212 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:07:37.212 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:07:37.212 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:37.212 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.212 #2 INITED exec/s: 0 rss: 61Mb 00:07:37.212 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:37.212 This may also happen if the target rejected all inputs we tried so far 00:07:37.470 NEW_FUNC[1/631]: 0x49e0e0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:07:37.470 NEW_FUNC[2/631]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:37.470 #9 NEW cov: 10707 ft: 10500 corp: 2/7b lim: 60 exec/s: 0 rss: 66Mb L: 6/6 MS: 2 CopyPart-CMP- DE: "\377\377\377\027"- 00:07:37.728 #10 NEW cov: 10721 ft: 13084 corp: 3/65b lim: 60 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 InsertRepeatedBytes- 00:07:37.728 #11 NEW cov: 10724 ft: 13989 corp: 4/72b lim: 60 exec/s: 0 rss: 69Mb L: 7/58 MS: 1 CrossOver- 00:07:38.020 #12 NEW cov: 10724 ft: 15501 corp: 5/103b lim: 60 exec/s: 0 rss: 69Mb L: 31/58 MS: 1 CrossOver- 00:07:38.020 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.020 #13 NEW cov: 10741 ft: 15594 corp: 6/122b lim: 60 exec/s: 0 rss: 69Mb L: 19/58 MS: 1 InsertRepeatedBytes- 00:07:38.278 #14 NEW cov: 10741 ft: 16273 corp: 7/129b lim: 60 exec/s: 0 rss: 69Mb L: 7/58 MS: 1 ChangeBit- 00:07:38.278 #17 NEW cov: 10741 ft: 16339 corp: 8/148b lim: 60 exec/s: 17 rss: 69Mb L: 19/58 MS: 3 EraseBytes-InsertByte-CrossOver- 00:07:38.278 #18 NEW cov: 10741 ft: 16734 corp: 9/206b lim: 60 exec/s: 18 rss: 69Mb L: 58/58 MS: 1 CrossOver- 00:07:38.536 #19 NEW cov: 10741 ft: 16747 corp: 10/223b lim: 60 exec/s: 19 rss: 69Mb L: 17/58 MS: 1 EraseBytes- 00:07:38.536 #20 NEW cov: 10741 ft: 16870 corp: 11/278b lim: 60 exec/s: 20 rss: 70Mb L: 55/58 MS: 1 CrossOver- 00:07:38.794 #21 NEW cov: 10741 ft: 16912 corp: 12/285b lim: 60 exec/s: 21 rss: 70Mb L: 7/58 MS: 1 InsertByte- 00:07:38.794 #22 NEW cov: 10741 ft: 16968 corp: 13/299b lim: 60 exec/s: 22 rss: 70Mb L: 14/58 MS: 1 CrossOver- 00:07:39.053 #23 NEW cov: 10741 ft: 17016 corp: 14/306b lim: 60 exec/s: 23 rss: 70Mb L: 7/58 MS: 1 CopyPart- 00:07:39.053 #24 NEW cov: 10748 ft: 17280 corp: 15/313b lim: 60 exec/s: 24 rss: 70Mb L: 7/58 MS: 1 ChangeBinInt- 00:07:39.311 #25 NEW cov: 10748 ft: 17424 corp: 16/373b lim: 60 exec/s: 12 rss: 70Mb L: 60/60 MS: 1 
InsertRepeatedBytes- 00:07:39.311 #25 DONE cov: 10748 ft: 17424 corp: 16/373b lim: 60 exec/s: 12 rss: 70Mb 00:07:39.311 ###### Recommended dictionary. ###### 00:07:39.311 "\377\377\377\027" # Uses: 0 00:07:39.311 ###### End of recommended dictionary. ###### 00:07:39.311 Done 25 runs in 2 second(s) 00:07:39.570 11:31:08 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:07:39.570 11:31:08 -- ../common.sh@72 -- # (( i++ )) 00:07:39.570 11:31:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.570 11:31:08 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:39.570 11:31:08 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:07:39.570 11:31:08 -- vfio/run.sh@23 -- # local timen=1 00:07:39.570 11:31:08 -- vfio/run.sh@24 -- # local core=0x1 00:07:39.570 11:31:08 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:39.570 11:31:08 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:07:39.570 11:31:08 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:07:39.570 11:31:08 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:07:39.570 11:31:08 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:07:39.570 11:31:08 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:39.570 11:31:08 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:07:39.570 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:39.570 11:31:08 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:07:39.570 [2024-07-21 11:31:08.781218] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:39.570 [2024-07-21 11:31:08.781292] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2070913 ] 00:07:39.570 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.570 [2024-07-21 11:31:08.853719] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.570 [2024-07-21 11:31:08.889195] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:39.570 [2024-07-21 11:31:08.889344] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.829 INFO: Running with entropic power schedule (0xFF, 100). 00:07:39.829 INFO: Seed: 2377309702 00:07:39.829 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:07:39.829 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:07:39.829 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:39.829 INFO: A corpus is not provided, starting from an empty corpus 00:07:39.829 #2 INITED exec/s: 0 rss: 61Mb 00:07:39.829 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:07:39.829 This may also happen if the target rejected all inputs we tried so far 00:07:39.829 [2024-07-21 11:31:09.192508] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:39.829 [2024-07-21 11:31:09.192541] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:39.829 [2024-07-21 11:31:09.192560] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:40.346 NEW_FUNC[1/637]: 0x49e680 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:07:40.346 NEW_FUNC[2/637]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:40.346 #22 NEW cov: 10722 ft: 10239 corp: 2/32b lim: 40 exec/s: 0 rss: 66Mb L: 31/31 MS: 5 InsertByte-EraseBytes-ChangeByte-InsertByte-InsertRepeatedBytes- 00:07:40.346 [2024-07-21 11:31:09.669188] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:40.346 [2024-07-21 11:31:09.669223] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:40.346 [2024-07-21 11:31:09.669242] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:40.604 NEW_FUNC[1/1]: 0x1c968f0 in msg_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:808 00:07:40.604 #23 NEW cov: 10745 ft: 12781 corp: 3/72b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CopyPart- 00:07:40.604 [2024-07-21 11:31:09.867882] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:40.604 [2024-07-21 11:31:09.867910] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:40.604 [2024-07-21 11:31:09.867928] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:40.604 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:40.604 #24 NEW cov: 10762 ft: 14800 corp: 4/112b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:07:40.863 [2024-07-21 11:31:10.063835] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:40.863 [2024-07-21 11:31:10.063861] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:40.863 [2024-07-21 11:31:10.063878] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:40.863 #30 NEW cov: 10762 ft: 15023 corp: 5/140b lim: 40 exec/s: 30 rss: 68Mb L: 28/40 MS: 1 InsertRepeatedBytes- 00:07:40.863 [2024-07-21 11:31:10.270777] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:40.863 [2024-07-21 11:31:10.270801] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:40.863 [2024-07-21 11:31:10.270819] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:41.121 #31 NEW cov: 10762 ft: 15312 corp: 6/177b lim: 40 exec/s: 31 rss: 68Mb L: 37/40 MS: 1 InsertRepeatedBytes- 00:07:41.121 [2024-07-21 11:31:10.466499] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:41.121 [2024-07-21 11:31:10.466524] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:41.121 [2024-07-21 11:31:10.466541] vfio_user.c: 
144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:41.380 #32 NEW cov: 10762 ft: 15550 corp: 7/204b lim: 40 exec/s: 32 rss: 68Mb L: 27/40 MS: 1 InsertRepeatedBytes- 00:07:41.380 [2024-07-21 11:31:10.662156] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:41.380 [2024-07-21 11:31:10.662180] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:41.380 [2024-07-21 11:31:10.662198] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:41.380 #33 NEW cov: 10762 ft: 16239 corp: 8/233b lim: 40 exec/s: 33 rss: 68Mb L: 29/40 MS: 1 EraseBytes- 00:07:41.639 [2024-07-21 11:31:10.859878] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:41.639 [2024-07-21 11:31:10.859902] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:41.639 [2024-07-21 11:31:10.859920] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:41.639 #34 NEW cov: 10769 ft: 16320 corp: 9/273b lim: 40 exec/s: 34 rss: 68Mb L: 40/40 MS: 1 ChangeBit- 00:07:41.639 [2024-07-21 11:31:11.057755] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:41.639 [2024-07-21 11:31:11.057778] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:41.639 [2024-07-21 11:31:11.057796] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:41.898 #35 NEW cov: 10769 ft: 16392 corp: 10/304b lim: 40 exec/s: 17 rss: 68Mb L: 31/40 MS: 1 ChangeBinInt- 00:07:41.898 #35 DONE cov: 10769 ft: 16392 corp: 10/304b lim: 40 exec/s: 17 rss: 68Mb 00:07:41.898 Done 35 runs in 2 second(s) 00:07:42.158 11:31:11 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:07:42.158 11:31:11 -- ../common.sh@72 -- # (( i++ )) 00:07:42.158 11:31:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.158 11:31:11 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:42.158 11:31:11 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:07:42.158 11:31:11 -- vfio/run.sh@23 -- # local timen=1 00:07:42.158 11:31:11 -- vfio/run.sh@24 -- # local core=0x1 00:07:42.158 11:31:11 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:42.158 11:31:11 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:07:42.158 11:31:11 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:07:42.158 11:31:11 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:07:42.158 11:31:11 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:07:42.158 11:31:11 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:42.158 11:31:11 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:07:42.158 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:42.158 11:31:11 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:07:42.158 [2024-07-21 11:31:11.465606] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:42.158 [2024-07-21 11:31:11.465695] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2071460 ] 00:07:42.158 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.158 [2024-07-21 11:31:11.538566] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.158 [2024-07-21 11:31:11.573490] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:42.158 [2024-07-21 11:31:11.573637] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.417 INFO: Running with entropic power schedule (0xFF, 100). 00:07:42.417 INFO: Seed: 761361677 00:07:42.417 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:07:42.417 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:07:42.417 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:42.417 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.417 #2 INITED exec/s: 0 rss: 60Mb 00:07:42.417 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:42.417 This may also happen if the target rejected all inputs we tried so far 00:07:42.676 [2024-07-21 11:31:11.879231] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:07:42.676 [2024-07-21 11:31:11.879279] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:42.935 NEW_FUNC[1/638]: 0x49f060 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:07:42.935 NEW_FUNC[2/638]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:42.936 #6 NEW cov: 10723 ft: 10653 corp: 2/49b lim: 80 exec/s: 0 rss: 67Mb L: 48/48 MS: 4 ChangeByte-CopyPart-CrossOver-InsertRepeatedBytes- 00:07:43.199 [2024-07-21 11:31:12.412021] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:07:43.199 [2024-07-21 11:31:12.412060] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:43.199 #7 NEW cov: 10737 ft: 13740 corp: 3/97b lim: 80 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 CopyPart- 00:07:43.199 [2024-07-21 11:31:12.608166] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:07:43.199 [2024-07-21 11:31:12.608196] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:43.458 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.458 #12 NEW cov: 10754 ft: 14975 corp: 4/146b lim: 80 exec/s: 0 rss: 69Mb L: 49/49 MS: 5 ChangeBit-ChangeByte-InsertByte-EraseBytes-CrossOver- 00:07:43.458 [2024-07-21 11:31:12.824619] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:07:43.458 [2024-07-21 11:31:12.824649] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:43.718 #13 NEW cov: 10754 ft: 15485 corp: 5/194b 
lim: 80 exec/s: 13 rss: 69Mb L: 48/49 MS: 1 ShuffleBytes- 00:07:43.718 [2024-07-21 11:31:13.028179] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:07:43.718 [2024-07-21 11:31:13.028213] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:43.718 #14 NEW cov: 10754 ft: 15836 corp: 6/244b lim: 80 exec/s: 14 rss: 69Mb L: 50/50 MS: 1 CopyPart- 00:07:43.978 [2024-07-21 11:31:13.229631] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:43.978 #15 NEW cov: 10755 ft: 16075 corp: 7/252b lim: 80 exec/s: 15 rss: 69Mb L: 8/50 MS: 1 InsertRepeatedBytes- 00:07:44.236 [2024-07-21 11:31:13.428269] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:44.236 #17 NEW cov: 10755 ft: 16447 corp: 8/298b lim: 80 exec/s: 17 rss: 69Mb L: 46/50 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:44.236 [2024-07-21 11:31:13.636117] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:07:44.236 [2024-07-21 11:31:13.636150] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:44.494 #18 NEW cov: 10762 ft: 16601 corp: 9/346b lim: 80 exec/s: 18 rss: 69Mb L: 48/50 MS: 1 ChangeByte- 00:07:44.494 [2024-07-21 11:31:13.834516] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:07:44.494 [2024-07-21 11:31:13.834546] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:44.752 #19 NEW cov: 10762 ft: 16896 corp: 10/394b lim: 80 exec/s: 9 rss: 69Mb L: 48/50 MS: 1 ShuffleBytes- 00:07:44.752 #19 DONE cov: 10762 ft: 16896 corp: 10/394b lim: 80 exec/s: 9 rss: 69Mb 00:07:44.752 Done 19 runs in 2 second(s) 00:07:45.011 11:31:14 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:07:45.011 11:31:14 -- ../common.sh@72 -- # (( i++ )) 00:07:45.011 11:31:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.011 11:31:14 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:45.011 11:31:14 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:07:45.011 11:31:14 -- vfio/run.sh@23 -- # local timen=1 00:07:45.011 11:31:14 -- vfio/run.sh@24 -- # local core=0x1 00:07:45.011 11:31:14 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:45.011 11:31:14 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:07:45.011 11:31:14 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:07:45.011 11:31:14 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:07:45.011 11:31:14 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:07:45.011 11:31:14 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:45.011 11:31:14 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:07:45.011 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:45.011 11:31:14 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y 
/tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:07:45.011 [2024-07-21 11:31:14.252148] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:45.011 [2024-07-21 11:31:14.252221] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2072004 ] 00:07:45.011 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.011 [2024-07-21 11:31:14.324476] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.011 [2024-07-21 11:31:14.359530] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:45.011 [2024-07-21 11:31:14.359676] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.270 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.270 INFO: Seed: 3545360751 00:07:45.270 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:07:45.270 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:07:45.270 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:45.270 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.270 #2 INITED exec/s: 0 rss: 61Mb 00:07:45.270 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.270 This may also happen if the target rejected all inputs we tried so far 00:07:45.786 NEW_FUNC[1/622]: 0x49f740 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:07:45.786 NEW_FUNC[2/622]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:45.786 #17 NEW cov: 10579 ft: 10639 corp: 2/34b lim: 320 exec/s: 0 rss: 67Mb L: 33/33 MS: 5 CrossOver-ChangeByte-EraseBytes-ChangeByte-InsertRepeatedBytes- 00:07:46.045 NEW_FUNC[1/10]: 0x10f6570 in spdk_nvmf_request_complete /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:4474 00:07:46.045 NEW_FUNC[2/10]: 0x10f6930 in spdk_thread_exec_msg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/thread.h:546 00:07:46.045 #18 NEW cov: 10713 ft: 13346 corp: 3/67b lim: 320 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeByte- 00:07:46.303 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.303 #19 NEW cov: 10730 ft: 14569 corp: 4/100b lim: 320 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 CMP- DE: "\010\000\000\000"- 00:07:46.303 #25 NEW cov: 10730 ft: 14712 corp: 5/164b lim: 320 exec/s: 25 rss: 69Mb L: 64/64 MS: 1 CopyPart- 00:07:46.561 #26 NEW cov: 10730 ft: 15343 corp: 6/197b lim: 320 exec/s: 26 rss: 69Mb L: 33/64 MS: 1 ChangeBit- 00:07:46.819 #27 NEW cov: 10730 ft: 15739 corp: 7/230b lim: 320 exec/s: 27 rss: 69Mb L: 33/64 MS: 1 PersAutoDict- DE: "\010\000\000\000"- 00:07:47.076 #28 NEW cov: 10730 ft: 15979 corp: 8/287b lim: 320 exec/s: 28 rss: 69Mb L: 57/64 MS: 1 InsertRepeatedBytes- 00:07:47.334 #29 NEW cov: 10737 ft: 16094 corp: 9/412b lim: 320 exec/s: 29 rss: 69Mb L: 125/125 MS: 1 InsertRepeatedBytes- 00:07:47.334 #30 NEW cov: 10737 ft: 16278 corp: 10/469b lim: 320 exec/s: 15 rss: 70Mb L: 57/125 MS: 1 ChangeByte- 00:07:47.334 #30 DONE cov: 10737 ft: 16278 corp: 10/469b lim: 320 exec/s: 15 rss: 70Mb 00:07:47.334 ###### Recommended 
dictionary. ###### 00:07:47.334 "\010\000\000\000" # Uses: 2 00:07:47.334 ###### End of recommended dictionary. ###### 00:07:47.334 Done 30 runs in 2 second(s) 00:07:47.592 11:31:16 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:07:47.592 11:31:16 -- ../common.sh@72 -- # (( i++ )) 00:07:47.592 11:31:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.592 11:31:17 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:47.592 11:31:17 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:07:47.592 11:31:17 -- vfio/run.sh@23 -- # local timen=1 00:07:47.592 11:31:17 -- vfio/run.sh@24 -- # local core=0x1 00:07:47.592 11:31:17 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:47.592 11:31:17 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:07:47.592 11:31:17 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:07:47.592 11:31:17 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:07:47.592 11:31:17 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:07:47.592 11:31:17 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:47.592 11:31:17 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:07:47.592 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:47.592 11:31:17 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:07:47.850 [2024-07-21 11:31:17.043060] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:47.850 [2024-07-21 11:31:17.043161] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2072474 ] 00:07:47.850 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.850 [2024-07-21 11:31:17.119875] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.850 [2024-07-21 11:31:17.156357] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.850 [2024-07-21 11:31:17.156506] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.108 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.108 INFO: Seed: 2049419149 00:07:48.108 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:07:48.108 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:07:48.108 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:48.108 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.108 #2 INITED exec/s: 0 rss: 60Mb 00:07:48.108 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
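Each run above repeats the same vfio/run.sh recipe: make a private /tmp/vfio-user-<N> tree, rewrite the shared vfio-user JSON template so its domain paths point into that tree, then launch llvm_vfio_fuzz for one second. A sketch of a single iteration, with N and SPDK_ROOT as illustrative stand-ins for the traced values, and the redirect of sed into the per-run conf assumed from the -c argument that follows it:

    N=4
    SPDK_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    user_dir=/tmp/vfio-user-$N
    mkdir -p "$user_dir" "$user_dir/domain/1" "$user_dir/domain/2" \
             "$SPDK_ROOT/../corpus/llvm_vfio_$N"
    # retarget the template config at this run's private domain directories
    sed -e "s%/tmp/vfio-user/domain/1%$user_dir/domain/1%" \
        -e "s%/tmp/vfio-user/domain/2%$user_dir/domain/2%" \
        "$SPDK_ROOT/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" \
        > "$user_dir/fuzz_vfio_json.conf"
    # -m 0x1: single core; -s 0: mem_size from the trace; -t 1: one-second budget
    "$SPDK_ROOT/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" -m 0x1 -s 0 \
        -P "$SPDK_ROOT/../output/llvm/" -F "$user_dir/domain/1" \
        -c "$user_dir/fuzz_vfio_json.conf" -t 1 \
        -D "$SPDK_ROOT/../corpus/llvm_vfio_$N" -Y "$user_dir/domain/2" \
        -r "$user_dir/spdk$N.sock" -Z "$N"

The separate domain/1 (target) and domain/2 (io) directories and the per-run spdk<N>.sock RPC socket are what keep the seven runs from colliding even though they reuse one template config.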
00:07:48.108 This may also happen if the target rejected all inputs we tried so far 00:07:48.624 NEW_FUNC[1/632]: 0x49ffc0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:07:48.624 NEW_FUNC[2/632]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:48.624 #7 NEW cov: 10702 ft: 10426 corp: 2/92b lim: 320 exec/s: 0 rss: 67Mb L: 91/91 MS: 5 CopyPart-CopyPart-CopyPart-CopyPart-InsertRepeatedBytes- 00:07:48.624 #27 NEW cov: 10716 ft: 13193 corp: 3/188b lim: 320 exec/s: 0 rss: 68Mb L: 96/96 MS: 5 CMP-InsertByte-ChangeByte-EraseBytes-CrossOver- DE: "\377\372.\031\000 \000\000"- 00:07:48.890 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:48.890 #33 NEW cov: 10733 ft: 14424 corp: 4/284b lim: 320 exec/s: 0 rss: 69Mb L: 96/96 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:49.148 #34 NEW cov: 10736 ft: 15413 corp: 5/384b lim: 320 exec/s: 34 rss: 69Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:07:49.407 #35 NEW cov: 10736 ft: 15748 corp: 6/457b lim: 320 exec/s: 35 rss: 69Mb L: 73/100 MS: 1 EraseBytes- 00:07:49.665 #36 NEW cov: 10736 ft: 16125 corp: 7/548b lim: 320 exec/s: 36 rss: 69Mb L: 91/100 MS: 1 ChangeBit- 00:07:49.665 #37 NEW cov: 10736 ft: 16448 corp: 8/639b lim: 320 exec/s: 37 rss: 69Mb L: 91/100 MS: 1 CopyPart- 00:07:49.924 #43 NEW cov: 10743 ft: 16521 corp: 9/800b lim: 320 exec/s: 43 rss: 69Mb L: 161/161 MS: 1 InsertRepeatedBytes- 00:07:50.182 #44 NEW cov: 10743 ft: 16664 corp: 10/1047b lim: 320 exec/s: 22 rss: 69Mb L: 247/247 MS: 1 InsertRepeatedBytes- 00:07:50.182 #44 DONE cov: 10743 ft: 16664 corp: 10/1047b lim: 320 exec/s: 22 rss: 69Mb 00:07:50.182 ###### Recommended dictionary. ###### 00:07:50.182 "\377\372.\031\000 \000\000" # Uses: 3 00:07:50.182 "\000\000\000\000" # Uses: 0 00:07:50.182 ###### End of recommended dictionary. 
###### 00:07:50.182 Done 44 runs in 2 second(s) 00:07:50.440 11:31:19 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:07:50.440 11:31:19 -- ../common.sh@72 -- # (( i++ )) 00:07:50.440 11:31:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.440 11:31:19 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:50.440 11:31:19 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:07:50.440 11:31:19 -- vfio/run.sh@23 -- # local timen=1 00:07:50.440 11:31:19 -- vfio/run.sh@24 -- # local core=0x1 00:07:50.440 11:31:19 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:50.440 11:31:19 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:07:50.440 11:31:19 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:07:50.440 11:31:19 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:07:50.440 11:31:19 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:07:50.440 11:31:19 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:50.440 11:31:19 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:07:50.440 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:50.440 11:31:19 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:07:50.440 [2024-07-21 11:31:19.726238] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:50.440 [2024-07-21 11:31:19.726312] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2072844 ] 00:07:50.440 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.440 [2024-07-21 11:31:19.800614] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.440 [2024-07-21 11:31:19.837616] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.440 [2024-07-21 11:31:19.837776] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.771 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.771 INFO: Seed: 435418340 00:07:50.771 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:07:50.771 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:07:50.771 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:50.771 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.771 #2 INITED exec/s: 0 rss: 61Mb 00:07:50.771 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
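The empty-corpus warning printed before each run is expected here: every corpus directory starts with "0 files found", so libFuzzer has nothing interesting until it keeps its first input. In the status lines that follow, cov: counts instrumented coverage points hit, ft: coverage features, corp: kept inputs and their total bytes, lim: the current input-length cap, exec/s the throughput, rss memory, and the trailing MS: field names the mutation sequence that produced a NEW input. A hypothetical helper (not part of this repo) to pull the final DONE summary of each run out of a saved console log:

    # prints one row per run, e.g. "cov: 10748  ft: 17424  corp: 16/373b ..."
    awk '/#[0-9]+ DONE/ {
        row = ""
        for (i = 1; i <= NF; i++)
            if ($i ~ /:$/) row = row $i " " $(i + 1) "  "
        print row
    }' console.log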
00:07:50.772 This may also happen if the target rejected all inputs we tried so far 00:07:50.772 [2024-07-21 11:31:20.118494] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:50.772 [2024-07-21 11:31:20.118547] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:51.311 NEW_FUNC[1/638]: 0x4a09c0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:07:51.311 NEW_FUNC[2/638]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:51.311 #23 NEW cov: 10729 ft: 10246 corp: 2/45b lim: 120 exec/s: 0 rss: 66Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:07:51.311 [2024-07-21 11:31:20.620842] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:51.311 [2024-07-21 11:31:20.620891] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:51.569 #29 NEW cov: 10743 ft: 13296 corp: 3/89b lim: 120 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 ChangeBit- 00:07:51.569 [2024-07-21 11:31:20.825831] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:51.569 [2024-07-21 11:31:20.825864] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:51.569 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.569 #32 NEW cov: 10760 ft: 13947 corp: 4/170b lim: 120 exec/s: 0 rss: 69Mb L: 81/81 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:51.827 [2024-07-21 11:31:21.041351] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:51.827 [2024-07-21 11:31:21.041384] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:51.827 #34 NEW cov: 10760 ft: 14960 corp: 5/252b lim: 120 exec/s: 34 rss: 69Mb L: 82/82 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:51.827 [2024-07-21 11:31:21.245236] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:51.827 [2024-07-21 11:31:21.245268] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:52.085 #35 NEW cov: 10760 ft: 15469 corp: 6/308b lim: 120 exec/s: 35 rss: 69Mb L: 56/82 MS: 1 InsertRepeatedBytes- 00:07:52.085 [2024-07-21 11:31:21.452322] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:52.085 [2024-07-21 11:31:21.452355] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:52.344 #36 NEW cov: 10760 ft: 15573 corp: 7/352b lim: 120 exec/s: 36 rss: 69Mb L: 44/82 MS: 1 ChangeByte- 00:07:52.344 [2024-07-21 11:31:21.657003] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:52.344 [2024-07-21 11:31:21.657036] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:52.602 #37 NEW cov: 10760 ft: 15773 corp: 8/462b lim: 120 exec/s: 37 rss: 69Mb L: 110/110 MS: 1 CopyPart- 00:07:52.602 [2024-07-21 11:31:21.860784] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:52.602 [2024-07-21 11:31:21.860816] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:52.602 #38 NEW cov: 10767 ft: 15998 corp: 9/573b lim: 120 exec/s: 38 rss: 69Mb L: 111/111 MS: 1 InsertRepeatedBytes- 
00:07:52.861 [2024-07-21 11:31:22.065553] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:52.861 [2024-07-21 11:31:22.065584] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:52.861 #39 NEW cov: 10767 ft: 16053 corp: 10/654b lim: 120 exec/s: 19 rss: 69Mb L: 81/111 MS: 1 ChangeBit- 00:07:52.861 #39 DONE cov: 10767 ft: 16053 corp: 10/654b lim: 120 exec/s: 19 rss: 69Mb 00:07:52.861 Done 39 runs in 2 second(s) 00:07:53.119 11:31:22 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:07:53.119 11:31:22 -- ../common.sh@72 -- # (( i++ )) 00:07:53.119 11:31:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.119 11:31:22 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:53.119 11:31:22 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:07:53.119 11:31:22 -- vfio/run.sh@23 -- # local timen=1 00:07:53.119 11:31:22 -- vfio/run.sh@24 -- # local core=0x1 00:07:53.119 11:31:22 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:53.119 11:31:22 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:07:53.119 11:31:22 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:07:53.119 11:31:22 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:07:53.119 11:31:22 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:07:53.119 11:31:22 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:53.119 11:31:22 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:07:53.119 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:53.119 11:31:22 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:07:53.119 [2024-07-21 11:31:22.485527] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:53.119 [2024-07-21 11:31:22.485600] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2073391 ] 00:07:53.119 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.378 [2024-07-21 11:31:22.558435] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.378 [2024-07-21 11:31:22.593337] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:53.378 [2024-07-21 11:31:22.593505] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.378 INFO: Running with entropic power schedule (0xFF, 100). 
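The (( i++ )), (( i < fuzz_num )) and rm -rf steps above are the short-fuzz driver moving on to the last fuzzer type. Reconstructed from the ../common.sh and vfio/run.sh xtrace earlier in the log; variable names follow the trace, the bodies are condensed, SPDK_ROOT stands in for the workspace path, and the cleanup helper's shape is an assumption (its real definition lives in vfio/run.sh):

    cleanup() {                    # assumed shape: drop the per-fuzzer scratch trees
        local dir
        for dir in "$@"; do
            [[ -d $dir ]] && rm -rf "$dir"
        done
    }

    fuzzfile=$SPDK_ROOT/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c
    fuzz_num=$(grep -c '\.fn =' "$fuzzfile")   # 7 entry points registered here
    ((fuzz_num != 0))
    trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT
    for ((i = 0; i < fuzz_num; i++)); do
        start_llvm_fuzz "$i" 1 0x1             # fuzzer_type, 1s budget, core mask 0x1
        rm -rf "/tmp/vfio-user-$i"             # the per-run teardown seen after each Done
    done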
00:07:53.378 INFO: Seed: 3194407975 00:07:53.378 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:07:53.378 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:07:53.378 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:53.378 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.378 #2 INITED exec/s: 0 rss: 61Mb 00:07:53.378 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:53.378 This may also happen if the target rejected all inputs we tried so far 00:07:53.636 [2024-07-21 11:31:22.881524] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:53.636 [2024-07-21 11:31:22.881569] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:53.894 NEW_FUNC[1/635]: 0x4a16b0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:07:53.894 NEW_FUNC[2/635]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:53.894 #15 NEW cov: 10611 ft: 10689 corp: 2/34b lim: 90 exec/s: 0 rss: 67Mb L: 33/33 MS: 3 ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:07:54.152 [2024-07-21 11:31:23.362798] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.152 [2024-07-21 11:31:23.362845] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:54.152 NEW_FUNC[1/3]: 0x1380aa0 in handle_cmd_req /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:5534 00:07:54.152 NEW_FUNC[2/3]: 0x13aa490 in handle_sq_tdbl_write /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:2539 00:07:54.152 #16 NEW cov: 10732 ft: 13350 corp: 3/71b lim: 90 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 CMP- DE: "\377\377\377\006"- 00:07:54.152 [2024-07-21 11:31:23.569997] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.152 [2024-07-21 11:31:23.570029] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:54.410 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.410 #20 NEW cov: 10752 ft: 13763 corp: 4/80b lim: 90 exec/s: 0 rss: 69Mb L: 9/37 MS: 4 ChangeByte-CopyPart-ShuffleBytes-CMP- DE: "J\025\345\003\000\000\000\000"- 00:07:54.410 [2024-07-21 11:31:23.786335] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.410 [2024-07-21 11:31:23.786367] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:54.668 #21 NEW cov: 10752 ft: 14657 corp: 5/120b lim: 90 exec/s: 21 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:54.668 [2024-07-21 11:31:23.992880] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.668 [2024-07-21 11:31:23.992910] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:54.926 #22 NEW cov: 10752 ft: 15149 corp: 6/160b lim: 90 exec/s: 22 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:54.926 [2024-07-21 11:31:24.199533] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.926 [2024-07-21 11:31:24.199564] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return 
failure 00:07:54.926 #23 NEW cov: 10752 ft: 15581 corp: 7/193b lim: 90 exec/s: 23 rss: 69Mb L: 33/40 MS: 1 ChangeByte- 00:07:55.185 [2024-07-21 11:31:24.405159] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.185 [2024-07-21 11:31:24.405192] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.185 #24 NEW cov: 10752 ft: 15630 corp: 8/202b lim: 90 exec/s: 24 rss: 69Mb L: 9/40 MS: 1 ChangeBinInt- 00:07:55.185 [2024-07-21 11:31:24.609493] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.185 [2024-07-21 11:31:24.609524] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.444 #25 NEW cov: 10759 ft: 15951 corp: 9/235b lim: 90 exec/s: 25 rss: 69Mb L: 33/40 MS: 1 ChangeByte- 00:07:55.444 [2024-07-21 11:31:24.811852] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.444 [2024-07-21 11:31:24.811884] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.704 #31 NEW cov: 10759 ft: 16318 corp: 10/269b lim: 90 exec/s: 15 rss: 69Mb L: 34/40 MS: 1 InsertByte- 00:07:55.704 #31 DONE cov: 10759 ft: 16318 corp: 10/269b lim: 90 exec/s: 15 rss: 69Mb 00:07:55.704 ###### Recommended dictionary. ###### 00:07:55.704 "\377\377\377\006" # Uses: 1 00:07:55.704 "J\025\345\003\000\000\000\000" # Uses: 0 00:07:55.704 ###### End of recommended dictionary. ###### 00:07:55.704 Done 31 runs in 2 second(s) 00:07:55.963 11:31:25 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:07:55.963 11:31:25 -- ../common.sh@72 -- # (( i++ )) 00:07:55.963 11:31:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.963 11:31:25 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:07:55.963 00:07:55.963 real 0m19.349s 00:07:55.963 user 0m27.478s 00:07:55.963 sys 0m1.797s 00:07:55.963 11:31:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.963 11:31:25 -- common/autotest_common.sh@10 -- # set +x 00:07:55.963 ************************************ 00:07:55.963 END TEST vfio_fuzz 00:07:55.963 ************************************ 00:07:55.963 11:31:25 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:07:55.963 00:07:55.963 real 1m22.489s 00:07:55.963 user 2m6.360s 00:07:55.963 sys 0m9.559s 00:07:55.963 11:31:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.963 11:31:25 -- common/autotest_common.sh@10 -- # set +x 00:07:55.963 ************************************ 00:07:55.963 END TEST llvm_fuzz 00:07:55.963 ************************************ 00:07:55.963 11:31:25 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:07:55.963 11:31:25 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:07:55.963 11:31:25 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:07:55.963 11:31:25 -- common/autotest_common.sh@712 -- # xtrace_disable 00:07:55.963 11:31:25 -- common/autotest_common.sh@10 -- # set +x 00:07:55.963 11:31:25 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:07:55.963 11:31:25 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:07:55.963 11:31:25 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:07:55.963 11:31:25 -- common/autotest_common.sh@10 -- # set +x 00:08:01.234 INFO: APP EXITING 00:08:01.234 INFO: killing all VMs 00:08:01.234 INFO: killing vhost app 00:08:01.234 INFO: EXIT DONE 00:08:04.554 Waiting for block devices as requested 00:08:04.554 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:04.554 0000:00:04.6 (8086 2021): 
vfio-pci -> ioatdma 00:08:04.554 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:04.554 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:04.813 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:04.813 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:04.813 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:05.072 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:05.072 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:05.072 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:05.330 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:05.330 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:05.330 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:05.589 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:05.589 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:05.589 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:05.847 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:09.136 Cleaning 00:08:09.136 Removing: /dev/shm/spdk_tgt_trace.pid2036899 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2034446 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2035695 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2036899 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2037532 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2037832 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2038153 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2038481 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2038757 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2039013 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2039295 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2039602 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2040469 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2043407 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2043797 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2044063 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2044276 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2044743 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2044854 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2045434 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2045570 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2045757 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2046014 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2046231 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2046318 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2046828 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2046979 00:08:09.136 Removing: /var/run/dpdk/spdk_pid2047261 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2047582 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2047764 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2047901 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2047970 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2048236 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2048519 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2048711 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2048897 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2049095 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2049387 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2049653 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2049934 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2050202 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2050396 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2050538 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2050799 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2051073 00:08:09.137 Removing: /var/run/dpdk/spdk_pid2051354 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2051620 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2051902 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2052049 00:08:09.396 Removing: 
/var/run/dpdk/spdk_pid2052235 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2052487 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2052777 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2053043 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2053326 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2053532 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2053725 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2053900 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2054186 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2054460 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2054741 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2055009 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2055221 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2055373 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2055601 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2055875 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2056167 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2056436 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2056722 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2056895 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2057090 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2057301 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2057587 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2057788 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2058046 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2058753 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2059541 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2060088 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2060592 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2060922 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2061467 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2061993 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2062306 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2062841 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2063264 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2063674 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2064211 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2064508 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2065047 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2065459 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2065878 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2066429 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2066757 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2067267 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2067772 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2068097 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2068641 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2069026 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2069469 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2070015 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2070508 00:08:09.396 Removing: /var/run/dpdk/spdk_pid2070913 00:08:09.655 Removing: /var/run/dpdk/spdk_pid2071460 00:08:09.655 Removing: /var/run/dpdk/spdk_pid2072004 00:08:09.655 Removing: /var/run/dpdk/spdk_pid2072474 00:08:09.655 Removing: /var/run/dpdk/spdk_pid2072844 00:08:09.655 Removing: /var/run/dpdk/spdk_pid2073391 00:08:09.655 Clean 00:08:09.655 killing process with pid 1989957 00:08:13.843 killing process with pid 1989954 00:08:13.843 killing process with pid 1989956 00:08:13.843 killing process with pid 1989955 00:08:13.843 11:31:42 -- common/autotest_common.sh@1436 -- # return 0 00:08:13.843 11:31:42 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup 00:08:13.843 11:31:42 -- common/autotest_common.sh@718 -- # xtrace_disable 00:08:13.843 11:31:42 -- common/autotest_common.sh@10 -- # set +x 00:08:13.843 11:31:42 -- spdk/autotest.sh@389 -- # timing_exit 
autotest 00:08:13.843 11:31:42 -- common/autotest_common.sh@718 -- # xtrace_disable 00:08:13.843 11:31:42 -- common/autotest_common.sh@10 -- # set +x 00:08:13.843 11:31:42 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:13.843 11:31:42 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:08:13.843 11:31:42 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:08:13.843 11:31:42 -- spdk/autotest.sh@394 -- # hash lcov 00:08:13.843 11:31:42 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:08:13.843 11:31:42 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:13.843 11:31:42 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:08:13.843 11:31:42 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:13.843 11:31:42 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:13.843 11:31:42 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.843 11:31:42 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.843 11:31:42 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.843 11:31:42 -- paths/export.sh@5 -- $ export PATH 00:08:13.843 11:31:42 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.843 11:31:42 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:08:13.843 11:31:42 -- common/autobuild_common.sh@435 -- $ date +%s 00:08:13.843 11:31:42 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1721554302.XXXXXX 00:08:13.843 11:31:42 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1721554302.Ahnxdc 00:08:13.843 11:31:42 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:08:13.843 11:31:42 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']' 00:08:13.843 11:31:42 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:13.843 11:31:42 -- 
common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:08:13.843 11:31:42 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:08:13.843 11:31:42 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:08:13.843 11:31:42 -- common/autobuild_common.sh@451 -- $ get_config_params 00:08:13.843 11:31:42 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:08:13.843 11:31:42 -- common/autotest_common.sh@10 -- $ set +x 00:08:13.843 11:31:42 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:08:13.843 11:31:42 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:08:13.843 11:31:42 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:13.843 11:31:42 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:08:13.843 11:31:42 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:08:13.843 11:31:42 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:08:13.843 11:31:42 -- spdk/autopackage.sh@19 -- $ timing_finish 00:08:13.843 11:31:42 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:08:13.843 11:31:42 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:08:13.843 11:31:42 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:13.843 11:31:42 -- spdk/autopackage.sh@20 -- $ exit 0 00:08:13.843 + [[ -n 1934331 ]] 00:08:13.843 + sudo kill 1934331 00:08:13.860 [Pipeline] } 00:08:13.876 [Pipeline] // stage 00:08:13.881 [Pipeline] } 00:08:13.896 [Pipeline] // timeout 00:08:13.901 [Pipeline] } 00:08:13.918 [Pipeline] // catchError 00:08:13.923 [Pipeline] } 00:08:13.939 [Pipeline] // wrap 00:08:13.945 [Pipeline] } 00:08:13.960 [Pipeline] // catchError 00:08:13.969 [Pipeline] stage 00:08:13.971 [Pipeline] { (Epilogue) 00:08:13.986 [Pipeline] catchError 00:08:13.987 [Pipeline] { 00:08:14.002 [Pipeline] echo 00:08:14.004 Cleanup processes 00:08:14.010 [Pipeline] sh 00:08:14.292 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:14.292 2082172 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:14.305 [Pipeline] sh 00:08:14.585 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:14.585 ++ grep -v 'sudo pgrep' 00:08:14.585 ++ awk '{print $1}' 00:08:14.585 + sudo kill -9 00:08:14.585 + true 00:08:14.597 [Pipeline] sh 00:08:14.878 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:08:14.878 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:08:14.878 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:08:15.822 [Pipeline] sh 00:08:16.103 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:08:16.103 Artifacts sizes are 
good 00:08:16.117 [Pipeline] archiveArtifacts 00:08:16.124 Archiving artifacts 00:08:16.174 [Pipeline] sh 00:08:16.528 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest 00:08:16.543 [Pipeline] cleanWs 00:08:16.553 [WS-CLEANUP] Deleting project workspace... 00:08:16.553 [WS-CLEANUP] Deferred wipeout is used... 00:08:16.559 [WS-CLEANUP] done 00:08:16.560 [Pipeline] } 00:08:16.580 [Pipeline] // catchError 00:08:16.593 [Pipeline] sh 00:08:16.868 + logger -p user.info -t JENKINS-CI 00:08:16.877 [Pipeline] } 00:08:16.894 [Pipeline] // stage 00:08:16.900 [Pipeline] } 00:08:16.917 [Pipeline] // node 00:08:16.943 [Pipeline] End of Pipeline 00:08:16.994 Finished: SUCCESS