00:00:00.001 Started by upstream project "autotest-nightly-lts" build number 2419
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3680
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.011 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.012 The recommended git tool is: git
00:00:00.012 using credential 00000000-0000-0000-0000-000000000002
00:00:00.015 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.026 Fetching changes from the remote Git repository
00:00:00.028 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.039 Using shallow fetch with depth 1
00:00:00.039 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.039 > git --version # timeout=10
00:00:00.050 > git --version # 'git version 2.39.2'
00:00:00.050 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.071 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.071 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.555 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.567 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.579 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:02.579 > git config core.sparsecheckout # timeout=10
00:00:02.588 > git read-tree -mu HEAD # timeout=10
00:00:02.604 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:02.626 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:02.626 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:02.740 [Pipeline] Start of Pipeline
00:00:02.756 [Pipeline] library
00:00:02.757 Loading library shm_lib@master
00:00:02.758 Library shm_lib@master is cached. Copying from home.
00:00:02.776 [Pipeline] node
00:00:02.793 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:02.794 [Pipeline] {
00:00:02.805 [Pipeline] catchError
00:00:02.807 [Pipeline] {
00:00:02.823 [Pipeline] wrap
00:00:02.834 [Pipeline] {
00:00:02.844 [Pipeline] stage
00:00:02.847 [Pipeline] { (Prologue)
00:00:03.048 [Pipeline] sh
00:00:03.329 + logger -p user.info -t JENKINS-CI
00:00:03.348 [Pipeline] echo
00:00:03.350 Node: WFP20
00:00:03.358 [Pipeline] sh
00:00:03.654 [Pipeline] setCustomBuildProperty
00:00:03.667 [Pipeline] echo
00:00:03.669 Cleanup processes
00:00:03.674 [Pipeline] sh
00:00:03.956 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.956 2602567 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.968 [Pipeline] sh
00:00:04.247 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.247 ++ grep -v 'sudo pgrep'
00:00:04.247 ++ awk '{print $1}'
00:00:04.247 + sudo kill -9
00:00:04.247 + true
00:00:04.262 [Pipeline] cleanWs
00:00:04.274 [WS-CLEANUP] Deleting project workspace...
00:00:04.274 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.279 [WS-CLEANUP] done
00:00:04.283 [Pipeline] setCustomBuildProperty
00:00:04.296 [Pipeline] sh
00:00:04.575 + sudo git config --global --replace-all safe.directory '*'
00:00:04.669 [Pipeline] httpRequest
00:00:05.310 [Pipeline] echo
00:00:05.311 Sorcerer 10.211.164.20 is alive
00:00:05.321 [Pipeline] retry
00:00:05.322 [Pipeline] {
00:00:05.334 [Pipeline] httpRequest
00:00:05.337 HttpMethod: GET
00:00:05.338 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.338 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.342 Response Code: HTTP/1.1 200 OK
00:00:05.342 Success: Status code 200 is in the accepted range: 200,404
00:00:05.343 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.564 [Pipeline] }
00:00:06.578 [Pipeline] // retry
00:00:06.585 [Pipeline] sh
00:00:06.864 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.880 [Pipeline] httpRequest
00:00:07.219 [Pipeline] echo
00:00:07.221 Sorcerer 10.211.164.20 is alive
00:00:07.231 [Pipeline] retry
00:00:07.233 [Pipeline] {
00:00:07.246 [Pipeline] httpRequest
00:00:07.251 HttpMethod: GET
00:00:07.252 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:07.252 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:07.266 Response Code: HTTP/1.1 200 OK
00:00:07.266 Success: Status code 200 is in the accepted range: 200,404
00:00:07.266 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:58.839 [Pipeline] }
00:00:58.858 [Pipeline] // retry
00:00:58.865 [Pipeline] sh
00:00:59.153 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:01:01.705 [Pipeline] sh
00:01:01.996 + git -C spdk log --oneline -n5
00:01:01.996 c13c99a5e test: Various fixes for Fedora40
00:01:01.996 726a04d70 test/nvmf: adjust timeout for bigger nvmes
00:01:01.996 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11
00:01:01.996 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched
00:01:01.996 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges
00:01:02.008 [Pipeline] }
00:01:02.023 [Pipeline] // stage
00:01:02.031 [Pipeline] stage
00:01:02.034 [Pipeline] { (Prepare)
00:01:02.051 [Pipeline] writeFile
00:01:02.066 [Pipeline] sh
00:01:02.351 + logger -p user.info -t JENKINS-CI
00:01:02.365 [Pipeline] sh
00:01:02.651 + logger -p user.info -t JENKINS-CI
00:01:02.663 [Pipeline] sh
00:01:02.949 + cat autorun-spdk.conf
00:01:02.949 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:02.949 SPDK_TEST_FUZZER_SHORT=1
00:01:02.949 SPDK_TEST_FUZZER=1
00:01:02.949 SPDK_RUN_UBSAN=1
00:01:02.957 RUN_NIGHTLY=1
00:01:02.962 [Pipeline] readFile
00:01:02.989 [Pipeline] withEnv
00:01:02.991 [Pipeline] {
00:01:03.003 [Pipeline] sh
00:01:03.289 + set -ex
00:01:03.289 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:03.289 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:03.289 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:03.289 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:03.289 ++ SPDK_TEST_FUZZER=1
00:01:03.289 ++ SPDK_RUN_UBSAN=1
00:01:03.289 ++ RUN_NIGHTLY=1
00:01:03.289 + case $SPDK_TEST_NVMF_NICS in
00:01:03.289 + DRIVERS=
00:01:03.289 + [[ -n '' ]]
00:01:03.289 + exit 0
00:01:03.299 [Pipeline] }
00:01:03.314 [Pipeline] // withEnv
00:01:03.319 [Pipeline] }
00:01:03.332 [Pipeline] // stage
00:01:03.341 [Pipeline] catchError
00:01:03.343 [Pipeline] {
00:01:03.356 [Pipeline] timeout
00:01:03.357 Timeout set to expire in 30 min
00:01:03.359 [Pipeline] {
00:01:03.373 [Pipeline] stage
00:01:03.375 [Pipeline] { (Tests)
00:01:03.389 [Pipeline] sh
00:01:03.675 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:03.675 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:03.675 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:03.675 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:03.675 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:03.675 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:03.675 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:03.675 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:03.675 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:03.675 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:03.675 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:03.675 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:03.675 + source /etc/os-release
00:01:03.675 ++ NAME='Fedora Linux'
00:01:03.675 ++ VERSION='39 (Cloud Edition)'
00:01:03.675 ++ ID=fedora
00:01:03.675 ++ VERSION_ID=39
00:01:03.675 ++ VERSION_CODENAME=
00:01:03.675 ++ PLATFORM_ID=platform:f39
00:01:03.675 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:03.675 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:03.675 ++ LOGO=fedora-logo-icon
00:01:03.675 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:03.675 ++ HOME_URL=https://fedoraproject.org/
00:01:03.675 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:03.675 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:03.675 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:03.675 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:03.675 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:03.675 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:03.675 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:03.675 ++ SUPPORT_END=2024-11-12
00:01:03.675 ++ VARIANT='Cloud Edition'
00:01:03.675 ++ VARIANT_ID=cloud
00:01:03.675 + uname -a
00:01:03.675 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:03.675 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:06.971 Hugepages
00:01:06.971 node hugesize free / total
00:01:06.971 node0 1048576kB 0 / 0
00:01:06.971 node0 2048kB 0 / 0
00:01:06.971 node1 1048576kB 0 / 0
00:01:06.971 node1 2048kB 0 / 0
00:01:06.971 
00:01:06.971 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:06.971 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:06.971 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:06.971 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:06.971 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:06.971 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:06.971 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:06.971 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:06.971 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:06.971 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:06.971 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:06.971 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:06.971 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:06.971 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:06.971 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:06.971 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:06.971 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:06.971 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:06.971 + rm -f /tmp/spdk-ld-path
00:01:06.971 + source autorun-spdk.conf
00:01:06.971 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:06.971 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:06.971 ++ SPDK_TEST_FUZZER=1
00:01:06.971 ++ SPDK_RUN_UBSAN=1
00:01:06.971 ++ RUN_NIGHTLY=1
00:01:06.971 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:06.971 + [[ -n '' ]]
00:01:06.971 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:06.971 + for M in /var/spdk/build-*-manifest.txt
00:01:06.971 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:01:06.971 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:06.971 + for M in /var/spdk/build-*-manifest.txt
00:01:06.971 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:06.971 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:06.971 + for M in /var/spdk/build-*-manifest.txt
00:01:06.971 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:06.971 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:06.971 ++ uname
00:01:06.971 + [[ Linux == \L\i\n\u\x ]]
00:01:06.971 + sudo dmesg -T
00:01:06.971 + sudo dmesg --clear
00:01:06.971 + dmesg_pid=2603456
00:01:06.971 + [[ Fedora Linux == FreeBSD ]]
00:01:06.971 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:06.971 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:06.971 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:06.971 + [[ -x /usr/src/fio-static/fio ]]
00:01:06.971 + sudo dmesg -Tw
00:01:06.971 + export FIO_BIN=/usr/src/fio-static/fio
00:01:06.971 + FIO_BIN=/usr/src/fio-static/fio
00:01:06.971 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:06.971 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:06.971 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:06.971 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:06.971 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:06.971 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:06.971 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:06.971 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:06.971 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:06.971 Test configuration:
00:01:06.971 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:06.971 SPDK_TEST_FUZZER_SHORT=1
00:01:06.971 SPDK_TEST_FUZZER=1
00:01:06.971 SPDK_RUN_UBSAN=1
00:01:06.971 RUN_NIGHTLY=1
23:59:32 -- common/autotest_common.sh@1689 -- $ [[ n == y ]]
00:01:06.971 23:59:32 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:01:06.971 23:59:32 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:01:06.971 23:59:32 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:01:06.971 23:59:32 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:01:06.971 23:59:32 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:06.971 23:59:32 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:06.971 23:59:32 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:06.971 23:59:32 -- paths/export.sh@5 -- $ export PATH
00:01:06.971 23:59:32 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:06.971 23:59:32 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:01:06.971 23:59:32 -- common/autobuild_common.sh@440 -- $ date +%s
00:01:06.971 23:59:32 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732921172.XXXXXX
00:01:06.971 23:59:32 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732921172.uteNvd
00:01:06.971 23:59:32 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
00:01:06.971 23:59:32 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']'
00:01:06.971 23:59:32 -- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:01:06.971 23:59:32 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:06.972 23:59:32 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:06.972 23:59:32 -- common/autobuild_common.sh@456 -- $ get_config_params
00:01:06.972 23:59:32 -- common/autotest_common.sh@397 -- $ xtrace_disable
00:01:06.972 23:59:32 -- common/autotest_common.sh@10 -- $ set +x
00:01:06.972 23:59:32 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:01:06.972 23:59:32 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:06.972 23:59:32 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:06.972 23:59:32 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:06.972 23:59:32 -- spdk/autobuild.sh@16 -- $ date -u
00:01:06.972 Fri Nov 29 10:59:32 PM UTC 2024
00:01:06.972 23:59:32 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:06.972 LTS-67-gc13c99a5e
00:01:06.972 23:59:32 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:06.972 23:59:32 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:06.972 23:59:32 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:06.972 23:59:32 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:01:06.972 23:59:32 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:01:06.972 23:59:32 -- common/autotest_common.sh@10 -- $ set +x
00:01:06.972 ************************************
00:01:06.972 START TEST ubsan
00:01:06.972 ************************************
00:01:06.972 23:59:32 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan'
00:01:06.972 using ubsan
00:01:06.972 
00:01:06.972 real 0m0.000s
00:01:06.972 user 0m0.000s
00:01:06.972 sys 0m0.000s
00:01:06.972 23:59:32 -- common/autotest_common.sh@1115 -- $ xtrace_disable
00:01:06.972 23:59:32 -- common/autotest_common.sh@10 -- $ set +x
00:01:06.972 ************************************
00:01:06.972 END TEST ubsan
00:01:06.972 ************************************
00:01:06.972 23:59:32 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:06.972 23:59:32 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:06.972 23:59:32 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:06.972 23:59:32 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:01:06.972 23:59:32 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:01:06.972 23:59:32 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:01:06.972 23:59:32 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
00:01:06.972 23:59:32 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:01:06.972 23:59:32 -- common/autotest_common.sh@10 -- $ set +x
00:01:06.972 ************************************
00:01:06.972 START TEST autobuild_llvm_precompile
00:01:06.972 ************************************
00:01:06.972 23:59:32 -- common/autotest_common.sh@1114 -- $ _llvm_precompile
00:01:06.972 23:59:32 -- common/autobuild_common.sh@32 -- $ clang --version
00:01:06.972 23:59:32 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39)
00:01:06.972 Target: x86_64-redhat-linux-gnu
00:01:06.972 Thread model: posix
00:01:06.972 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:01:06.972 23:59:32 -- common/autobuild_common.sh@33 -- $ clang_num=17
00:01:06.972 23:59:32 -- common/autobuild_common.sh@35 -- $ export CC=clang-17
00:01:06.972 23:59:32 -- common/autobuild_common.sh@35 -- $ CC=clang-17
00:01:06.972 23:59:32 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17
00:01:06.972 23:59:32 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17
00:01:06.972 23:59:32 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:01:06.972 23:59:32 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:01:06.972 23:59:32 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]]
00:01:06.972 23:59:32 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a'
00:01:06.972 23:59:32 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:01:07.232 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:07.232 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:07.801 Using 'verbs' RDMA provider
00:01:23.271 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:01:35.495 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:01:35.495 Creating mk/config.mk...done.
00:01:35.495 Creating mk/cc.flags.mk...done.
00:01:35.495 Type 'make' to build.
00:01:35.495 
00:01:35.495 real 0m28.369s
00:01:35.495 user 0m12.377s
00:01:35.495 sys 0m15.347s
00:00:00 -- common/autotest_common.sh@1115 -- $ xtrace_disable
00:01:35.495 00:00:00 -- common/autotest_common.sh@10 -- $ set +x
00:01:35.495 ************************************
00:01:35.495 END TEST autobuild_llvm_precompile
00:01:35.495 ************************************
00:01:35.495 00:00:00 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:35.495 00:00:00 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:35.495 00:00:00 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:35.495 00:00:00 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:01:35.495 00:00:00 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:01:35.754 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:35.754 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:36.014 Using 'verbs' RDMA provider
00:01:48.816 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:01.053 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:01.053 Creating mk/config.mk...done.
00:02:01.053 Creating mk/cc.flags.mk...done.
00:02:01.053 Type 'make' to build.
00:02:01.053 00:00:25 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:02:01.053 00:00:25 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:02:01.053 00:00:25 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:02:01.053 00:00:25 -- common/autotest_common.sh@10 -- $ set +x
00:02:01.053 ************************************
00:02:01.053 START TEST make
00:02:01.053 ************************************
00:02:01.053 00:00:25 -- common/autotest_common.sh@1114 -- $ make -j112
00:02:01.053 make[1]: Nothing to be done for 'all'.
00:02:01.986 The Meson build system
00:02:01.986 Version: 1.5.0
00:02:01.986 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:01.986 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:01.986 Build type: native build
00:02:01.986 Project name: libvfio-user
00:02:01.986 Project version: 0.0.1
00:02:01.986 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:02:01.986 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:02:01.986 Host machine cpu family: x86_64
00:02:01.986 Host machine cpu: x86_64
00:02:01.986 Run-time dependency threads found: YES
00:02:01.986 Library dl found: YES
00:02:01.986 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:01.986 Run-time dependency json-c found: YES 0.17
00:02:01.986 Run-time dependency cmocka found: YES 1.1.7
00:02:01.986 Program pytest-3 found: NO
00:02:01.986 Program flake8 found: NO
00:02:01.986 Program misspell-fixer found: NO
00:02:01.986 Program restructuredtext-lint found: NO
00:02:01.986 Program valgrind found: YES (/usr/bin/valgrind)
00:02:01.986 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:01.986 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:01.986 Compiler for C supports arguments -Wwrite-strings: YES
00:02:01.986 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:01.986 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:01.986 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:01.986 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:01.986 Build targets in project: 8
00:02:01.986 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:02:01.986 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:02:01.986 
00:02:01.986 libvfio-user 0.0.1
00:02:01.986 
00:02:01.986 User defined options
00:02:01.986 buildtype : debug
00:02:01.986 default_library: static
00:02:01.986 libdir : /usr/local/lib
00:02:01.986 
00:02:01.986 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:02.243 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:02.243 [1/36] Compiling C object samples/lspci.p/lspci.c.o
00:02:02.243 [2/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:02:02.243 [3/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:02:02.243 [4/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:02:02.243 [5/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:02:02.243 [6/36] Compiling C object samples/null.p/null.c.o
00:02:02.243 [7/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:02:02.243 [8/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:02:02.243 [9/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:02:02.243 [10/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:02:02.243 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:02:02.243 [12/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:02:02.243 [13/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:02:02.243 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:02:02.243 [15/36] Compiling C object test/unit_tests.p/mocks.c.o
00:02:02.243 [16/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:02:02.243 [17/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:02:02.243 [18/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:02:02.243 [19/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:02:02.243 [20/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:02:02.243 [21/36] Compiling C object samples/server.p/server.c.o
00:02:02.243 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:02:02.243 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:02:02.243 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:02:02.243 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:02:02.243 [26/36] Compiling C object samples/client.p/client.c.o
00:02:02.243 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:02:02.243 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:02:02.243 [29/36] Linking static target lib/libvfio-user.a
00:02:02.500 [30/36] Linking target samples/client
00:02:02.500 [31/36] Linking target samples/server
00:02:02.500 [32/36] Linking target samples/lspci
00:02:02.500 [33/36] Linking target samples/shadow_ioeventfd_server
00:02:02.500 [34/36] Linking target samples/null
00:02:02.500 [35/36] Linking target test/unit_tests
00:02:02.500 [36/36] Linking target samples/gpio-pci-idio-16
00:02:02.500 INFO: autodetecting backend as ninja
00:02:02.500 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:02.500 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:02.758 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:02.758 ninja: no work to do.
00:02:08.054 The Meson build system
00:02:08.054 Version: 1.5.0
00:02:08.054 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
00:02:08.054 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp
00:02:08.054 Build type: native build
00:02:08.054 Program cat found: YES (/usr/bin/cat)
00:02:08.054 Project name: DPDK
00:02:08.054 Project version: 23.11.0
00:02:08.054 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:02:08.054 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:02:08.054 Host machine cpu family: x86_64
00:02:08.054 Host machine cpu: x86_64
00:02:08.054 Message: ## Building in Developer Mode ##
00:02:08.054 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:08.054 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:08.054 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:08.054 Program python3 found: YES (/usr/bin/python3)
00:02:08.054 Program cat found: YES (/usr/bin/cat)
00:02:08.054 Compiler for C supports arguments -march=native: YES
00:02:08.054 Checking for size of "void *" : 8
00:02:08.054 Checking for size of "void *" : 8 (cached)
00:02:08.054 Library m found: YES
00:02:08.054 Library numa found: YES
00:02:08.054 Has header "numaif.h" : YES
00:02:08.054 Library fdt found: NO
00:02:08.054 Library execinfo found: NO
00:02:08.054 Has header "execinfo.h" : YES
00:02:08.054 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:08.054 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:08.054 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:08.054 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:08.054 Run-time dependency openssl found: YES 3.1.1
00:02:08.054 Run-time dependency libpcap found: YES 1.10.4
00:02:08.054 Has header "pcap.h" with dependency libpcap: YES
00:02:08.054 Compiler for C supports arguments -Wcast-qual: YES
00:02:08.054 Compiler for C supports arguments -Wdeprecated: YES
00:02:08.054 Compiler for C supports arguments -Wformat: YES
00:02:08.054 Compiler for C supports arguments -Wformat-nonliteral: YES
00:02:08.054 Compiler for C supports arguments -Wformat-security: YES
00:02:08.054 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:08.054 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:08.054 Compiler for C supports arguments -Wnested-externs: YES
00:02:08.054 Compiler for C supports arguments -Wold-style-definition: YES
00:02:08.054 Compiler for C supports arguments -Wpointer-arith: YES
00:02:08.054 Compiler for C supports arguments -Wsign-compare: YES
00:02:08.054 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:08.054 Compiler for C supports arguments -Wundef: YES
00:02:08.054 Compiler for C supports arguments -Wwrite-strings: YES
00:02:08.054 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:08.054 Compiler for C supports arguments -Wno-packed-not-aligned: NO
00:02:08.054 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:08.054 Program objdump found: YES (/usr/bin/objdump)
00:02:08.054 Compiler for C supports arguments -mavx512f: YES
00:02:08.054 Checking if "AVX512 checking" compiles: YES
00:02:08.054 Fetching value of define "__SSE4_2__" : 1
00:02:08.054 Fetching value of define "__AES__" : 1
00:02:08.054 Fetching value of define "__AVX__" : 1
00:02:08.054 Fetching value of define "__AVX2__" : 1
00:02:08.054 Fetching value of define "__AVX512BW__" : 1
00:02:08.054 Fetching value of define "__AVX512CD__" : 1
00:02:08.054 Fetching value of define "__AVX512DQ__" : 1
00:02:08.054 Fetching value of define "__AVX512F__" : 1
00:02:08.054 Fetching value of define "__AVX512VL__" : 1
00:02:08.054 Fetching value of define "__PCLMUL__" : 1
00:02:08.054 Fetching value of define "__RDRND__" : 1
00:02:08.054 Fetching value of define "__RDSEED__" : 1
00:02:08.054 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:08.054 Fetching value of define "__znver1__" : (undefined)
00:02:08.054 Fetching value of define "__znver2__" : (undefined)
00:02:08.054 Fetching value of define "__znver3__" : (undefined)
00:02:08.054 Fetching value of define "__znver4__" : (undefined)
00:02:08.054 Compiler for C supports arguments -Wno-format-truncation: NO
00:02:08.054 Message: lib/log: Defining dependency "log"
00:02:08.054 Message: lib/kvargs: Defining dependency "kvargs"
00:02:08.054 Message: lib/telemetry: Defining dependency "telemetry"
00:02:08.054 Checking for function "getentropy" : NO
00:02:08.054 Message: lib/eal: Defining dependency "eal"
00:02:08.054 Message: lib/ring: Defining dependency "ring"
00:02:08.054 Message: lib/rcu: Defining dependency "rcu"
00:02:08.054 Message: lib/mempool: Defining dependency "mempool"
00:02:08.054 Message: lib/mbuf: Defining dependency "mbuf"
00:02:08.054 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:08.054 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:08.054 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:08.054 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:08.054 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:08.054 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:08.054 Compiler for C supports arguments -mpclmul: YES
00:02:08.054 Compiler for C supports arguments -maes: YES
00:02:08.054 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:08.054 Compiler for C supports
arguments -mavx512bw: YES 00:02:08.054 Compiler for C supports arguments -mavx512dq: YES 00:02:08.054 Compiler for C supports arguments -mavx512vl: YES 00:02:08.054 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:08.054 Compiler for C supports arguments -mavx2: YES 00:02:08.054 Compiler for C supports arguments -mavx: YES 00:02:08.054 Message: lib/net: Defining dependency "net" 00:02:08.054 Message: lib/meter: Defining dependency "meter" 00:02:08.054 Message: lib/ethdev: Defining dependency "ethdev" 00:02:08.054 Message: lib/pci: Defining dependency "pci" 00:02:08.054 Message: lib/cmdline: Defining dependency "cmdline" 00:02:08.054 Message: lib/hash: Defining dependency "hash" 00:02:08.054 Message: lib/timer: Defining dependency "timer" 00:02:08.054 Message: lib/compressdev: Defining dependency "compressdev" 00:02:08.054 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:08.054 Message: lib/dmadev: Defining dependency "dmadev" 00:02:08.054 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:08.054 Message: lib/power: Defining dependency "power" 00:02:08.054 Message: lib/reorder: Defining dependency "reorder" 00:02:08.054 Message: lib/security: Defining dependency "security" 00:02:08.054 Has header "linux/userfaultfd.h" : YES 00:02:08.054 Has header "linux/vduse.h" : YES 00:02:08.054 Message: lib/vhost: Defining dependency "vhost" 00:02:08.054 Compiler for C supports arguments -Wno-format-truncation: NO (cached) 00:02:08.054 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:08.054 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:08.054 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:08.054 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:08.054 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:08.054 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:08.054 Message: Disabling event/* drivers: missing 
internal dependency "eventdev" 00:02:08.054 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:08.054 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:08.054 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:08.054 Configuring doxy-api-html.conf using configuration 00:02:08.054 Configuring doxy-api-man.conf using configuration 00:02:08.054 Program mandb found: YES (/usr/bin/mandb) 00:02:08.054 Program sphinx-build found: NO 00:02:08.054 Configuring rte_build_config.h using configuration 00:02:08.054 Message: 00:02:08.054 ================= 00:02:08.054 Applications Enabled 00:02:08.054 ================= 00:02:08.054 00:02:08.054 apps: 00:02:08.054 00:02:08.054 00:02:08.054 Message: 00:02:08.054 ================= 00:02:08.054 Libraries Enabled 00:02:08.054 ================= 00:02:08.054 00:02:08.054 libs: 00:02:08.054 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:08.054 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:08.054 cryptodev, dmadev, power, reorder, security, vhost, 00:02:08.054 00:02:08.054 Message: 00:02:08.054 =============== 00:02:08.054 Drivers Enabled 00:02:08.054 =============== 00:02:08.054 00:02:08.054 common: 00:02:08.054 00:02:08.054 bus: 00:02:08.054 pci, vdev, 00:02:08.054 mempool: 00:02:08.054 ring, 00:02:08.054 dma: 00:02:08.054 00:02:08.054 net: 00:02:08.054 00:02:08.054 crypto: 00:02:08.054 00:02:08.054 compress: 00:02:08.054 00:02:08.054 vdpa: 00:02:08.054 00:02:08.054 00:02:08.054 Message: 00:02:08.055 ================= 00:02:08.055 Content Skipped 00:02:08.055 ================= 00:02:08.055 00:02:08.055 apps: 00:02:08.055 dumpcap: explicitly disabled via build config 00:02:08.055 graph: explicitly disabled via build config 00:02:08.055 pdump: explicitly disabled via build config 00:02:08.055 proc-info: explicitly disabled via build config 00:02:08.055 test-acl: explicitly disabled via build config 00:02:08.055 test-bbdev: explicitly disabled 
via build config 00:02:08.055 test-cmdline: explicitly disabled via build config 00:02:08.055 test-compress-perf: explicitly disabled via build config 00:02:08.055 test-crypto-perf: explicitly disabled via build config 00:02:08.055 test-dma-perf: explicitly disabled via build config 00:02:08.055 test-eventdev: explicitly disabled via build config 00:02:08.055 test-fib: explicitly disabled via build config 00:02:08.055 test-flow-perf: explicitly disabled via build config 00:02:08.055 test-gpudev: explicitly disabled via build config 00:02:08.055 test-mldev: explicitly disabled via build config 00:02:08.055 test-pipeline: explicitly disabled via build config 00:02:08.055 test-pmd: explicitly disabled via build config 00:02:08.055 test-regex: explicitly disabled via build config 00:02:08.055 test-sad: explicitly disabled via build config 00:02:08.055 test-security-perf: explicitly disabled via build config 00:02:08.055 00:02:08.055 libs: 00:02:08.055 metrics: explicitly disabled via build config 00:02:08.055 acl: explicitly disabled via build config 00:02:08.055 bbdev: explicitly disabled via build config 00:02:08.055 bitratestats: explicitly disabled via build config 00:02:08.055 bpf: explicitly disabled via build config 00:02:08.055 cfgfile: explicitly disabled via build config 00:02:08.055 distributor: explicitly disabled via build config 00:02:08.055 efd: explicitly disabled via build config 00:02:08.055 eventdev: explicitly disabled via build config 00:02:08.055 dispatcher: explicitly disabled via build config 00:02:08.055 gpudev: explicitly disabled via build config 00:02:08.055 gro: explicitly disabled via build config 00:02:08.055 gso: explicitly disabled via build config 00:02:08.055 ip_frag: explicitly disabled via build config 00:02:08.055 jobstats: explicitly disabled via build config 00:02:08.055 latencystats: explicitly disabled via build config 00:02:08.055 lpm: explicitly disabled via build config 00:02:08.055 member: explicitly disabled via build 
config 00:02:08.055 pcapng: explicitly disabled via build config 00:02:08.055 rawdev: explicitly disabled via build config 00:02:08.055 regexdev: explicitly disabled via build config 00:02:08.055 mldev: explicitly disabled via build config 00:02:08.055 rib: explicitly disabled via build config 00:02:08.055 sched: explicitly disabled via build config 00:02:08.055 stack: explicitly disabled via build config 00:02:08.055 ipsec: explicitly disabled via build config 00:02:08.055 pdcp: explicitly disabled via build config 00:02:08.055 fib: explicitly disabled via build config 00:02:08.055 port: explicitly disabled via build config 00:02:08.055 pdump: explicitly disabled via build config 00:02:08.055 table: explicitly disabled via build config 00:02:08.055 pipeline: explicitly disabled via build config 00:02:08.055 graph: explicitly disabled via build config 00:02:08.055 node: explicitly disabled via build config 00:02:08.055 00:02:08.055 drivers: 00:02:08.055 common/cpt: not in enabled drivers build config 00:02:08.055 common/dpaax: not in enabled drivers build config 00:02:08.055 common/iavf: not in enabled drivers build config 00:02:08.055 common/idpf: not in enabled drivers build config 00:02:08.055 common/mvep: not in enabled drivers build config 00:02:08.055 common/octeontx: not in enabled drivers build config 00:02:08.055 bus/auxiliary: not in enabled drivers build config 00:02:08.055 bus/cdx: not in enabled drivers build config 00:02:08.055 bus/dpaa: not in enabled drivers build config 00:02:08.055 bus/fslmc: not in enabled drivers build config 00:02:08.055 bus/ifpga: not in enabled drivers build config 00:02:08.055 bus/platform: not in enabled drivers build config 00:02:08.055 bus/vmbus: not in enabled drivers build config 00:02:08.055 common/cnxk: not in enabled drivers build config 00:02:08.055 common/mlx5: not in enabled drivers build config 00:02:08.055 common/nfp: not in enabled drivers build config 00:02:08.055 common/qat: not in enabled drivers build 
config 00:02:08.055 common/sfc_efx: not in enabled drivers build config 00:02:08.055 mempool/bucket: not in enabled drivers build config 00:02:08.055 mempool/cnxk: not in enabled drivers build config 00:02:08.055 mempool/dpaa: not in enabled drivers build config 00:02:08.055 mempool/dpaa2: not in enabled drivers build config 00:02:08.055 mempool/octeontx: not in enabled drivers build config 00:02:08.055 mempool/stack: not in enabled drivers build config 00:02:08.055 dma/cnxk: not in enabled drivers build config 00:02:08.055 dma/dpaa: not in enabled drivers build config 00:02:08.055 dma/dpaa2: not in enabled drivers build config 00:02:08.055 dma/hisilicon: not in enabled drivers build config 00:02:08.055 dma/idxd: not in enabled drivers build config 00:02:08.055 dma/ioat: not in enabled drivers build config 00:02:08.055 dma/skeleton: not in enabled drivers build config 00:02:08.055 net/af_packet: not in enabled drivers build config 00:02:08.055 net/af_xdp: not in enabled drivers build config 00:02:08.055 net/ark: not in enabled drivers build config 00:02:08.055 net/atlantic: not in enabled drivers build config 00:02:08.055 net/avp: not in enabled drivers build config 00:02:08.055 net/axgbe: not in enabled drivers build config 00:02:08.055 net/bnx2x: not in enabled drivers build config 00:02:08.055 net/bnxt: not in enabled drivers build config 00:02:08.055 net/bonding: not in enabled drivers build config 00:02:08.055 net/cnxk: not in enabled drivers build config 00:02:08.055 net/cpfl: not in enabled drivers build config 00:02:08.055 net/cxgbe: not in enabled drivers build config 00:02:08.055 net/dpaa: not in enabled drivers build config 00:02:08.055 net/dpaa2: not in enabled drivers build config 00:02:08.055 net/e1000: not in enabled drivers build config 00:02:08.055 net/ena: not in enabled drivers build config 00:02:08.055 net/enetc: not in enabled drivers build config 00:02:08.055 net/enetfec: not in enabled drivers build config 00:02:08.055 net/enic: not in 
enabled drivers build config 00:02:08.055 net/failsafe: not in enabled drivers build config 00:02:08.055 net/fm10k: not in enabled drivers build config 00:02:08.055 net/gve: not in enabled drivers build config 00:02:08.055 net/hinic: not in enabled drivers build config 00:02:08.055 net/hns3: not in enabled drivers build config 00:02:08.055 net/i40e: not in enabled drivers build config 00:02:08.055 net/iavf: not in enabled drivers build config 00:02:08.055 net/ice: not in enabled drivers build config 00:02:08.055 net/idpf: not in enabled drivers build config 00:02:08.055 net/igc: not in enabled drivers build config 00:02:08.055 net/ionic: not in enabled drivers build config 00:02:08.055 net/ipn3ke: not in enabled drivers build config 00:02:08.055 net/ixgbe: not in enabled drivers build config 00:02:08.055 net/mana: not in enabled drivers build config 00:02:08.055 net/memif: not in enabled drivers build config 00:02:08.055 net/mlx4: not in enabled drivers build config 00:02:08.055 net/mlx5: not in enabled drivers build config 00:02:08.055 net/mvneta: not in enabled drivers build config 00:02:08.055 net/mvpp2: not in enabled drivers build config 00:02:08.055 net/netvsc: not in enabled drivers build config 00:02:08.055 net/nfb: not in enabled drivers build config 00:02:08.055 net/nfp: not in enabled drivers build config 00:02:08.055 net/ngbe: not in enabled drivers build config 00:02:08.055 net/null: not in enabled drivers build config 00:02:08.055 net/octeontx: not in enabled drivers build config 00:02:08.055 net/octeon_ep: not in enabled drivers build config 00:02:08.055 net/pcap: not in enabled drivers build config 00:02:08.055 net/pfe: not in enabled drivers build config 00:02:08.055 net/qede: not in enabled drivers build config 00:02:08.055 net/ring: not in enabled drivers build config 00:02:08.055 net/sfc: not in enabled drivers build config 00:02:08.055 net/softnic: not in enabled drivers build config 00:02:08.055 net/tap: not in enabled drivers build config 
00:02:08.055 net/thunderx: not in enabled drivers build config 00:02:08.055 net/txgbe: not in enabled drivers build config 00:02:08.055 net/vdev_netvsc: not in enabled drivers build config 00:02:08.055 net/vhost: not in enabled drivers build config 00:02:08.055 net/virtio: not in enabled drivers build config 00:02:08.055 net/vmxnet3: not in enabled drivers build config 00:02:08.055 raw/*: missing internal dependency, "rawdev" 00:02:08.055 crypto/armv8: not in enabled drivers build config 00:02:08.055 crypto/bcmfs: not in enabled drivers build config 00:02:08.055 crypto/caam_jr: not in enabled drivers build config 00:02:08.055 crypto/ccp: not in enabled drivers build config 00:02:08.055 crypto/cnxk: not in enabled drivers build config 00:02:08.055 crypto/dpaa_sec: not in enabled drivers build config 00:02:08.055 crypto/dpaa2_sec: not in enabled drivers build config 00:02:08.055 crypto/ipsec_mb: not in enabled drivers build config 00:02:08.055 crypto/mlx5: not in enabled drivers build config 00:02:08.055 crypto/mvsam: not in enabled drivers build config 00:02:08.055 crypto/nitrox: not in enabled drivers build config 00:02:08.055 crypto/null: not in enabled drivers build config 00:02:08.055 crypto/octeontx: not in enabled drivers build config 00:02:08.055 crypto/openssl: not in enabled drivers build config 00:02:08.055 crypto/scheduler: not in enabled drivers build config 00:02:08.055 crypto/uadk: not in enabled drivers build config 00:02:08.055 crypto/virtio: not in enabled drivers build config 00:02:08.055 compress/isal: not in enabled drivers build config 00:02:08.055 compress/mlx5: not in enabled drivers build config 00:02:08.055 compress/octeontx: not in enabled drivers build config 00:02:08.055 compress/zlib: not in enabled drivers build config 00:02:08.055 regex/*: missing internal dependency, "regexdev" 00:02:08.055 ml/*: missing internal dependency, "mldev" 00:02:08.055 vdpa/ifc: not in enabled drivers build config 00:02:08.055 vdpa/mlx5: not in enabled 
drivers build config 00:02:08.055 vdpa/nfp: not in enabled drivers build config 00:02:08.055 vdpa/sfc: not in enabled drivers build config 00:02:08.055 event/*: missing internal dependency, "eventdev" 00:02:08.055 baseband/*: missing internal dependency, "bbdev" 00:02:08.055 gpu/*: missing internal dependency, "gpudev" 00:02:08.055 00:02:08.055 00:02:08.055 Build targets in project: 85 00:02:08.055 00:02:08.055 DPDK 23.11.0 00:02:08.055 00:02:08.055 User defined options 00:02:08.056 buildtype : debug 00:02:08.056 default_library : static 00:02:08.056 libdir : lib 00:02:08.056 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:02:08.056 c_args : -fPIC -Werror 00:02:08.056 c_link_args : 00:02:08.056 cpu_instruction_set: native 00:02:08.056 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:02:08.056 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,pcapng,bbdev 00:02:08.056 enable_docs : false 00:02:08.056 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:02:08.056 enable_kmods : false 00:02:08.056 tests : false 00:02:08.056 00:02:08.056 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:08.331 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp' 00:02:08.331 [1/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:08.331 [2/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:08.331 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:08.331 [4/265] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:08.331 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:08.331 [6/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:08.331 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:08.331 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:08.331 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:08.331 [10/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:08.331 [11/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:08.331 [12/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:08.331 [13/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:08.331 [14/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:08.331 [15/265] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:08.331 [16/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:08.331 [17/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:08.331 [18/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:08.331 [19/265] Linking static target lib/librte_kvargs.a 00:02:08.331 [20/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:08.331 [21/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:08.331 [22/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:08.331 [23/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:08.331 [24/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:08.331 [25/265] Linking static target lib/librte_log.a 00:02:08.331 [26/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:08.331 [27/265] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:08.331 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:08.331 [29/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:08.331 [30/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:08.331 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:08.331 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:08.331 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:08.331 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:08.331 [35/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:08.331 [36/265] Linking static target lib/librte_pci.a 00:02:08.331 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:08.331 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:08.615 [39/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:08.615 [40/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:08.615 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:08.615 [42/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.615 [43/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.884 [44/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:08.884 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:08.884 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:08.884 [47/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:08.884 [48/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:08.884 [49/265] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:08.884 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:08.884 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:08.884 [52/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:08.884 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:08.884 [54/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:08.884 [55/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:08.884 [56/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:08.884 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:08.884 [58/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:08.884 [59/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:08.884 [60/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:08.884 [61/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:08.884 [62/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:08.884 [63/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:08.884 [64/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:08.884 [65/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:08.884 [66/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:08.884 [67/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:08.884 [68/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:08.884 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:08.884 [70/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:08.884 [71/265] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:08.884 [72/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:08.884 [73/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:08.884 [74/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:08.884 [75/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:08.884 [76/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:08.884 [77/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:08.884 [78/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:08.884 [79/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:08.884 [80/265] Linking static target lib/librte_telemetry.a 00:02:08.884 [81/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:08.884 [82/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:08.884 [83/265] Linking static target lib/librte_meter.a 00:02:08.884 [84/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:08.884 [85/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:08.884 [86/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:08.884 [87/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:08.884 [88/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:08.884 [89/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:08.884 [90/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:08.884 [91/265] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:08.884 [92/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:08.884 [93/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 
00:02:08.884 [94/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:08.884 [95/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:08.884 [96/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:08.884 [97/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:08.884 [98/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:08.884 [99/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:08.884 [100/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:08.884 [101/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:08.884 [102/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:08.884 [103/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:08.884 [104/265] Linking static target lib/librte_ring.a 00:02:08.884 [105/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:08.884 [106/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:08.885 [107/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:08.885 [108/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:08.885 [109/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:08.885 [110/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:08.885 [111/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:08.885 [112/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:08.885 [113/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:08.885 [114/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:08.885 [115/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:08.885 [116/265] Compiling C object 
lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:08.885 [117/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:08.885 [118/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:08.885 [119/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:08.885 [120/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.885 [121/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:08.885 [122/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:08.885 [123/265] Linking static target lib/librte_timer.a 00:02:08.885 [124/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:08.885 [125/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:08.885 [126/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:08.885 [127/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:08.885 [128/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:08.885 [129/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:08.885 [130/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:08.885 [131/265] Linking static target lib/librte_cmdline.a 00:02:08.885 [132/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:08.885 [133/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:08.885 [134/265] Linking static target lib/librte_eal.a 00:02:08.885 [135/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:08.885 [136/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:08.885 [137/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:08.885 [138/265] Linking static target lib/librte_compressdev.a 00:02:08.885 [139/265] Linking static target 
lib/librte_net.a 00:02:08.885 [140/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:08.885 [141/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:08.885 [142/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:08.885 [143/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:08.885 [144/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:08.885 [145/265] Linking target lib/librte_log.so.24.0 00:02:08.885 [146/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:09.143 [147/265] Linking static target lib/librte_mempool.a 00:02:09.143 [148/265] Linking static target lib/librte_dmadev.a 00:02:09.143 [149/265] Linking static target lib/librte_rcu.a 00:02:09.143 [150/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:09.143 [151/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:09.143 [152/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:09.143 [153/265] Linking static target lib/librte_mbuf.a 00:02:09.143 [154/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:09.143 [155/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:09.143 [156/265] Linking static target lib/librte_power.a 00:02:09.143 [157/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:09.143 [158/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:09.143 [159/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:09.143 [160/265] Linking static target lib/librte_security.a 00:02:09.143 [161/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:09.143 [162/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:09.143 [163/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:09.143 [164/265] 
Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:09.143 [165/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:09.143 [166/265] Linking static target lib/librte_reorder.a 00:02:09.143 [167/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:09.143 [168/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:09.143 [169/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.143 [170/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:09.143 [171/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:09.143 [172/265] Linking static target lib/librte_hash.a 00:02:09.143 [173/265] Linking target lib/librte_kvargs.so.24.0 00:02:09.143 [174/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:09.143 [175/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:09.143 [176/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:09.143 [177/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:09.143 [178/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:09.143 [179/265] Linking static target lib/librte_cryptodev.a 00:02:09.143 [180/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:09.402 [181/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:09.402 [182/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:09.402 [183/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.402 [184/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:09.402 [185/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:09.402 [186/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:09.402 [187/265] Compiling C object 
drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:09.402 [188/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:09.402 [189/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:09.402 [190/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:09.402 [191/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:09.402 [192/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.402 [193/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:09.402 [194/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.402 [195/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.402 [196/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.402 [197/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:09.403 [198/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:09.403 [199/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:09.403 [200/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:09.403 [201/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:09.403 [202/265] Linking static target drivers/librte_bus_vdev.a 00:02:09.666 [203/265] Linking target lib/librte_telemetry.so.24.0 00:02:09.666 [204/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:09.666 [205/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.666 [206/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:09.666 [207/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:09.666 [208/265] 
Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:09.666 [209/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:09.666 [210/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:09.666 [211/265] Linking static target lib/librte_ethdev.a 00:02:09.666 [212/265] Linking static target drivers/librte_mempool_ring.a 00:02:09.666 [213/265] Linking static target drivers/librte_bus_pci.a 00:02:09.666 [214/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.666 [215/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:09.666 [216/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.666 [217/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.925 [218/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.925 [219/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.925 [220/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.184 [221/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.185 [222/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.185 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:10.185 [224/265] Linking static target lib/librte_vhost.a 00:02:10.445 [225/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.445 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.390 [227/265] Generating lib/cryptodev.sym_chk with 
a custom command (wrapped by meson to capture output) 00:02:12.328 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.901 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.438 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.697 [231/265] Linking target lib/librte_eal.so.24.0 00:02:21.697 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:21.697 [233/265] Linking target lib/librte_ring.so.24.0 00:02:21.697 [234/265] Linking target lib/librte_meter.so.24.0 00:02:21.697 [235/265] Linking target lib/librte_timer.so.24.0 00:02:21.697 [236/265] Linking target drivers/librte_bus_vdev.so.24.0 00:02:21.697 [237/265] Linking target lib/librte_dmadev.so.24.0 00:02:21.697 [238/265] Linking target lib/librte_pci.so.24.0 00:02:21.956 [239/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:21.956 [240/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:21.956 [241/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:21.956 [242/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:21.956 [243/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:21.956 [244/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:21.956 [245/265] Linking target lib/librte_rcu.so.24.0 00:02:21.956 [246/265] Linking target lib/librte_mempool.so.24.0 00:02:22.215 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:22.215 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:22.215 [249/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:22.215 [250/265] Linking target lib/librte_mbuf.so.24.0 00:02:22.474 [251/265] 
Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:22.474 [252/265] Linking target lib/librte_compressdev.so.24.0 00:02:22.474 [253/265] Linking target lib/librte_net.so.24.0 00:02:22.474 [254/265] Linking target lib/librte_cryptodev.so.24.0 00:02:22.474 [255/265] Linking target lib/librte_reorder.so.24.0 00:02:22.474 [256/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:22.474 [257/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:22.733 [258/265] Linking target lib/librte_hash.so.24.0 00:02:22.733 [259/265] Linking target lib/librte_security.so.24.0 00:02:22.733 [260/265] Linking target lib/librte_cmdline.so.24.0 00:02:22.733 [261/265] Linking target lib/librte_ethdev.so.24.0 00:02:22.734 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:22.734 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:22.993 [264/265] Linking target lib/librte_vhost.so.24.0 00:02:22.993 [265/265] Linking target lib/librte_power.so.24.0 00:02:22.993 INFO: autodetecting backend as ninja 00:02:22.993 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:02:23.931 CC lib/log/log.o 00:02:23.931 CC lib/log/log_flags.o 00:02:23.931 CC lib/log/log_deprecated.o 00:02:23.931 CC lib/ut_mock/mock.o 00:02:23.931 CC lib/ut/ut.o 00:02:23.931 LIB libspdk_log.a 00:02:23.931 LIB libspdk_ut_mock.a 00:02:23.931 LIB libspdk_ut.a 00:02:24.191 CXX lib/trace_parser/trace.o 00:02:24.191 CC lib/util/base64.o 00:02:24.191 CC lib/dma/dma.o 00:02:24.191 CC lib/util/bit_array.o 00:02:24.191 CC lib/util/cpuset.o 00:02:24.191 CC lib/util/crc16.o 00:02:24.191 CC lib/util/crc32.o 00:02:24.191 CC lib/ioat/ioat.o 00:02:24.191 CC lib/util/crc32c.o 00:02:24.191 CC lib/util/dif.o 00:02:24.191 CC lib/util/crc32_ieee.o 00:02:24.191 CC 
lib/util/crc64.o 00:02:24.191 CC lib/util/fd.o 00:02:24.191 CC lib/util/file.o 00:02:24.191 CC lib/util/hexlify.o 00:02:24.191 CC lib/util/pipe.o 00:02:24.191 CC lib/util/iov.o 00:02:24.191 CC lib/util/math.o 00:02:24.191 CC lib/util/strerror_tls.o 00:02:24.191 CC lib/util/string.o 00:02:24.191 CC lib/util/uuid.o 00:02:24.191 CC lib/util/fd_group.o 00:02:24.191 CC lib/util/xor.o 00:02:24.191 CC lib/util/zipf.o 00:02:24.449 CC lib/vfio_user/host/vfio_user_pci.o 00:02:24.449 CC lib/vfio_user/host/vfio_user.o 00:02:24.449 LIB libspdk_dma.a 00:02:24.449 LIB libspdk_ioat.a 00:02:24.449 LIB libspdk_vfio_user.a 00:02:24.707 LIB libspdk_util.a 00:02:24.707 LIB libspdk_trace_parser.a 00:02:24.965 CC lib/json/json_parse.o 00:02:24.965 CC lib/json/json_util.o 00:02:24.965 CC lib/json/json_write.o 00:02:24.965 CC lib/rdma/common.o 00:02:24.965 CC lib/conf/conf.o 00:02:24.965 CC lib/rdma/rdma_verbs.o 00:02:24.965 CC lib/idxd/idxd.o 00:02:24.965 CC lib/idxd/idxd_user.o 00:02:24.965 CC lib/idxd/idxd_kernel.o 00:02:24.965 CC lib/env_dpdk/env.o 00:02:24.965 CC lib/vmd/led.o 00:02:24.965 CC lib/vmd/vmd.o 00:02:24.965 CC lib/env_dpdk/memory.o 00:02:24.965 CC lib/env_dpdk/pci.o 00:02:24.965 CC lib/env_dpdk/pci_ioat.o 00:02:24.965 CC lib/env_dpdk/init.o 00:02:24.965 CC lib/env_dpdk/threads.o 00:02:24.965 CC lib/env_dpdk/pci_virtio.o 00:02:24.965 CC lib/env_dpdk/pci_vmd.o 00:02:24.965 CC lib/env_dpdk/pci_idxd.o 00:02:24.965 CC lib/env_dpdk/pci_event.o 00:02:24.965 CC lib/env_dpdk/sigbus_handler.o 00:02:24.965 CC lib/env_dpdk/pci_dpdk.o 00:02:24.965 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:24.965 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:24.965 LIB libspdk_conf.a 00:02:25.222 LIB libspdk_json.a 00:02:25.222 LIB libspdk_rdma.a 00:02:25.222 LIB libspdk_idxd.a 00:02:25.222 LIB libspdk_vmd.a 00:02:25.480 CC lib/jsonrpc/jsonrpc_server.o 00:02:25.480 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:25.480 CC lib/jsonrpc/jsonrpc_client.o 00:02:25.480 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:25.480 LIB 
libspdk_jsonrpc.a 00:02:25.738 LIB libspdk_env_dpdk.a 00:02:25.738 CC lib/rpc/rpc.o 00:02:25.997 LIB libspdk_rpc.a 00:02:26.254 CC lib/notify/notify.o 00:02:26.254 CC lib/notify/notify_rpc.o 00:02:26.254 CC lib/trace/trace.o 00:02:26.254 CC lib/trace/trace_rpc.o 00:02:26.254 CC lib/trace/trace_flags.o 00:02:26.254 CC lib/sock/sock.o 00:02:26.254 CC lib/sock/sock_rpc.o 00:02:26.254 LIB libspdk_notify.a 00:02:26.511 LIB libspdk_trace.a 00:02:26.511 LIB libspdk_sock.a 00:02:26.769 CC lib/thread/thread.o 00:02:26.769 CC lib/thread/iobuf.o 00:02:26.769 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:26.769 CC lib/nvme/nvme_ctrlr.o 00:02:26.769 CC lib/nvme/nvme_ns.o 00:02:26.769 CC lib/nvme/nvme_fabric.o 00:02:26.769 CC lib/nvme/nvme_ns_cmd.o 00:02:26.769 CC lib/nvme/nvme_qpair.o 00:02:26.769 CC lib/nvme/nvme_pcie_common.o 00:02:26.769 CC lib/nvme/nvme_pcie.o 00:02:26.769 CC lib/nvme/nvme_quirks.o 00:02:26.769 CC lib/nvme/nvme.o 00:02:26.769 CC lib/nvme/nvme_transport.o 00:02:26.769 CC lib/nvme/nvme_discovery.o 00:02:26.769 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:26.769 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:26.769 CC lib/nvme/nvme_tcp.o 00:02:26.769 CC lib/nvme/nvme_opal.o 00:02:26.769 CC lib/nvme/nvme_io_msg.o 00:02:26.769 CC lib/nvme/nvme_poll_group.o 00:02:26.769 CC lib/nvme/nvme_zns.o 00:02:26.769 CC lib/nvme/nvme_cuse.o 00:02:26.769 CC lib/nvme/nvme_vfio_user.o 00:02:26.769 CC lib/nvme/nvme_rdma.o 00:02:27.708 LIB libspdk_thread.a 00:02:27.708 CC lib/vfu_tgt/tgt_endpoint.o 00:02:27.708 CC lib/vfu_tgt/tgt_rpc.o 00:02:27.708 CC lib/init/json_config.o 00:02:27.708 CC lib/init/subsystem.o 00:02:27.708 CC lib/virtio/virtio.o 00:02:27.708 CC lib/virtio/virtio_pci.o 00:02:27.708 CC lib/init/subsystem_rpc.o 00:02:27.708 CC lib/virtio/virtio_vhost_user.o 00:02:27.708 CC lib/init/rpc.o 00:02:27.708 CC lib/virtio/virtio_vfio_user.o 00:02:27.991 CC lib/blob/request.o 00:02:27.991 CC lib/blob/blobstore.o 00:02:27.991 CC lib/accel/accel.o 00:02:27.991 CC lib/accel/accel_rpc.o 00:02:27.991 CC 
lib/blob/zeroes.o 00:02:27.991 CC lib/accel/accel_sw.o 00:02:27.991 CC lib/blob/blob_bs_dev.o 00:02:27.991 LIB libspdk_init.a 00:02:27.991 LIB libspdk_virtio.a 00:02:27.991 LIB libspdk_vfu_tgt.a 00:02:27.991 LIB libspdk_nvme.a 00:02:28.249 CC lib/event/app.o 00:02:28.249 CC lib/event/app_rpc.o 00:02:28.249 CC lib/event/reactor.o 00:02:28.249 CC lib/event/log_rpc.o 00:02:28.249 CC lib/event/scheduler_static.o 00:02:28.508 LIB libspdk_accel.a 00:02:28.508 LIB libspdk_event.a 00:02:28.766 CC lib/bdev/bdev.o 00:02:28.766 CC lib/bdev/bdev_rpc.o 00:02:28.766 CC lib/bdev/bdev_zone.o 00:02:28.766 CC lib/bdev/part.o 00:02:28.766 CC lib/bdev/scsi_nvme.o 00:02:29.332 LIB libspdk_blob.a 00:02:29.589 CC lib/lvol/lvol.o 00:02:29.589 CC lib/blobfs/blobfs.o 00:02:29.589 CC lib/blobfs/tree.o 00:02:30.171 LIB libspdk_lvol.a 00:02:30.171 LIB libspdk_blobfs.a 00:02:30.429 LIB libspdk_bdev.a 00:02:30.686 CC lib/scsi/dev.o 00:02:30.686 CC lib/scsi/scsi.o 00:02:30.686 CC lib/scsi/lun.o 00:02:30.686 CC lib/scsi/port.o 00:02:30.686 CC lib/scsi/scsi_bdev.o 00:02:30.686 CC lib/scsi/scsi_pr.o 00:02:30.686 CC lib/scsi/scsi_rpc.o 00:02:30.686 CC lib/scsi/task.o 00:02:30.686 CC lib/ftl/ftl_init.o 00:02:30.686 CC lib/ftl/ftl_core.o 00:02:30.686 CC lib/ftl/ftl_layout.o 00:02:30.686 CC lib/ftl/ftl_debug.o 00:02:30.686 CC lib/ftl/ftl_io.o 00:02:30.686 CC lib/ftl/ftl_l2p_flat.o 00:02:30.686 CC lib/ftl/ftl_sb.o 00:02:30.686 CC lib/ftl/ftl_l2p.o 00:02:30.686 CC lib/ftl/ftl_nv_cache.o 00:02:30.686 CC lib/ftl/ftl_band.o 00:02:30.686 CC lib/ftl/ftl_rq.o 00:02:30.686 CC lib/ftl/ftl_band_ops.o 00:02:30.686 CC lib/ftl/ftl_writer.o 00:02:30.686 CC lib/ftl/ftl_reloc.o 00:02:30.686 CC lib/ftl/ftl_l2p_cache.o 00:02:30.686 CC lib/ftl/ftl_p2l.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:30.686 CC lib/nvmf/ctrlr_bdev.o 00:02:30.686 CC lib/nvmf/ctrlr.o 00:02:30.686 CC lib/nvmf/ctrlr_discovery.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:30.686 CC 
lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:30.686 CC lib/nbd/nbd.o 00:02:30.686 CC lib/nvmf/subsystem.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:30.686 CC lib/nvmf/nvmf.o 00:02:30.686 CC lib/nbd/nbd_rpc.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:30.686 CC lib/nvmf/nvmf_rpc.o 00:02:30.686 CC lib/nvmf/vfio_user.o 00:02:30.686 CC lib/ublk/ublk.o 00:02:30.686 CC lib/nvmf/transport.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:30.686 CC lib/nvmf/rdma.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:30.686 CC lib/nvmf/tcp.o 00:02:30.686 CC lib/ublk/ublk_rpc.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:30.686 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:30.686 CC lib/ftl/utils/ftl_mempool.o 00:02:30.686 CC lib/ftl/utils/ftl_md.o 00:02:30.686 CC lib/ftl/utils/ftl_conf.o 00:02:30.686 CC lib/ftl/utils/ftl_property.o 00:02:30.686 CC lib/ftl/utils/ftl_bitmap.o 00:02:30.686 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:30.686 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:30.686 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:30.686 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:30.686 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:30.686 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:30.686 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:30.686 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:30.686 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:30.686 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:30.686 CC lib/ftl/base/ftl_base_dev.o 00:02:30.686 CC lib/ftl/base/ftl_base_bdev.o 00:02:30.686 CC lib/ftl/ftl_trace.o 00:02:30.944 LIB libspdk_nbd.a 00:02:30.944 LIB libspdk_scsi.a 00:02:30.944 LIB libspdk_ublk.a 00:02:31.202 LIB libspdk_ftl.a 00:02:31.202 CC lib/vhost/vhost.o 00:02:31.202 CC lib/vhost/vhost_rpc.o 00:02:31.202 CC lib/vhost/vhost_scsi.o 00:02:31.202 CC lib/vhost/vhost_blk.o 00:02:31.202 CC lib/vhost/rte_vhost_user.o 00:02:31.202 CC lib/iscsi/iscsi.o 
00:02:31.202 CC lib/iscsi/conn.o 00:02:31.202 CC lib/iscsi/init_grp.o 00:02:31.202 CC lib/iscsi/md5.o 00:02:31.202 CC lib/iscsi/param.o 00:02:31.202 CC lib/iscsi/portal_grp.o 00:02:31.202 CC lib/iscsi/tgt_node.o 00:02:31.202 CC lib/iscsi/iscsi_subsystem.o 00:02:31.202 CC lib/iscsi/iscsi_rpc.o 00:02:31.202 CC lib/iscsi/task.o 00:02:31.771 LIB libspdk_nvmf.a 00:02:31.771 LIB libspdk_vhost.a 00:02:32.029 LIB libspdk_iscsi.a 00:02:32.598 CC module/env_dpdk/env_dpdk_rpc.o 00:02:32.598 CC module/vfu_device/vfu_virtio_scsi.o 00:02:32.598 CC module/vfu_device/vfu_virtio.o 00:02:32.598 CC module/vfu_device/vfu_virtio_blk.o 00:02:32.598 CC module/vfu_device/vfu_virtio_rpc.o 00:02:32.598 CC module/scheduler/gscheduler/gscheduler.o 00:02:32.598 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:32.598 LIB libspdk_env_dpdk_rpc.a 00:02:32.598 CC module/sock/posix/posix.o 00:02:32.598 CC module/blob/bdev/blob_bdev.o 00:02:32.598 CC module/accel/dsa/accel_dsa.o 00:02:32.598 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:32.598 CC module/accel/dsa/accel_dsa_rpc.o 00:02:32.598 CC module/accel/error/accel_error.o 00:02:32.598 CC module/accel/ioat/accel_ioat.o 00:02:32.598 CC module/accel/error/accel_error_rpc.o 00:02:32.598 CC module/accel/ioat/accel_ioat_rpc.o 00:02:32.598 CC module/accel/iaa/accel_iaa.o 00:02:32.598 CC module/accel/iaa/accel_iaa_rpc.o 00:02:32.598 LIB libspdk_scheduler_gscheduler.a 00:02:32.598 LIB libspdk_scheduler_dynamic.a 00:02:32.598 LIB libspdk_scheduler_dpdk_governor.a 00:02:32.857 LIB libspdk_accel_error.a 00:02:32.857 LIB libspdk_accel_ioat.a 00:02:32.857 LIB libspdk_accel_iaa.a 00:02:32.857 LIB libspdk_accel_dsa.a 00:02:32.857 LIB libspdk_blob_bdev.a 00:02:32.857 LIB libspdk_vfu_device.a 00:02:33.116 LIB libspdk_sock_posix.a 00:02:33.116 CC module/bdev/passthru/vbdev_passthru.o 00:02:33.116 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:33.116 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:33.116 CC 
module/bdev/zone_block/vbdev_zone_block.o 00:02:33.116 CC module/bdev/lvol/vbdev_lvol.o 00:02:33.116 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:33.116 CC module/bdev/null/bdev_null.o 00:02:33.116 CC module/bdev/delay/vbdev_delay.o 00:02:33.116 CC module/bdev/null/bdev_null_rpc.o 00:02:33.116 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:33.116 CC module/bdev/ftl/bdev_ftl.o 00:02:33.116 CC module/bdev/split/vbdev_split.o 00:02:33.116 CC module/blobfs/bdev/blobfs_bdev.o 00:02:33.116 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:33.116 CC module/bdev/split/vbdev_split_rpc.o 00:02:33.116 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:33.116 CC module/bdev/raid/bdev_raid.o 00:02:33.116 CC module/bdev/raid/bdev_raid_rpc.o 00:02:33.116 CC module/bdev/error/vbdev_error_rpc.o 00:02:33.116 CC module/bdev/raid/raid1.o 00:02:33.116 CC module/bdev/raid/bdev_raid_sb.o 00:02:33.116 CC module/bdev/error/vbdev_error.o 00:02:33.116 CC module/bdev/raid/raid0.o 00:02:33.116 CC module/bdev/aio/bdev_aio.o 00:02:33.116 CC module/bdev/aio/bdev_aio_rpc.o 00:02:33.116 CC module/bdev/raid/concat.o 00:02:33.116 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:33.116 CC module/bdev/malloc/bdev_malloc.o 00:02:33.116 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:33.116 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:33.116 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:33.116 CC module/bdev/nvme/bdev_nvme.o 00:02:33.116 CC module/bdev/gpt/gpt.o 00:02:33.116 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:33.116 CC module/bdev/iscsi/bdev_iscsi.o 00:02:33.116 CC module/bdev/nvme/nvme_rpc.o 00:02:33.116 CC module/bdev/gpt/vbdev_gpt.o 00:02:33.116 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:33.116 CC module/bdev/nvme/bdev_mdns_client.o 00:02:33.116 CC module/bdev/nvme/vbdev_opal.o 00:02:33.116 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:33.116 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:33.375 LIB libspdk_blobfs_bdev.a 00:02:33.375 LIB libspdk_bdev_split.a 00:02:33.375 LIB libspdk_bdev_null.a 00:02:33.375 LIB 
libspdk_bdev_error.a 00:02:33.375 LIB libspdk_bdev_ftl.a 00:02:33.375 LIB libspdk_bdev_gpt.a 00:02:33.375 LIB libspdk_bdev_passthru.a 00:02:33.375 LIB libspdk_bdev_zone_block.a 00:02:33.375 LIB libspdk_bdev_aio.a 00:02:33.375 LIB libspdk_bdev_delay.a 00:02:33.375 LIB libspdk_bdev_iscsi.a 00:02:33.375 LIB libspdk_bdev_malloc.a 00:02:33.634 LIB libspdk_bdev_lvol.a 00:02:33.634 LIB libspdk_bdev_virtio.a 00:02:33.634 LIB libspdk_bdev_raid.a 00:02:34.568 LIB libspdk_bdev_nvme.a 00:02:35.136 CC module/event/subsystems/scheduler/scheduler.o 00:02:35.136 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:35.136 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:35.136 CC module/event/subsystems/iobuf/iobuf.o 00:02:35.136 CC module/event/subsystems/vmd/vmd.o 00:02:35.136 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:35.136 CC module/event/subsystems/sock/sock.o 00:02:35.136 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:35.136 LIB libspdk_event_scheduler.a 00:02:35.136 LIB libspdk_event_vhost_blk.a 00:02:35.136 LIB libspdk_event_vmd.a 00:02:35.136 LIB libspdk_event_iobuf.a 00:02:35.136 LIB libspdk_event_sock.a 00:02:35.137 LIB libspdk_event_vfu_tgt.a 00:02:35.395 CC module/event/subsystems/accel/accel.o 00:02:35.395 LIB libspdk_event_accel.a 00:02:35.961 CC module/event/subsystems/bdev/bdev.o 00:02:35.961 LIB libspdk_event_bdev.a 00:02:36.219 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:36.219 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:36.219 CC module/event/subsystems/scsi/scsi.o 00:02:36.219 CC module/event/subsystems/ublk/ublk.o 00:02:36.219 CC module/event/subsystems/nbd/nbd.o 00:02:36.219 LIB libspdk_event_scsi.a 00:02:36.478 LIB libspdk_event_ublk.a 00:02:36.478 LIB libspdk_event_nbd.a 00:02:36.478 LIB libspdk_event_nvmf.a 00:02:36.737 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:36.737 CC module/event/subsystems/iscsi/iscsi.o 00:02:36.737 LIB libspdk_event_vhost_scsi.a 00:02:36.737 LIB libspdk_event_iscsi.a 00:02:36.996 CC 
app/spdk_lspci/spdk_lspci.o 00:02:36.996 CC app/spdk_nvme_identify/identify.o 00:02:36.996 CC app/trace_record/trace_record.o 00:02:36.996 CXX app/trace/trace.o 00:02:36.996 CC app/spdk_nvme_perf/perf.o 00:02:36.996 CC app/spdk_nvme_discover/discovery_aer.o 00:02:36.996 TEST_HEADER include/spdk/accel.h 00:02:36.996 TEST_HEADER include/spdk/accel_module.h 00:02:36.996 CC app/spdk_top/spdk_top.o 00:02:36.996 TEST_HEADER include/spdk/barrier.h 00:02:36.996 TEST_HEADER include/spdk/assert.h 00:02:36.996 TEST_HEADER include/spdk/base64.h 00:02:36.996 TEST_HEADER include/spdk/bdev.h 00:02:36.996 TEST_HEADER include/spdk/bdev_zone.h 00:02:36.996 TEST_HEADER include/spdk/bit_array.h 00:02:36.996 TEST_HEADER include/spdk/bdev_module.h 00:02:36.996 TEST_HEADER include/spdk/bit_pool.h 00:02:36.996 TEST_HEADER include/spdk/blob_bdev.h 00:02:36.996 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:36.996 CC test/rpc_client/rpc_client_test.o 00:02:36.996 TEST_HEADER include/spdk/blobfs.h 00:02:36.996 TEST_HEADER include/spdk/blob.h 00:02:36.996 TEST_HEADER include/spdk/conf.h 00:02:36.996 TEST_HEADER include/spdk/cpuset.h 00:02:36.996 TEST_HEADER include/spdk/config.h 00:02:36.996 TEST_HEADER include/spdk/crc16.h 00:02:36.996 TEST_HEADER include/spdk/crc32.h 00:02:36.996 TEST_HEADER include/spdk/crc64.h 00:02:36.996 TEST_HEADER include/spdk/endian.h 00:02:36.996 TEST_HEADER include/spdk/dma.h 00:02:36.996 TEST_HEADER include/spdk/env.h 00:02:36.996 TEST_HEADER include/spdk/env_dpdk.h 00:02:36.996 TEST_HEADER include/spdk/dif.h 00:02:36.996 TEST_HEADER include/spdk/event.h 00:02:36.996 TEST_HEADER include/spdk/fd_group.h 00:02:36.996 TEST_HEADER include/spdk/fd.h 00:02:36.996 TEST_HEADER include/spdk/file.h 00:02:36.996 CC app/spdk_dd/spdk_dd.o 00:02:36.996 TEST_HEADER include/spdk/ftl.h 00:02:36.996 TEST_HEADER include/spdk/gpt_spec.h 00:02:36.996 TEST_HEADER include/spdk/hexlify.h 00:02:36.996 TEST_HEADER include/spdk/histogram_data.h 00:02:36.996 TEST_HEADER include/spdk/idxd.h 
00:02:36.996 TEST_HEADER include/spdk/idxd_spec.h 00:02:36.996 TEST_HEADER include/spdk/ioat.h 00:02:36.996 TEST_HEADER include/spdk/init.h 00:02:36.996 TEST_HEADER include/spdk/ioat_spec.h 00:02:36.996 TEST_HEADER include/spdk/iscsi_spec.h 00:02:36.996 TEST_HEADER include/spdk/json.h 00:02:36.996 CC app/iscsi_tgt/iscsi_tgt.o 00:02:36.996 TEST_HEADER include/spdk/jsonrpc.h 00:02:36.996 TEST_HEADER include/spdk/likely.h 00:02:36.996 CC app/nvmf_tgt/nvmf_main.o 00:02:36.996 TEST_HEADER include/spdk/lvol.h 00:02:36.996 TEST_HEADER include/spdk/log.h 00:02:36.996 TEST_HEADER include/spdk/memory.h 00:02:36.996 TEST_HEADER include/spdk/mmio.h 00:02:36.996 TEST_HEADER include/spdk/nbd.h 00:02:36.996 TEST_HEADER include/spdk/nvme.h 00:02:36.996 TEST_HEADER include/spdk/notify.h 00:02:36.996 TEST_HEADER include/spdk/nvme_intel.h 00:02:36.996 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:36.996 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:36.996 TEST_HEADER include/spdk/nvme_spec.h 00:02:36.996 TEST_HEADER include/spdk/nvme_zns.h 00:02:36.996 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:37.260 TEST_HEADER include/spdk/nvmf.h 00:02:37.260 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:37.260 TEST_HEADER include/spdk/nvmf_spec.h 00:02:37.260 TEST_HEADER include/spdk/nvmf_transport.h 00:02:37.260 TEST_HEADER include/spdk/opal_spec.h 00:02:37.260 TEST_HEADER include/spdk/opal.h 00:02:37.260 TEST_HEADER include/spdk/pci_ids.h 00:02:37.260 TEST_HEADER include/spdk/pipe.h 00:02:37.260 TEST_HEADER include/spdk/queue.h 00:02:37.260 TEST_HEADER include/spdk/reduce.h 00:02:37.260 TEST_HEADER include/spdk/rpc.h 00:02:37.260 TEST_HEADER include/spdk/scsi.h 00:02:37.260 TEST_HEADER include/spdk/scheduler.h 00:02:37.260 TEST_HEADER include/spdk/sock.h 00:02:37.261 TEST_HEADER include/spdk/scsi_spec.h 00:02:37.261 TEST_HEADER include/spdk/stdinc.h 00:02:37.261 TEST_HEADER include/spdk/thread.h 00:02:37.261 TEST_HEADER include/spdk/string.h 00:02:37.261 TEST_HEADER include/spdk/trace.h 
00:02:37.261 CC app/spdk_tgt/spdk_tgt.o 00:02:37.261 TEST_HEADER include/spdk/trace_parser.h 00:02:37.261 CC app/vhost/vhost.o 00:02:37.261 TEST_HEADER include/spdk/tree.h 00:02:37.261 TEST_HEADER include/spdk/ublk.h 00:02:37.261 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:37.261 TEST_HEADER include/spdk/util.h 00:02:37.261 TEST_HEADER include/spdk/uuid.h 00:02:37.261 TEST_HEADER include/spdk/version.h 00:02:37.261 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:37.261 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:37.261 TEST_HEADER include/spdk/vhost.h 00:02:37.261 TEST_HEADER include/spdk/vmd.h 00:02:37.261 TEST_HEADER include/spdk/xor.h 00:02:37.261 TEST_HEADER include/spdk/zipf.h 00:02:37.261 CXX test/cpp_headers/accel.o 00:02:37.261 CXX test/cpp_headers/accel_module.o 00:02:37.261 CXX test/cpp_headers/assert.o 00:02:37.261 CXX test/cpp_headers/barrier.o 00:02:37.261 CXX test/cpp_headers/base64.o 00:02:37.261 CXX test/cpp_headers/bdev_module.o 00:02:37.261 CXX test/cpp_headers/bdev.o 00:02:37.261 CXX test/cpp_headers/bdev_zone.o 00:02:37.261 CXX test/cpp_headers/bit_array.o 00:02:37.261 CXX test/cpp_headers/blob_bdev.o 00:02:37.261 CXX test/cpp_headers/bit_pool.o 00:02:37.261 CXX test/cpp_headers/blobfs_bdev.o 00:02:37.261 CXX test/cpp_headers/conf.o 00:02:37.261 CXX test/cpp_headers/blobfs.o 00:02:37.261 CXX test/cpp_headers/blob.o 00:02:37.261 CXX test/cpp_headers/config.o 00:02:37.261 CXX test/cpp_headers/cpuset.o 00:02:37.261 CXX test/cpp_headers/crc16.o 00:02:37.261 CXX test/cpp_headers/crc32.o 00:02:37.261 CXX test/cpp_headers/crc64.o 00:02:37.261 CXX test/cpp_headers/dif.o 00:02:37.261 CXX test/cpp_headers/dma.o 00:02:37.261 CXX test/cpp_headers/env_dpdk.o 00:02:37.261 CXX test/cpp_headers/endian.o 00:02:37.261 CXX test/cpp_headers/env.o 00:02:37.261 CXX test/cpp_headers/event.o 00:02:37.261 CXX test/cpp_headers/fd_group.o 00:02:37.261 CXX test/cpp_headers/fd.o 00:02:37.261 CXX test/cpp_headers/file.o 00:02:37.261 CXX test/cpp_headers/gpt_spec.o 
00:02:37.261 CXX test/cpp_headers/ftl.o 00:02:37.261 CXX test/cpp_headers/histogram_data.o 00:02:37.261 CXX test/cpp_headers/idxd.o 00:02:37.261 CXX test/cpp_headers/hexlify.o 00:02:37.261 CXX test/cpp_headers/init.o 00:02:37.261 CXX test/cpp_headers/idxd_spec.o 00:02:37.261 LINK spdk_lspci 00:02:37.261 CC examples/nvme/hotplug/hotplug.o 00:02:37.261 CC examples/nvme/reconnect/reconnect.o 00:02:37.261 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:37.261 CC examples/nvme/abort/abort.o 00:02:37.261 CC examples/nvme/hello_world/hello_world.o 00:02:37.261 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:37.261 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:37.261 CC examples/nvme/arbitration/arbitration.o 00:02:37.261 CC test/nvme/e2edp/nvme_dp.o 00:02:37.261 CC test/nvme/reset/reset.o 00:02:37.261 CC test/nvme/aer/aer.o 00:02:37.261 CC examples/ioat/perf/perf.o 00:02:37.261 CXX test/cpp_headers/ioat.o 00:02:37.261 CC test/nvme/err_injection/err_injection.o 00:02:37.261 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:37.261 CC test/nvme/overhead/overhead.o 00:02:37.261 CC examples/vmd/led/led.o 00:02:37.261 CC examples/sock/hello_world/hello_sock.o 00:02:37.261 CC test/app/jsoncat/jsoncat.o 00:02:37.261 CC test/nvme/boot_partition/boot_partition.o 00:02:37.261 CC test/thread/lock/spdk_lock.o 00:02:37.261 CC examples/ioat/verify/verify.o 00:02:37.261 CC test/env/memory/memory_ut.o 00:02:37.261 CC test/nvme/startup/startup.o 00:02:37.261 CC test/env/pci/pci_ut.o 00:02:37.261 CC test/nvme/sgl/sgl.o 00:02:37.261 CC test/nvme/reserve/reserve.o 00:02:37.261 CC test/app/histogram_perf/histogram_perf.o 00:02:37.261 CC test/nvme/cuse/cuse.o 00:02:37.261 CC app/fio/nvme/fio_plugin.o 00:02:37.261 CC test/env/vtophys/vtophys.o 00:02:37.261 CC test/event/reactor_perf/reactor_perf.o 00:02:37.261 CC examples/accel/perf/accel_perf.o 00:02:37.261 CC test/nvme/compliance/nvme_compliance.o 00:02:37.261 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:37.261 CC 
test/nvme/fdp/fdp.o 00:02:37.261 CC test/nvme/simple_copy/simple_copy.o 00:02:37.261 CC test/nvme/fused_ordering/fused_ordering.o 00:02:37.261 CC test/thread/poller_perf/poller_perf.o 00:02:37.261 CC test/event/event_perf/event_perf.o 00:02:37.261 CC test/nvme/connect_stress/connect_stress.o 00:02:37.261 CC examples/vmd/lsvmd/lsvmd.o 00:02:37.261 CC examples/idxd/perf/perf.o 00:02:37.261 CC test/event/reactor/reactor.o 00:02:37.261 CC test/app/stub/stub.o 00:02:37.261 CC examples/util/zipf/zipf.o 00:02:37.261 CC test/accel/dif/dif.o 00:02:37.261 CC test/event/app_repeat/app_repeat.o 00:02:37.261 CC examples/blob/hello_world/hello_blob.o 00:02:37.261 CC examples/bdev/bdevperf/bdevperf.o 00:02:37.261 CC test/app/bdev_svc/bdev_svc.o 00:02:37.261 CC examples/blob/cli/blobcli.o 00:02:37.261 CC examples/nvmf/nvmf/nvmf.o 00:02:37.261 CC test/dma/test_dma/test_dma.o 00:02:37.261 CC examples/thread/thread/thread_ex.o 00:02:37.261 CC examples/bdev/hello_world/hello_bdev.o 00:02:37.261 CC app/fio/bdev/fio_plugin.o 00:02:37.261 CC test/bdev/bdevio/bdevio.o 00:02:37.261 LINK spdk_nvme_discover 00:02:37.261 CC test/blobfs/mkfs/mkfs.o 00:02:37.261 CC test/event/scheduler/scheduler.o 00:02:37.261 LINK rpc_client_test 00:02:37.261 CC test/env/mem_callbacks/mem_callbacks.o 00:02:37.261 CC test/lvol/esnap/esnap.o 00:02:37.261 LINK spdk_trace_record 00:02:37.261 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:37.261 LINK nvmf_tgt 00:02:37.523 CXX test/cpp_headers/ioat_spec.o 00:02:37.523 CXX test/cpp_headers/iscsi_spec.o 00:02:37.523 CXX test/cpp_headers/json.o 00:02:37.523 CXX test/cpp_headers/jsonrpc.o 00:02:37.523 LINK iscsi_tgt 00:02:37.523 CXX test/cpp_headers/likely.o 00:02:37.523 CXX test/cpp_headers/log.o 00:02:37.523 CXX test/cpp_headers/lvol.o 00:02:37.523 LINK interrupt_tgt 00:02:37.523 CXX test/cpp_headers/memory.o 00:02:37.523 CXX test/cpp_headers/mmio.o 00:02:37.523 CXX test/cpp_headers/nbd.o 00:02:37.523 CXX test/cpp_headers/notify.o 00:02:37.523 CXX 
test/cpp_headers/nvme.o 00:02:37.523 CXX test/cpp_headers/nvme_intel.o 00:02:37.523 CXX test/cpp_headers/nvme_ocssd.o 00:02:37.523 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:37.523 CXX test/cpp_headers/nvme_spec.o 00:02:37.523 CXX test/cpp_headers/nvme_zns.o 00:02:37.523 CXX test/cpp_headers/nvmf_cmd.o 00:02:37.523 LINK vhost 00:02:37.523 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:37.523 CXX test/cpp_headers/nvmf.o 00:02:37.523 LINK jsoncat 00:02:37.523 CXX test/cpp_headers/nvmf_spec.o 00:02:37.523 CXX test/cpp_headers/nvmf_transport.o 00:02:37.523 CXX test/cpp_headers/opal.o 00:02:37.523 LINK led 00:02:37.523 CXX test/cpp_headers/opal_spec.o 00:02:37.523 LINK lsvmd 00:02:37.523 CXX test/cpp_headers/pci_ids.o 00:02:37.523 CXX test/cpp_headers/pipe.o 00:02:37.523 LINK vtophys 00:02:37.523 LINK env_dpdk_post_init 00:02:37.523 LINK event_perf 00:02:37.523 LINK histogram_perf 00:02:37.523 CXX test/cpp_headers/queue.o 00:02:37.523 LINK reactor_perf 00:02:37.523 CXX test/cpp_headers/reduce.o 00:02:37.523 LINK reactor 00:02:37.523 LINK poller_perf 00:02:37.523 CXX test/cpp_headers/rpc.o 00:02:37.523 LINK pmr_persistence 00:02:37.523 CXX test/cpp_headers/scheduler.o 00:02:37.523 LINK spdk_tgt 00:02:37.523 CXX test/cpp_headers/scsi.o 00:02:37.523 LINK zipf 00:02:37.523 CXX test/cpp_headers/scsi_spec.o 00:02:37.523 CXX test/cpp_headers/sock.o 00:02:37.523 LINK startup 00:02:37.523 LINK app_repeat 00:02:37.523 LINK err_injection 00:02:37.523 LINK boot_partition 00:02:37.523 LINK doorbell_aers 00:02:37.523 LINK cmb_copy 00:02:37.523 LINK stub 00:02:37.523 LINK connect_stress 00:02:37.523 LINK reserve 00:02:37.523 LINK hotplug 00:02:37.523 LINK hello_world 00:02:37.523 LINK ioat_perf 00:02:37.523 CXX test/cpp_headers/stdinc.o 00:02:37.523 LINK fused_ordering 00:02:37.523 LINK verify 00:02:37.523 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:37.523 LINK simple_copy 00:02:37.523 LINK bdev_svc 00:02:37.523 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:37.523 LINK hello_sock 
00:02:37.523 LINK nvme_dp 00:02:37.523 LINK mkfs 00:02:37.523 LINK reset 00:02:37.523 LINK hello_blob 00:02:37.523 LINK aer 00:02:37.523 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:37.523 LINK fdp 00:02:37.523 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:37.523 LINK overhead 00:02:37.523 LINK sgl 00:02:37.523 LINK thread 00:02:37.523 LINK scheduler 00:02:37.523 CXX test/cpp_headers/string.o 00:02:37.523 LINK hello_bdev 00:02:37.523 CXX test/cpp_headers/thread.o 00:02:37.523 CXX test/cpp_headers/trace.o 00:02:37.782 CXX test/cpp_headers/trace_parser.o 00:02:37.782 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:37.782 CXX test/cpp_headers/tree.o 00:02:37.782 CXX test/cpp_headers/ublk.o 00:02:37.782 CXX test/cpp_headers/util.o 00:02:37.782 CXX test/cpp_headers/uuid.o 00:02:37.782 CXX test/cpp_headers/version.o 00:02:37.782 LINK nvmf 00:02:37.782 CXX test/cpp_headers/vfio_user_pci.o 00:02:37.782 CXX test/cpp_headers/vfio_user_spec.o 00:02:37.782 CXX test/cpp_headers/vhost.o 00:02:37.782 CXX test/cpp_headers/vmd.o 00:02:37.782 CXX test/cpp_headers/xor.o 00:02:37.782 CXX test/cpp_headers/zipf.o 00:02:37.782 LINK reconnect 00:02:37.782 LINK spdk_trace 00:02:37.782 LINK idxd_perf 00:02:37.782 LINK arbitration 00:02:37.782 LINK spdk_dd 00:02:37.782 LINK abort 00:02:37.782 LINK dif 00:02:37.782 LINK test_dma 00:02:37.782 LINK nvme_manage 00:02:37.782 LINK accel_perf 00:02:37.782 LINK pci_ut 00:02:37.782 LINK bdevio 00:02:37.782 LINK nvme_compliance 00:02:37.782 LINK blobcli 00:02:38.040 LINK spdk_nvme 00:02:38.040 LINK nvme_fuzz 00:02:38.040 LINK llvm_vfio_fuzz 00:02:38.040 LINK vhost_fuzz 00:02:38.040 LINK mem_callbacks 00:02:38.040 LINK spdk_bdev 00:02:38.040 LINK spdk_nvme_identify 00:02:38.040 LINK spdk_nvme_perf 00:02:38.298 LINK cuse 00:02:38.298 LINK bdevperf 00:02:38.298 LINK memory_ut 00:02:38.298 LINK spdk_top 00:02:38.298 LINK llvm_nvme_fuzz 00:02:38.864 LINK spdk_lock 00:02:38.864 LINK iscsi_fuzz 00:02:40.780 LINK esnap 00:02:41.040 
00:02:41.040 real 0m41.222s 00:02:41.040 user 5m43.612s 00:02:41.040 sys 2m53.039s 00:02:41.040 00:01:06 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:41.040 00:01:06 -- common/autotest_common.sh@10 -- $ set +x 00:02:41.040 ************************************ 00:02:41.040 END TEST make 00:02:41.040 ************************************ 00:02:41.300 00:01:06 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:41.300 00:01:06 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:41.300 00:01:06 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:41.300 00:01:06 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:41.300 00:01:06 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:41.300 00:01:06 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:41.300 00:01:06 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:41.300 00:01:06 -- scripts/common.sh@335 -- # IFS=.-: 00:02:41.300 00:01:06 -- scripts/common.sh@335 -- # read -ra ver1 00:02:41.300 00:01:06 -- scripts/common.sh@336 -- # IFS=.-: 00:02:41.300 00:01:06 -- scripts/common.sh@336 -- # read -ra ver2 00:02:41.300 00:01:06 -- scripts/common.sh@337 -- # local 'op=<' 00:02:41.300 00:01:06 -- scripts/common.sh@339 -- # ver1_l=2 00:02:41.300 00:01:06 -- scripts/common.sh@340 -- # ver2_l=1 00:02:41.300 00:01:06 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:41.300 00:01:06 -- scripts/common.sh@343 -- # case "$op" in 00:02:41.300 00:01:06 -- scripts/common.sh@344 -- # : 1 00:02:41.300 00:01:06 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:41.300 00:01:06 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:41.300 00:01:06 -- scripts/common.sh@364 -- # decimal 1 00:02:41.300 00:01:06 -- scripts/common.sh@352 -- # local d=1 00:02:41.300 00:01:06 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:41.300 00:01:06 -- scripts/common.sh@354 -- # echo 1 00:02:41.300 00:01:06 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:41.300 00:01:06 -- scripts/common.sh@365 -- # decimal 2 00:02:41.300 00:01:06 -- scripts/common.sh@352 -- # local d=2 00:02:41.300 00:01:06 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:41.300 00:01:06 -- scripts/common.sh@354 -- # echo 2 00:02:41.300 00:01:06 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:41.300 00:01:06 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:41.300 00:01:06 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:41.300 00:01:06 -- scripts/common.sh@367 -- # return 0 00:02:41.300 00:01:06 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:41.300 00:01:06 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:41.300 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:41.300 --rc genhtml_branch_coverage=1 00:02:41.300 --rc genhtml_function_coverage=1 00:02:41.300 --rc genhtml_legend=1 00:02:41.300 --rc geninfo_all_blocks=1 00:02:41.300 --rc geninfo_unexecuted_blocks=1 00:02:41.300 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:41.300 ' 00:02:41.300 00:01:06 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:41.300 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:41.300 --rc genhtml_branch_coverage=1 00:02:41.300 --rc genhtml_function_coverage=1 00:02:41.300 --rc genhtml_legend=1 00:02:41.300 --rc geninfo_all_blocks=1 00:02:41.300 --rc geninfo_unexecuted_blocks=1 00:02:41.300 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:41.300 ' 00:02:41.300 00:01:06 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:41.300 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:41.300 --rc genhtml_branch_coverage=1 00:02:41.300 --rc genhtml_function_coverage=1 00:02:41.300 --rc genhtml_legend=1 00:02:41.300 --rc geninfo_all_blocks=1 00:02:41.300 --rc geninfo_unexecuted_blocks=1 00:02:41.300 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:41.300 ' 00:02:41.300 00:01:06 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:41.300 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:41.300 --rc genhtml_branch_coverage=1 00:02:41.300 --rc genhtml_function_coverage=1 00:02:41.300 --rc genhtml_legend=1 00:02:41.300 --rc geninfo_all_blocks=1 00:02:41.300 --rc geninfo_unexecuted_blocks=1 00:02:41.300 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:41.300 ' 00:02:41.300 00:01:06 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:41.300 00:01:06 -- nvmf/common.sh@7 -- # uname -s 00:02:41.300 00:01:06 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:41.300 00:01:06 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:41.300 00:01:06 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:41.300 00:01:06 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:41.300 00:01:06 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:41.300 00:01:06 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:41.300 00:01:06 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:41.300 00:01:06 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:41.300 00:01:06 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:41.300 00:01:06 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:41.300 00:01:06 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:41.300 00:01:06 -- nvmf/common.sh@18 -- # 
NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:41.300 00:01:06 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:41.300 00:01:06 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:41.300 00:01:06 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:41.300 00:01:06 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:41.300 00:01:06 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:41.300 00:01:06 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:41.300 00:01:06 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:41.300 00:01:06 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:41.300 00:01:06 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:41.301 00:01:06 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:41.301 00:01:06 -- paths/export.sh@5 -- # export PATH 00:02:41.301 00:01:06 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:41.301 00:01:06 -- nvmf/common.sh@46 -- # : 0 00:02:41.301 00:01:06 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:41.301 00:01:06 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:41.301 00:01:06 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:41.301 00:01:06 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:41.301 00:01:06 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:41.301 00:01:06 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:02:41.301 00:01:06 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:41.301 00:01:06 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:41.301 00:01:06 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:41.301 00:01:06 -- spdk/autotest.sh@32 -- # uname -s 00:02:41.301 00:01:06 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:41.301 00:01:06 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:41.301 00:01:06 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:41.301 00:01:06 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:41.301 00:01:06 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:41.301 00:01:06 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:41.301 00:01:06 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:41.301 00:01:06 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:41.301 00:01:06 -- spdk/autotest.sh@48 -- # udevadm_pid=2647420 00:02:41.301 00:01:06 -- spdk/autotest.sh@51 -- # mkdir -p 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:41.301 00:01:06 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:41.301 00:01:06 -- spdk/autotest.sh@54 -- # echo 2647422 00:02:41.301 00:01:06 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:41.301 00:01:06 -- spdk/autotest.sh@56 -- # echo 2647423 00:02:41.301 00:01:06 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:41.301 00:01:06 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:02:41.301 00:01:06 -- spdk/autotest.sh@60 -- # echo 2647424 00:02:41.301 00:01:06 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:41.301 00:01:06 -- spdk/autotest.sh@62 -- # echo 2647425 00:02:41.301 00:01:06 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:41.301 00:01:06 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:41.301 00:01:06 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:41.301 00:01:06 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:41.301 00:01:06 -- common/autotest_common.sh@10 -- # set +x 00:02:41.301 00:01:06 -- spdk/autotest.sh@70 -- # create_test_list 00:02:41.301 00:01:06 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:41.301 00:01:06 -- common/autotest_common.sh@10 -- # set +x 00:02:41.301 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:41.559 
Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:41.559 00:01:06 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:41.559 00:01:06 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:41.559 00:01:06 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:41.559 00:01:06 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:41.559 00:01:06 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:41.559 00:01:06 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:41.559 00:01:06 -- common/autotest_common.sh@1450 -- # uname 00:02:41.559 00:01:06 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:02:41.559 00:01:06 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:41.559 00:01:06 -- common/autotest_common.sh@1470 -- # uname 00:02:41.559 00:01:06 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:02:41.559 00:01:06 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:02:41.559 00:01:06 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:02:41.559 lcov: LCOV version 1.15 00:02:41.559 00:01:06 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:02:43.465 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:02:43.465 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:02:43.465 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:02:55.678 00:01:21 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:02:55.678 00:01:21 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:55.678 00:01:21 -- common/autotest_common.sh@10 -- # set +x 00:02:55.678 00:01:21 -- spdk/autotest.sh@89 -- # rm -f 00:02:55.678 00:01:21 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:58.983 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:58.983 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:58.983 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:58.983 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:58.983 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:59.242 0000:80:04.2 (8086 2021): Already using the ioatdma 
driver 00:02:59.242 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:59.501 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:59.501 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:02:59.501 00:01:24 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:02:59.501 00:01:24 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:02:59.501 00:01:24 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:02:59.501 00:01:24 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:02:59.501 00:01:24 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:02:59.501 00:01:24 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:02:59.501 00:01:24 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:02:59.501 00:01:24 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:59.501 00:01:24 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:02:59.501 00:01:24 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:02:59.501 00:01:24 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:02:59.501 00:01:24 -- spdk/autotest.sh@108 -- # grep -v p 00:02:59.501 00:01:24 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:02:59.501 00:01:24 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:02:59.501 00:01:24 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:02:59.501 00:01:24 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:02:59.501 00:01:24 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:59.501 No valid GPT data, bailing 00:02:59.501 00:01:24 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:59.501 00:01:24 -- scripts/common.sh@393 -- # pt= 00:02:59.501 00:01:24 -- scripts/common.sh@394 -- # return 1 00:02:59.501 00:01:24 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:59.501 1+0 records in 00:02:59.501 1+0 records 
out 00:02:59.501 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00573207 s, 183 MB/s 00:02:59.501 00:01:24 -- spdk/autotest.sh@116 -- # sync 00:02:59.501 00:01:24 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:59.501 00:01:24 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:59.501 00:01:24 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:07.633 00:01:31 -- spdk/autotest.sh@122 -- # uname -s 00:03:07.633 00:01:31 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:07.633 00:01:31 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:07.633 00:01:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:07.633 00:01:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:07.633 00:01:31 -- common/autotest_common.sh@10 -- # set +x 00:03:07.633 ************************************ 00:03:07.633 START TEST setup.sh 00:03:07.633 ************************************ 00:03:07.633 00:01:31 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:07.633 * Looking for test storage... 
00:03:07.633 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:07.633 00:01:32 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:07.633 00:01:32 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:07.633 00:01:32 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:07.633 00:01:32 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:07.633 00:01:32 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:07.633 00:01:32 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:07.633 00:01:32 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:07.633 00:01:32 -- scripts/common.sh@335 -- # IFS=.-: 00:03:07.633 00:01:32 -- scripts/common.sh@335 -- # read -ra ver1 00:03:07.633 00:01:32 -- scripts/common.sh@336 -- # IFS=.-: 00:03:07.633 00:01:32 -- scripts/common.sh@336 -- # read -ra ver2 00:03:07.633 00:01:32 -- scripts/common.sh@337 -- # local 'op=<' 00:03:07.633 00:01:32 -- scripts/common.sh@339 -- # ver1_l=2 00:03:07.633 00:01:32 -- scripts/common.sh@340 -- # ver2_l=1 00:03:07.633 00:01:32 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:07.633 00:01:32 -- scripts/common.sh@343 -- # case "$op" in 00:03:07.633 00:01:32 -- scripts/common.sh@344 -- # : 1 00:03:07.633 00:01:32 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:07.633 00:01:32 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:07.633 00:01:32 -- scripts/common.sh@364 -- # decimal 1 00:03:07.633 00:01:32 -- scripts/common.sh@352 -- # local d=1 00:03:07.633 00:01:32 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:07.633 00:01:32 -- scripts/common.sh@354 -- # echo 1 00:03:07.633 00:01:32 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:07.633 00:01:32 -- scripts/common.sh@365 -- # decimal 2 00:03:07.633 00:01:32 -- scripts/common.sh@352 -- # local d=2 00:03:07.633 00:01:32 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:07.633 00:01:32 -- scripts/common.sh@354 -- # echo 2 00:03:07.633 00:01:32 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:07.633 00:01:32 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:07.633 00:01:32 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:07.633 00:01:32 -- scripts/common.sh@367 -- # return 0 00:03:07.633 00:01:32 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:07.633 00:01:32 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:07.633 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:07.633 --rc genhtml_branch_coverage=1 00:03:07.633 --rc genhtml_function_coverage=1 00:03:07.633 --rc genhtml_legend=1 00:03:07.633 --rc geninfo_all_blocks=1 00:03:07.633 --rc geninfo_unexecuted_blocks=1 00:03:07.633 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:07.633 ' 00:03:07.633 00:01:32 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:07.633 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:07.633 --rc genhtml_branch_coverage=1 00:03:07.633 --rc genhtml_function_coverage=1 00:03:07.633 --rc genhtml_legend=1 00:03:07.633 --rc geninfo_all_blocks=1 00:03:07.633 --rc geninfo_unexecuted_blocks=1 00:03:07.633 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:07.633 ' 00:03:07.633 00:01:32 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:07.633 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:07.633 --rc genhtml_branch_coverage=1 00:03:07.633 --rc genhtml_function_coverage=1 00:03:07.633 --rc genhtml_legend=1 00:03:07.633 --rc geninfo_all_blocks=1 00:03:07.633 --rc geninfo_unexecuted_blocks=1 00:03:07.633 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:07.633 ' 00:03:07.633 00:01:32 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:07.633 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:07.633 --rc genhtml_branch_coverage=1 00:03:07.633 --rc genhtml_function_coverage=1 00:03:07.633 --rc genhtml_legend=1 00:03:07.633 --rc geninfo_all_blocks=1 00:03:07.633 --rc geninfo_unexecuted_blocks=1 00:03:07.633 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:07.633 ' 00:03:07.633 00:01:32 -- setup/test-setup.sh@10 -- # uname -s 00:03:07.633 00:01:32 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:07.633 00:01:32 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:07.633 00:01:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:07.633 00:01:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:07.633 00:01:32 -- common/autotest_common.sh@10 -- # set +x 00:03:07.633 ************************************ 00:03:07.633 START TEST acl 00:03:07.633 ************************************ 00:03:07.633 00:01:32 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:07.633 * Looking for test storage... 
00:03:07.633 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:07.633 00:01:32 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:07.633 00:01:32 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:07.633 00:01:32 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:07.633 00:01:32 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:07.633 00:01:32 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:07.633 00:01:32 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:07.633 00:01:32 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:07.633 00:01:32 -- scripts/common.sh@335 -- # IFS=.-: 00:03:07.633 00:01:32 -- scripts/common.sh@335 -- # read -ra ver1 00:03:07.633 00:01:32 -- scripts/common.sh@336 -- # IFS=.-: 00:03:07.633 00:01:32 -- scripts/common.sh@336 -- # read -ra ver2 00:03:07.633 00:01:32 -- scripts/common.sh@337 -- # local 'op=<' 00:03:07.633 00:01:32 -- scripts/common.sh@339 -- # ver1_l=2 00:03:07.633 00:01:32 -- scripts/common.sh@340 -- # ver2_l=1 00:03:07.633 00:01:32 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:07.633 00:01:32 -- scripts/common.sh@343 -- # case "$op" in 00:03:07.633 00:01:32 -- scripts/common.sh@344 -- # : 1 00:03:07.633 00:01:32 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:07.633 00:01:32 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:07.633 00:01:32 -- scripts/common.sh@364 -- # decimal 1 00:03:07.633 00:01:32 -- scripts/common.sh@352 -- # local d=1 00:03:07.633 00:01:32 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:07.633 00:01:32 -- scripts/common.sh@354 -- # echo 1 00:03:07.634 00:01:32 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:07.634 00:01:32 -- scripts/common.sh@365 -- # decimal 2 00:03:07.634 00:01:32 -- scripts/common.sh@352 -- # local d=2 00:03:07.634 00:01:32 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:07.634 00:01:32 -- scripts/common.sh@354 -- # echo 2 00:03:07.634 00:01:32 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:07.634 00:01:32 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:07.634 00:01:32 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:07.634 00:01:32 -- scripts/common.sh@367 -- # return 0 00:03:07.634 00:01:32 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:07.634 00:01:32 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:07.634 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:07.634 --rc genhtml_branch_coverage=1 00:03:07.634 --rc genhtml_function_coverage=1 00:03:07.634 --rc genhtml_legend=1 00:03:07.634 --rc geninfo_all_blocks=1 00:03:07.634 --rc geninfo_unexecuted_blocks=1 00:03:07.634 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:07.634 ' 00:03:07.634 00:01:32 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:07.634 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:07.634 --rc genhtml_branch_coverage=1 00:03:07.634 --rc genhtml_function_coverage=1 00:03:07.634 --rc genhtml_legend=1 00:03:07.634 --rc geninfo_all_blocks=1 00:03:07.634 --rc geninfo_unexecuted_blocks=1 00:03:07.634 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:07.634 ' 00:03:07.634 00:01:32 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:07.634 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:07.634 --rc genhtml_branch_coverage=1 00:03:07.634 --rc genhtml_function_coverage=1 00:03:07.634 --rc genhtml_legend=1 00:03:07.634 --rc geninfo_all_blocks=1 00:03:07.634 --rc geninfo_unexecuted_blocks=1 00:03:07.634 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:07.634 ' 00:03:07.634 00:01:32 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:07.634 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:07.634 --rc genhtml_branch_coverage=1 00:03:07.634 --rc genhtml_function_coverage=1 00:03:07.634 --rc genhtml_legend=1 00:03:07.634 --rc geninfo_all_blocks=1 00:03:07.634 --rc geninfo_unexecuted_blocks=1 00:03:07.634 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:07.634 ' 00:03:07.634 00:01:32 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:07.634 00:01:32 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:07.634 00:01:32 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:07.634 00:01:32 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:07.634 00:01:32 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:07.634 00:01:32 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:07.634 00:01:32 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:07.634 00:01:32 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:07.634 00:01:32 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:07.634 00:01:32 -- setup/acl.sh@12 -- # devs=() 00:03:07.634 00:01:32 -- setup/acl.sh@12 -- # declare -a devs 00:03:07.634 00:01:32 -- setup/acl.sh@13 -- # drivers=() 00:03:07.634 00:01:32 -- setup/acl.sh@13 -- # declare -A drivers 00:03:07.634 00:01:32 -- setup/acl.sh@51 -- # setup reset 00:03:07.634 00:01:32 -- setup/common.sh@9 
-- # [[ reset == output ]] 00:03:07.634 00:01:32 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:11.017 00:01:35 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:11.017 00:01:35 -- setup/acl.sh@16 -- # local dev driver 00:03:11.017 00:01:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:11.017 00:01:35 -- setup/acl.sh@15 -- # setup output status 00:03:11.017 00:01:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:11.017 00:01:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:13.553 Hugepages 00:03:13.553 node hugesize free / total 00:03:13.553 00:01:38 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:13.553 00:01:38 -- setup/acl.sh@19 -- # continue 00:03:13.553 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.553 00:01:38 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:13.553 00:01:38 -- setup/acl.sh@19 -- # continue 00:03:13.553 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.553 00:01:38 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:13.553 00:01:38 -- setup/acl.sh@19 -- # continue 00:03:13.553 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.553 00:03:13.554 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 
00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- 
setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # continue 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- 
setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:13.554 00:01:38 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:13.554 00:01:38 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:13.554 00:01:38 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:13.554 00:01:38 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:13.554 00:01:38 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:13.554 00:01:38 -- setup/acl.sh@54 -- # run_test denied denied 00:03:13.554 00:01:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:13.554 00:01:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:13.554 00:01:38 -- common/autotest_common.sh@10 -- # set +x 00:03:13.554 ************************************ 00:03:13.554 START TEST denied 00:03:13.554 ************************************ 00:03:13.554 00:01:38 -- common/autotest_common.sh@1114 -- # denied 00:03:13.554 00:01:38 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:13.554 00:01:38 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:13.554 00:01:38 -- setup/acl.sh@38 -- # setup output config 00:03:13.554 00:01:38 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:13.554 00:01:38 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:17.762 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:17.762 00:01:42 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:17.762 00:01:42 -- setup/acl.sh@28 -- # local dev driver 00:03:17.762 00:01:42 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:17.762 00:01:42 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:17.762 00:01:42 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:17.762 00:01:42 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:17.762 00:01:42 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 
00:03:17.762 00:01:42 -- setup/acl.sh@41 -- # setup reset 00:03:17.762 00:01:42 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:17.762 00:01:42 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:21.981 00:03:21.981 real 0m8.303s 00:03:21.981 user 0m2.612s 00:03:21.981 sys 0m5.083s 00:03:21.981 00:01:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:21.981 00:01:47 -- common/autotest_common.sh@10 -- # set +x 00:03:21.981 ************************************ 00:03:21.981 END TEST denied 00:03:21.981 ************************************ 00:03:21.981 00:01:47 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:21.981 00:01:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:21.981 00:01:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:21.981 00:01:47 -- common/autotest_common.sh@10 -- # set +x 00:03:21.981 ************************************ 00:03:21.981 START TEST allowed 00:03:21.981 ************************************ 00:03:21.981 00:01:47 -- common/autotest_common.sh@1114 -- # allowed 00:03:21.981 00:01:47 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:21.981 00:01:47 -- setup/acl.sh@45 -- # setup output config 00:03:21.981 00:01:47 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:21.981 00:01:47 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:21.981 00:01:47 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:27.261 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:27.261 00:01:52 -- setup/acl.sh@47 -- # verify 00:03:27.261 00:01:52 -- setup/acl.sh@28 -- # local dev driver 00:03:27.261 00:01:52 -- setup/acl.sh@48 -- # setup reset 00:03:27.261 00:01:52 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:27.261 00:01:52 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:30.564 00:03:30.564 real 0m8.482s 
00:03:30.564 user 0m2.237s 00:03:30.564 sys 0m4.774s 00:03:30.564 00:01:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:30.564 00:01:55 -- common/autotest_common.sh@10 -- # set +x 00:03:30.564 ************************************ 00:03:30.564 END TEST allowed 00:03:30.564 ************************************ 00:03:30.564 00:03:30.564 real 0m23.660s 00:03:30.564 user 0m7.174s 00:03:30.564 sys 0m14.578s 00:03:30.564 00:01:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:30.564 00:01:55 -- common/autotest_common.sh@10 -- # set +x 00:03:30.564 ************************************ 00:03:30.564 END TEST acl 00:03:30.564 ************************************ 00:03:30.564 00:01:55 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:30.564 00:01:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:30.564 00:01:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:30.564 00:01:55 -- common/autotest_common.sh@10 -- # set +x 00:03:30.564 ************************************ 00:03:30.564 START TEST hugepages 00:03:30.564 ************************************ 00:03:30.564 00:01:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:30.564 * Looking for test storage... 
00:03:30.564 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:30.564 00:01:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:30.564 00:01:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:30.564 00:01:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:30.564 00:01:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:30.564 00:01:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:30.564 00:01:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:30.564 00:01:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:30.564 00:01:56 -- scripts/common.sh@335 -- # IFS=.-: 00:03:30.564 00:01:56 -- scripts/common.sh@335 -- # read -ra ver1 00:03:30.564 00:01:56 -- scripts/common.sh@336 -- # IFS=.-: 00:03:30.564 00:01:56 -- scripts/common.sh@336 -- # read -ra ver2 00:03:30.564 00:01:56 -- scripts/common.sh@337 -- # local 'op=<' 00:03:30.564 00:01:56 -- scripts/common.sh@339 -- # ver1_l=2 00:03:30.564 00:01:56 -- scripts/common.sh@340 -- # ver2_l=1 00:03:30.565 00:01:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:30.565 00:01:56 -- scripts/common.sh@343 -- # case "$op" in 00:03:30.565 00:01:56 -- scripts/common.sh@344 -- # : 1 00:03:30.565 00:01:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:30.565 00:01:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:30.565 00:01:56 -- scripts/common.sh@364 -- # decimal 1 00:03:30.565 00:01:56 -- scripts/common.sh@352 -- # local d=1 00:03:30.565 00:01:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:30.565 00:01:56 -- scripts/common.sh@354 -- # echo 1 00:03:30.565 00:01:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:30.565 00:01:56 -- scripts/common.sh@365 -- # decimal 2 00:03:30.565 00:01:56 -- scripts/common.sh@352 -- # local d=2 00:03:30.565 00:01:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:30.565 00:01:56 -- scripts/common.sh@354 -- # echo 2 00:03:30.565 00:01:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:30.565 00:01:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:30.565 00:01:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:30.565 00:01:56 -- scripts/common.sh@367 -- # return 0 00:03:30.565 00:01:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:30.565 00:01:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:30.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:30.565 --rc genhtml_branch_coverage=1 00:03:30.565 --rc genhtml_function_coverage=1 00:03:30.565 --rc genhtml_legend=1 00:03:30.565 --rc geninfo_all_blocks=1 00:03:30.565 --rc geninfo_unexecuted_blocks=1 00:03:30.565 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:30.565 ' 00:03:30.565 00:01:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:30.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:30.565 --rc genhtml_branch_coverage=1 00:03:30.565 --rc genhtml_function_coverage=1 00:03:30.565 --rc genhtml_legend=1 00:03:30.565 --rc geninfo_all_blocks=1 00:03:30.565 --rc geninfo_unexecuted_blocks=1 00:03:30.565 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:30.565 ' 00:03:30.565 00:01:56 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:30.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:30.565 --rc genhtml_branch_coverage=1 00:03:30.565 --rc genhtml_function_coverage=1 00:03:30.565 --rc genhtml_legend=1 00:03:30.565 --rc geninfo_all_blocks=1 00:03:30.565 --rc geninfo_unexecuted_blocks=1 00:03:30.565 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:30.565 ' 00:03:30.565 00:01:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:30.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:30.565 --rc genhtml_branch_coverage=1 00:03:30.565 --rc genhtml_function_coverage=1 00:03:30.565 --rc genhtml_legend=1 00:03:30.565 --rc geninfo_all_blocks=1 00:03:30.565 --rc geninfo_unexecuted_blocks=1 00:03:30.565 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:30.565 ' 00:03:30.565 00:01:56 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:30.565 00:01:56 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:30.565 00:01:56 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:30.565 00:01:56 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:30.565 00:01:56 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:30.565 00:01:56 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:30.565 00:01:56 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:30.565 00:01:56 -- setup/common.sh@18 -- # local node= 00:03:30.565 00:01:56 -- setup/common.sh@19 -- # local var val 00:03:30.565 00:01:56 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.565 00:01:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.565 00:01:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:30.565 00:01:56 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:30.565 00:01:56 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.565 00:01:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41389636 kB' 'MemAvailable: 43026252 kB' 'Buffers: 6816 kB' 'Cached: 9254900 kB' 'SwapCached: 248 kB' 'Active: 6702168 kB' 'Inactive: 3151864 kB' 'Active(anon): 5794488 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595340 kB' 'Mapped: 127172 kB' 'Shmem: 7512812 kB' 'KReclaimable: 584512 kB' 'Slab: 1591552 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1007040 kB' 'KernelStack: 21856 kB' 'PageTables: 8360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 10050100 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217924 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- 
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Zswap == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.565 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.565 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # 
continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 
-- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 
00:01:56 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # continue 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.566 00:01:56 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.566 00:01:56 -- setup/common.sh@32 -- # [[ 
Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:30.566 00:01:56 -- setup/common.sh@33 -- # echo 2048 00:03:30.566 00:01:56 -- setup/common.sh@33 -- # return 0 00:03:30.566 00:01:56 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:30.566 00:01:56 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:30.566 00:01:56 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:30.566 00:01:56 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:30.566 00:01:56 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:30.566 00:01:56 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:30.566 00:01:56 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:30.566 00:01:56 -- setup/hugepages.sh@207 -- # get_nodes 00:03:30.566 00:01:56 -- setup/hugepages.sh@27 -- # local node 00:03:30.566 00:01:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:30.566 00:01:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:30.566 00:01:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:30.566 00:01:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:30.566 00:01:56 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:30.566 00:01:56 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:30.566 00:01:56 -- setup/hugepages.sh@208 -- # clear_hp 00:03:30.566 00:01:56 -- setup/hugepages.sh@37 -- # local node hp 00:03:30.566 00:01:56 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:30.566 00:01:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:30.566 00:01:56 -- setup/hugepages.sh@41 -- # echo 0 00:03:30.566 00:01:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:30.566 00:01:56 -- setup/hugepages.sh@41 -- # echo 0 00:03:30.566 00:01:56 -- setup/hugepages.sh@39 -- # for node in 
"${!nodes_sys[@]}" 00:03:30.566 00:01:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:30.566 00:01:56 -- setup/hugepages.sh@41 -- # echo 0 00:03:30.567 00:01:56 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:30.567 00:01:56 -- setup/hugepages.sh@41 -- # echo 0 00:03:30.567 00:01:56 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:30.567 00:01:56 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:30.567 00:01:56 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:30.567 00:01:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:30.567 00:01:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:30.567 00:01:56 -- common/autotest_common.sh@10 -- # set +x 00:03:30.567 ************************************ 00:03:30.567 START TEST default_setup 00:03:30.567 ************************************ 00:03:30.567 00:01:56 -- common/autotest_common.sh@1114 -- # default_setup 00:03:30.567 00:01:56 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:30.567 00:01:56 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:30.567 00:01:56 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:30.567 00:01:56 -- setup/hugepages.sh@51 -- # shift 00:03:30.567 00:01:56 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:30.567 00:01:56 -- setup/hugepages.sh@52 -- # local node_ids 00:03:30.567 00:01:56 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:30.567 00:01:56 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:30.567 00:01:56 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:30.567 00:01:56 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:30.567 00:01:56 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:30.567 00:01:56 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:30.567 00:01:56 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:30.567 00:01:56 
-- setup/hugepages.sh@67 -- # nodes_test=() 00:03:30.567 00:01:56 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:30.567 00:01:56 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:30.567 00:01:56 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:30.567 00:01:56 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:30.567 00:01:56 -- setup/hugepages.sh@73 -- # return 0 00:03:30.567 00:01:56 -- setup/hugepages.sh@137 -- # setup output 00:03:30.567 00:01:56 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:30.567 00:01:56 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:33.857 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:33.857 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:33.857 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:33.858 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:34.117 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:35.496 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:35.759 00:02:01 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:35.759 00:02:01 -- setup/hugepages.sh@89 -- # local node 00:03:35.759 00:02:01 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:35.759 00:02:01 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:35.759 00:02:01 -- setup/hugepages.sh@92 -- # local surp 
00:03:35.759 00:02:01 -- setup/hugepages.sh@93 -- # local resv 00:03:35.759 00:02:01 -- setup/hugepages.sh@94 -- # local anon 00:03:35.759 00:02:01 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:35.759 00:02:01 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:35.759 00:02:01 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:35.759 00:02:01 -- setup/common.sh@18 -- # local node= 00:03:35.759 00:02:01 -- setup/common.sh@19 -- # local var val 00:03:35.759 00:02:01 -- setup/common.sh@20 -- # local mem_f mem 00:03:35.759 00:02:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.759 00:02:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.759 00:02:01 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.759 00:02:01 -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.759 00:02:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43560752 kB' 'MemAvailable: 45197368 kB' 'Buffers: 6816 kB' 'Cached: 9255040 kB' 'SwapCached: 248 kB' 'Active: 6704644 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796964 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598044 kB' 'Mapped: 127244 kB' 'Shmem: 7512952 kB' 'KReclaimable: 584512 kB' 'Slab: 1589932 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005420 kB' 'KernelStack: 22032 kB' 'PageTables: 9052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10053876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217972 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 
kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 
-- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ KernelStack == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.759 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.759 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.760 00:02:01 -- setup/common.sh@33 -- # echo 0 00:03:35.760 00:02:01 -- setup/common.sh@33 -- # return 0 00:03:35.760 00:02:01 -- setup/hugepages.sh@97 -- # anon=0 00:03:35.760 00:02:01 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:35.760 00:02:01 -- setup/common.sh@17 -- # local 
get=HugePages_Surp 00:03:35.760 00:02:01 -- setup/common.sh@18 -- # local node= 00:03:35.760 00:02:01 -- setup/common.sh@19 -- # local var val 00:03:35.760 00:02:01 -- setup/common.sh@20 -- # local mem_f mem 00:03:35.760 00:02:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.760 00:02:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.760 00:02:01 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.760 00:02:01 -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.760 00:02:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43563104 kB' 'MemAvailable: 45199720 kB' 'Buffers: 6816 kB' 'Cached: 9255040 kB' 'SwapCached: 248 kB' 'Active: 6704980 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797300 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598268 kB' 'Mapped: 127332 kB' 'Shmem: 7512952 kB' 'KReclaimable: 584512 kB' 'Slab: 1589956 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005444 kB' 'KernelStack: 22144 kB' 'PageTables: 8992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10053888 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217988 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 
'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r 
var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 
00:02:01 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.760 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.760 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- 
setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- 
setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- 
# continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.761 00:02:01 -- setup/common.sh@33 -- # echo 0 00:03:35.761 00:02:01 -- setup/common.sh@33 -- # return 0 00:03:35.761 00:02:01 -- setup/hugepages.sh@99 -- # surp=0 00:03:35.761 00:02:01 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:35.761 00:02:01 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:35.761 00:02:01 -- setup/common.sh@18 -- # local node= 00:03:35.761 00:02:01 -- setup/common.sh@19 -- # local var val 00:03:35.761 00:02:01 -- setup/common.sh@20 -- # local mem_f mem 00:03:35.761 00:02:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.761 00:02:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.761 00:02:01 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.761 00:02:01 -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.761 00:02:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43563584 kB' 'MemAvailable: 45200200 kB' 'Buffers: 6816 kB' 'Cached: 9255040 kB' 'SwapCached: 248 kB' 'Active: 6704128 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796448 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 
0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597360 kB' 'Mapped: 127224 kB' 'Shmem: 7512952 kB' 'KReclaimable: 584512 kB' 'Slab: 1589916 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005404 kB' 'KernelStack: 22080 kB' 'PageTables: 8900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10052388 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217988 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.761 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.761 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 
00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- 
setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 
00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- 
setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- 
# continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.762 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.762 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 
00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.763 00:02:01 -- setup/common.sh@33 -- # echo 0 00:03:35.763 00:02:01 -- setup/common.sh@33 -- # return 0 00:03:35.763 00:02:01 -- setup/hugepages.sh@100 -- # resv=0 00:03:35.763 00:02:01 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:35.763 nr_hugepages=1024 00:03:35.763 00:02:01 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:35.763 resv_hugepages=0 00:03:35.763 00:02:01 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:35.763 surplus_hugepages=0 00:03:35.763 00:02:01 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:35.763 anon_hugepages=0 00:03:35.763 00:02:01 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:35.763 00:02:01 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:35.763 00:02:01 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:35.763 00:02:01 -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:03:35.763 00:02:01 -- setup/common.sh@18 -- # local node= 00:03:35.763 00:02:01 -- setup/common.sh@19 -- # local var val 00:03:35.763 00:02:01 -- setup/common.sh@20 -- # local mem_f mem 00:03:35.763 00:02:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.763 00:02:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.763 00:02:01 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.763 00:02:01 -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.763 00:02:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43561784 kB' 'MemAvailable: 45198400 kB' 'Buffers: 6816 kB' 'Cached: 9255040 kB' 'SwapCached: 248 kB' 'Active: 6704464 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796784 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597664 kB' 'Mapped: 127224 kB' 'Shmem: 7512952 kB' 'KReclaimable: 584512 kB' 'Slab: 1589916 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005404 kB' 'KernelStack: 22176 kB' 'PageTables: 8924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10053916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218004 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 
'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 
-- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read 
-r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.763 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.763 00:02:01 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 
-- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # 
[[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ 
VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ 
ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.764 00:02:01 -- setup/common.sh@33 -- # echo 1024 00:03:35.764 00:02:01 -- setup/common.sh@33 -- # return 0 00:03:35.764 00:02:01 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:35.764 00:02:01 -- 
setup/hugepages.sh@112 -- # get_nodes 00:03:35.764 00:02:01 -- setup/hugepages.sh@27 -- # local node 00:03:35.764 00:02:01 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:35.764 00:02:01 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:35.764 00:02:01 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:35.764 00:02:01 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:35.764 00:02:01 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:35.764 00:02:01 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:35.764 00:02:01 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:35.764 00:02:01 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:35.764 00:02:01 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:35.764 00:02:01 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:35.764 00:02:01 -- setup/common.sh@18 -- # local node=0 00:03:35.764 00:02:01 -- setup/common.sh@19 -- # local var val 00:03:35.764 00:02:01 -- setup/common.sh@20 -- # local mem_f mem 00:03:35.764 00:02:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.764 00:02:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:35.764 00:02:01 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:35.764 00:02:01 -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.764 00:02:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.764 00:02:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23222288 kB' 'MemUsed: 9412148 kB' 'SwapCached: 148 kB' 'Active: 4400308 kB' 'Inactive: 535724 kB' 'Active(anon): 3622544 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 
kB' 'Writeback: 0 kB' 'FilePages: 4673052 kB' 'Mapped: 70000 kB' 'AnonPages: 266312 kB' 'Shmem: 3359936 kB' 'KernelStack: 10808 kB' 'PageTables: 4580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 401388 kB' 'Slab: 886616 kB' 'SReclaimable: 401388 kB' 'SUnreclaim: 485228 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.764 00:02:01 -- setup/common.sh@32 -- # continue 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:35.764 00:02:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:35.765 00:02:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.765 00:02:01 -- setup/common.sh@33 -- # echo 0 00:03:35.765 00:02:01 -- setup/common.sh@33 -- # return 0 00:03:35.765 00:02:01 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:35.765 00:02:01 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:35.765 00:02:01 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:35.765 00:02:01 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:35.765 00:02:01 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:35.765 node0=1024 expecting 1024 00:03:35.765 00:02:01 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:35.765 00:03:35.765 real 0m5.121s 00:03:35.765 user 0m1.387s 00:03:35.765 sys 0m2.367s 00:03:35.765 00:02:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:35.765 00:02:01 -- common/autotest_common.sh@10 -- # set +x 00:03:35.765 ************************************ 00:03:35.765 END TEST default_setup 00:03:35.765 ************************************ 00:03:35.765 00:02:01 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:35.765 00:02:01 -- common/autotest_common.sh@1087
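The `node0=1024 expecting 1024` line is hugepages.sh summing each node's reported hugepages (plus reserved, plus surplus from get_meminfo) and comparing against the expected count. A reduced sketch of that per-node readback, assuming the standard /sys NUMA layout; the resv/surp bookkeeping of the real script is collapsed into a plain `Total - Surp` here:

```shell
#!/usr/bin/env bash
# Reduced sketch of the per-node hugepage readback behind
# "node0=1024 expecting 1024": report HugePages_Total minus HugePages_Surp
# for every NUMA node the kernel exposes under /sys.
report_node_hugepages() {
    local dir node total surp
    for dir in /sys/devices/system/node/node[0-9]*; do
        [[ -e $dir/meminfo ]] || continue
        node=${dir##*node}
        total=$(awk '/HugePages_Total/ {print $NF}' "$dir/meminfo")
        surp=$(awk '/HugePages_Surp/ {print $NF}' "$dir/meminfo")
        echo "node$node=$((total - surp))"
    done
}

report_node_hugepages
```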
-- # '[' 2 -le 1 ']' 00:03:35.765 00:02:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:35.765 00:02:01 -- common/autotest_common.sh@10 -- # set +x 00:03:35.765 ************************************ 00:03:35.765 START TEST per_node_1G_alloc 00:03:35.765 ************************************ 00:03:35.765 00:02:01 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc 00:03:35.765 00:02:01 -- setup/hugepages.sh@143 -- # local IFS=, 00:03:35.766 00:02:01 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:35.766 00:02:01 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:35.766 00:02:01 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:35.766 00:02:01 -- setup/hugepages.sh@51 -- # shift 00:03:35.766 00:02:01 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:35.766 00:02:01 -- setup/hugepages.sh@52 -- # local node_ids 00:03:35.766 00:02:01 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:35.766 00:02:01 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:35.766 00:02:01 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:35.766 00:02:01 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:35.766 00:02:01 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:35.766 00:02:01 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:35.766 00:02:01 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:35.766 00:02:01 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:35.766 00:02:01 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:35.766 00:02:01 -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:35.766 00:02:01 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:35.766 00:02:01 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:35.766 00:02:01 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:35.766 00:02:01 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:35.766 00:02:01 -- setup/hugepages.sh@73 -- # return 0 
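The get_test_nr_hugepages trace above converts the requested allocation (size=1048576 kB) into a per-node page count: 1048576 / 2048 = 512 pages for each of nodes 0 and 1. The arithmetic, spelled out with the values from the log:

```shell
#!/usr/bin/env bash
# The per-node computation from the get_test_nr_hugepages trace:
# a 1 GiB (1048576 kB) request divided by the default 2048 kB hugepage
# size yields 512 pages, assigned to each requested node.
default_hugepages=2048                 # kB, Hugepagesize from /proc/meminfo
size=1048576                           # kB, the size argument in the trace
nr_hugepages=$((size / default_hugepages))

declare -A nodes_test
for node in 0 1; do                    # HUGENODE=0,1 in the log
    nodes_test[$node]=$nr_hugepages
done

echo "nr_hugepages=$nr_hugepages node0=${nodes_test[0]} node1=${nodes_test[1]}"
# prints: nr_hugepages=512 node0=512 node1=512
```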
00:03:35.766 00:02:01 -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:35.766 00:02:01 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:35.766 00:02:01 -- setup/hugepages.sh@146 -- # setup output 00:03:35.766 00:02:01 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:35.766 00:02:01 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:39.077 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:39.077 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:39.077 00:02:04 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:39.077 00:02:04 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:39.077 00:02:04 -- setup/hugepages.sh@89 -- # local node 00:03:39.077 00:02:04 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:39.077 00:02:04 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:39.077 00:02:04 -- 
setup/hugepages.sh@92 -- # local surp 00:03:39.077 00:02:04 -- setup/hugepages.sh@93 -- # local resv 00:03:39.077 00:02:04 -- setup/hugepages.sh@94 -- # local anon 00:03:39.077 00:02:04 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:39.077 00:02:04 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:39.077 00:02:04 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:39.077 00:02:04 -- setup/common.sh@18 -- # local node= 00:03:39.078 00:02:04 -- setup/common.sh@19 -- # local var val 00:03:39.078 00:02:04 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.078 00:02:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.078 00:02:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.078 00:02:04 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.078 00:02:04 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.078 00:02:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.078 00:02:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43590492 kB' 'MemAvailable: 45227108 kB' 'Buffers: 6816 kB' 'Cached: 9255152 kB' 'SwapCached: 248 kB' 'Active: 6704524 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796844 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597472 kB' 'Mapped: 127268 kB' 'Shmem: 7513064 kB' 'KReclaimable: 584512 kB' 'Slab: 1590104 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005592 kB' 'KernelStack: 22000 kB' 'PageTables: 8888 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10054760 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 
'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.078 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.078 
00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.078 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.079 00:02:04 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.079 00:02:04 -- setup/common.sh@33 -- # echo 0 00:03:39.079 00:02:04 -- setup/common.sh@33 -- # return 0 00:03:39.079 00:02:04 -- setup/hugepages.sh@97 -- # anon=0 00:03:39.079 00:02:04 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:39.079 00:02:04 --
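The hugepages.sh@96 check earlier in this trace (`[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]`) gates the AnonHugePages readback on transparent hugepages not having `[never]` selected in the sysfs knob; only then is the field read (here it was 0 kB, hence `anon=0` at @97). A sketch of that gate:

```shell
#!/usr/bin/env bash
# Sketch of the hugepages.sh@96 gate: AnonHugePages is only read when the
# transparent_hugepage sysfs knob does not have "[never]" selected.
thp=/sys/kernel/mm/transparent_hugepage/enabled
anon=0
if [[ -r $thp && $(<"$thp") != *"[never]"* ]]; then
    anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
fi
echo "anon=$anon"
```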
setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.079 00:02:04 -- setup/common.sh@18 -- # local node= 00:03:39.079 00:02:04 -- setup/common.sh@19 -- # local var val 00:03:39.079 00:02:04 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.079 00:02:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.079 00:02:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.079 00:02:04 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.079 00:02:04 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.079 00:02:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.079 00:02:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43589416 kB' 'MemAvailable: 45226032 kB' 'Buffers: 6816 kB' 'Cached: 9255156 kB' 'SwapCached: 248 kB' 'Active: 6705152 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797472 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598208 kB' 'Mapped: 127228 kB' 'Shmem: 7513068 kB' 'KReclaimable: 584512 kB' 'Slab: 1590136 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005624 kB' 'KernelStack: 22112 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10054772 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:39.079 00:02:04 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:39.079 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.079 00:02:04 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.079 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.079 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.079 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.079 00:02:04 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.079 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.079 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.079 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.079 00:02:04 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.079 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- 
setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.080 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.080 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ VmallocTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.081 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.081 00:02:04 -- setup/common.sh@33 -- # echo 0 00:03:39.081 00:02:04 -- setup/common.sh@33 -- # return 0 00:03:39.081 00:02:04 -- setup/hugepages.sh@99 -- # surp=0 00:03:39.081 00:02:04 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:39.081 00:02:04 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:39.081 00:02:04 -- setup/common.sh@18 -- # local node= 00:03:39.081 00:02:04 -- setup/common.sh@19 -- # local var val 00:03:39.081 00:02:04 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.081 00:02:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.081 00:02:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.081 00:02:04 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.081 00:02:04 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.081 00:02:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.081 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43589308 kB' 'MemAvailable: 45225924 kB' 'Buffers: 6816 kB' 'Cached: 9255168 kB' 'SwapCached: 248 kB' 'Active: 6704692 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797012 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 
'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597644 kB' 'Mapped: 127228 kB' 'Shmem: 7513080 kB' 'KReclaimable: 584512 kB' 'Slab: 1590136 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005624 kB' 'KernelStack: 22080 kB' 'PageTables: 8732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10054788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 
00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.082 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.082 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 
00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.083 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.083 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.084 00:02:04 -- setup/common.sh@33 -- # echo 0 00:03:39.084 00:02:04 -- setup/common.sh@33 -- # return 0 00:03:39.084 00:02:04 -- setup/hugepages.sh@100 -- # resv=0 00:03:39.084 00:02:04 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:39.084 nr_hugepages=1024 00:03:39.084 00:02:04 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:39.084 resv_hugepages=0 00:03:39.084 00:02:04 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:39.084 surplus_hugepages=0 00:03:39.084 00:02:04 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:39.084 anon_hugepages=0 00:03:39.084 00:02:04 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.084 00:02:04 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:39.084 00:02:04 -- setup/hugepages.sh@110 -- # 
get_meminfo HugePages_Total 00:03:39.084 00:02:04 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:39.084 00:02:04 -- setup/common.sh@18 -- # local node= 00:03:39.084 00:02:04 -- setup/common.sh@19 -- # local var val 00:03:39.084 00:02:04 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.084 00:02:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.084 00:02:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.084 00:02:04 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.084 00:02:04 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.084 00:02:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43589728 kB' 'MemAvailable: 45226344 kB' 'Buffers: 6816 kB' 'Cached: 9255168 kB' 'SwapCached: 248 kB' 'Active: 6705344 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797664 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598256 kB' 'Mapped: 127228 kB' 'Shmem: 7513080 kB' 'KReclaimable: 584512 kB' 'Slab: 1590136 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005624 kB' 'KernelStack: 22032 kB' 'PageTables: 8708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10054552 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 
'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 
00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.084 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.084 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 
00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.085 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.085 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 
00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 
00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.086 00:02:04 -- setup/common.sh@33 -- # echo 1024 00:03:39.086 00:02:04 -- setup/common.sh@33 -- # return 0 
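The trace above is `get_meminfo` scanning a meminfo-style file key by key: it picks `/proc/meminfo` (or the per-node `/sys/devices/system/node/node<N>/meminfo` when a node is given), strips the `Node <n> ` prefix that per-node files carry, then reads `key: value` pairs until the requested key matches and echoes its value. A minimal sketch of that logic, assuming a simplified stand-in (`meminfo_get` is an illustrative name, not the actual `setup/common.sh` function):

```shell
# meminfo_get KEY [FILE] -- print the value for KEY from a meminfo-style
# file (FILE defaults to /proc/meminfo). Per-node files under
# /sys/devices/system/node/node*/meminfo prefix each line with "Node <n> ",
# so strip that first, then match on the key column.
# NOTE: illustrative sketch, not the traced get_meminfo implementation.
meminfo_get() {
    local key=$1 file=${2:-/proc/meminfo}
    sed -E 's/^Node [0-9]+ //' "$file" |
        awk -v k="$key:" '$1 == k { print $2; exit }'
}
```

Called as `meminfo_get HugePages_Total`, this yields the same `1024` the trace returns here, without the per-line `continue` churn that bash xtrace makes so verbose.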
00:03:39.086 00:02:04 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.086 00:02:04 -- setup/hugepages.sh@112 -- # get_nodes 00:03:39.086 00:02:04 -- setup/hugepages.sh@27 -- # local node 00:03:39.086 00:02:04 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.086 00:02:04 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:39.086 00:02:04 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.086 00:02:04 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:39.086 00:02:04 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:39.086 00:02:04 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:39.086 00:02:04 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:39.086 00:02:04 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:39.086 00:02:04 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:39.086 00:02:04 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.086 00:02:04 -- setup/common.sh@18 -- # local node=0 00:03:39.086 00:02:04 -- setup/common.sh@19 -- # local var val 00:03:39.086 00:02:04 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.086 00:02:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.086 00:02:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:39.086 00:02:04 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:39.086 00:02:04 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.086 00:02:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24278880 kB' 'MemUsed: 8355556 kB' 'SwapCached: 148 kB' 'Active: 4399664 kB' 'Inactive: 535724 kB' 'Active(anon): 3621900 kB' 
'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4673124 kB' 'Mapped: 70004 kB' 'AnonPages: 265400 kB' 'Shmem: 3360008 kB' 'KernelStack: 10744 kB' 'PageTables: 4384 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 401388 kB' 'Slab: 886572 kB' 'SReclaimable: 401388 kB' 'SUnreclaim: 485184 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.086 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.086 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 
-- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- 
# read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ 
SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.087 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.087 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.088 00:02:04 -- setup/common.sh@33 -- # echo 0 00:03:39.088 00:02:04 -- setup/common.sh@33 -- # return 0 00:03:39.088 00:02:04 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:39.088 00:02:04 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:39.088 00:02:04 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:39.088 00:02:04 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:39.088 00:02:04 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.088 00:02:04 -- setup/common.sh@18 -- # local node=1 00:03:39.088 00:02:04 -- setup/common.sh@19 -- # local var val 00:03:39.088 00:02:04 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.088 00:02:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.088 00:02:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:39.088 00:02:04 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:39.088 00:02:04 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.088 00:02:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) 
}") 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.088 00:02:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19310152 kB' 'MemUsed: 8339208 kB' 'SwapCached: 100 kB' 'Active: 2305444 kB' 'Inactive: 2616140 kB' 'Active(anon): 2175528 kB' 'Inactive(anon): 2310120 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4589136 kB' 'Mapped: 57224 kB' 'AnonPages: 332572 kB' 'Shmem: 4153100 kB' 'KernelStack: 11288 kB' 'PageTables: 4300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 183124 kB' 'Slab: 703568 kB' 'SReclaimable: 183124 kB' 'SUnreclaim: 520444 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # continue 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.088 00:02:04 -- setup/common.sh@32 -- # continue 
00:03:39.088 00:02:04 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.088 00:02:04 -- setup/common.sh@31 -- # read -r var val _ 
[xtrace of the get_meminfo field tests elided: each /proc/meminfo key, Active through HugePages_Free, is compared against HugePages_Surp and skipped with `continue`] 
00:03:39.089 00:02:04 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.089 00:02:04 -- setup/common.sh@33 -- # echo 0 00:03:39.089 00:02:04 -- setup/common.sh@33 -- # return 0 00:03:39.089 00:02:04 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:39.089 00:02:04 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:39.089 00:02:04 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:39.089 00:02:04 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:39.089 00:02:04 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:39.089 node0=512 expecting 512 00:03:39.089 00:02:04 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:39.089 00:02:04 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 
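The xtrace above is setup/common.sh's get_meminfo helper walking /proc/meminfo line by line: split each line on `': '`, skip every key that isn't the one requested, and print its value (0 if the key never appears). A minimal sketch of that loop, assuming a hypothetical function name and inline sample data rather than the live /proc/meminfo:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo-style parsing seen in the trace above.
# get_meminfo_value KEY reads meminfo-formatted lines on stdin and
# prints the numeric value of KEY, or 0 if KEY is absent.
get_meminfo_value() {
	local get=$1 var val _
	while IFS=': ' read -r var val _; do
		[[ $var == "$get" ]] || continue   # skip non-matching keys
		echo "$val"
		return 0
	done
	echo 0   # key not found
}

sample='MemTotal: 60283796 kB
HugePages_Total: 1024
HugePages_Free: 1024
HugePages_Surp: 0'

get_meminfo_value HugePages_Surp <<< "$sample"   # prints 0
```

The `_` in `read -r var val _` soaks up the trailing `kB` unit, which is why the trace only ever carries a bare number per key.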
00:03:39.089 00:02:04 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:39.089 00:02:04 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:39.089 node1=512 expecting 512 00:03:39.089 00:02:04 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:39.089 00:03:39.089 real 0m3.185s 00:03:39.089 user 0m1.139s 00:03:39.089 sys 0m1.999s 00:03:39.089 00:02:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:39.089 00:02:04 -- common/autotest_common.sh@10 -- # set +x 00:03:39.089 ************************************ 00:03:39.089 END TEST per_node_1G_alloc 00:03:39.089 ************************************ 00:03:39.089 00:02:04 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:39.089 00:02:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:39.089 00:02:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:39.089 00:02:04 -- common/autotest_common.sh@10 -- # set +x 00:03:39.089 ************************************ 00:03:39.089 START TEST even_2G_alloc 00:03:39.089 ************************************ 00:03:39.089 00:02:04 -- common/autotest_common.sh@1114 -- # even_2G_alloc 00:03:39.089 00:02:04 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:39.089 00:02:04 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:39.089 00:02:04 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:39.089 00:02:04 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:39.089 00:02:04 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:39.089 00:02:04 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:39.089 00:02:04 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:39.089 00:02:04 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:39.089 00:02:04 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:39.089 00:02:04 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:39.089 00:02:04 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:39.089 00:02:04 -- 
setup/hugepages.sh@67 -- # local -g nodes_test 00:03:39.089 00:02:04 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:39.089 00:02:04 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:39.089 00:02:04 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:39.089 00:02:04 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:39.089 00:02:04 -- setup/hugepages.sh@83 -- # : 512 00:03:39.089 00:02:04 -- setup/hugepages.sh@84 -- # : 1 00:03:39.089 00:02:04 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:39.089 00:02:04 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:39.089 00:02:04 -- setup/hugepages.sh@83 -- # : 0 00:03:39.089 00:02:04 -- setup/hugepages.sh@84 -- # : 0 00:03:39.089 00:02:04 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:39.089 00:02:04 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:39.089 00:02:04 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:39.089 00:02:04 -- setup/hugepages.sh@153 -- # setup output 00:03:39.089 00:02:04 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:39.089 00:02:04 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:42.386 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:42.386 
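The even_2G_alloc setup traced above (get_test_nr_hugepages 2097152, then two `nodes_test[_no_nodes - 1]=512` assignments) splits a 2 GiB request evenly across both NUMA nodes. A sketch of that arithmetic, with variable names mirroring the trace but written as a standalone illustration, not the script itself:

```shell
#!/usr/bin/env bash
# 2097152 kB requested / 2048 kB default hugepage size = 1024 pages,
# divided evenly across 2 NUMA nodes as in the even_2G_alloc trace.
size=2097152            # kB, the get_test_nr_hugepages argument
default_hugepages=2048  # kB, typical 2 MiB hugepage
nr_hugepages=$(( size / default_hugepages ))   # 1024
_no_nodes=2
per_node=$(( nr_hugepages / _no_nodes ))       # 512
nodes_test=()
for (( node = _no_nodes - 1; node >= 0; node-- )); do
	nodes_test[node]=$per_node
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=512 node1=512
```

This is why verify_nr_hugepages later checks `node0=512 expecting 512` and `node1=512 expecting 512` against a HugePages_Total of 1024.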
0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:42.386 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:42.386 00:02:07 -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:42.386 00:02:07 -- setup/hugepages.sh@89 -- # local node 00:03:42.386 00:02:07 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:42.386 00:02:07 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:42.386 00:02:07 -- setup/hugepages.sh@92 -- # local surp 00:03:42.386 00:02:07 -- setup/hugepages.sh@93 -- # local resv 00:03:42.386 00:02:07 -- setup/hugepages.sh@94 -- # local anon 00:03:42.386 00:02:07 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:42.386 00:02:07 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:42.386 00:02:07 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:42.386 00:02:07 -- setup/common.sh@18 -- # local node= 00:03:42.386 00:02:07 -- setup/common.sh@19 -- # local var val 00:03:42.386 00:02:07 -- setup/common.sh@20 -- # local mem_f mem 00:03:42.386 00:02:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.386 00:02:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.386 00:02:07 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.386 00:02:07 -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.386 00:02:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.386 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.386 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.386 00:02:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43607732 kB' 'MemAvailable: 45244348 kB' 'Buffers: 6816 kB' 'Cached: 9255592 kB' 
'SwapCached: 248 kB' 'Active: 6704696 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797016 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597176 kB' 'Mapped: 126008 kB' 'Shmem: 7513504 kB' 'KReclaimable: 584512 kB' 'Slab: 1590440 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005928 kB' 'KernelStack: 21904 kB' 'PageTables: 8300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10043592 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:42.386 00:02:07 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.386 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.386 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.386 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.386 00:02:07 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.386 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.386 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.386 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.386 00:02:07 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.386 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.386 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.386 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 
[xtrace of the get_meminfo field tests elided: each /proc/meminfo key, Buffers through Percpu, is compared against AnonHugePages and skipped with `continue`] 
00:03:42.650 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 
00:03:42.650 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.650 00:02:07 -- setup/common.sh@33 -- # echo 0 00:03:42.650 00:02:07 -- setup/common.sh@33 -- # return 0 00:03:42.650 00:02:07 -- setup/hugepages.sh@97 -- # anon=0 00:03:42.650 00:02:07 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:42.650 00:02:07 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.650 00:02:07 -- setup/common.sh@18 -- # local node= 00:03:42.650 00:02:07 -- setup/common.sh@19 -- # local var val 00:03:42.650 00:02:07 -- setup/common.sh@20 -- # local mem_f mem 00:03:42.650 00:02:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.650 00:02:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.650 00:02:07 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.650 00:02:07 -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.650 00:02:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.650 00:02:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43611052 kB' 'MemAvailable: 45247668 kB' 'Buffers: 6816 kB' 'Cached: 9255596 kB' 'SwapCached: 248 kB' 'Active: 6704892 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797212 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597432 kB' 'Mapped: 
126072 kB' 'Shmem: 7513508 kB' 'KReclaimable: 584512 kB' 'Slab: 1590464 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005952 kB' 'KernelStack: 21904 kB' 'PageTables: 8332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10043604 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.650 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # [[ Cached == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.650 00:02:07 -- setup/common.sh@32 -- # continue 
[xtrace of the remaining get_meminfo field tests elided: each later /proc/meminfo key, SwapCached through HardwareCorrupted, is compared against HugePages_Surp and skipped with `continue`] 
00:03:42.651 00:02:07 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 
00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.651 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.651 00:02:07 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.652 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.652 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.652 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.652 00:02:07 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.652 00:02:07 -- setup/common.sh@32 -- # continue 00:03:42.652 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.652 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.652 00:02:07 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.652 00:02:07 -- setup/common.sh@33 -- # echo 0 00:03:42.652 00:02:07 -- setup/common.sh@33 -- # return 0 00:03:42.652 00:02:07 -- setup/hugepages.sh@99 -- # surp=0 00:03:42.652 00:02:07 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:42.652 00:02:07 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:42.652 00:02:07 -- setup/common.sh@18 -- # local node= 00:03:42.652 00:02:07 -- setup/common.sh@19 -- # local var val 00:03:42.652 00:02:07 -- setup/common.sh@20 -- # local mem_f mem 00:03:42.652 00:02:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.652 00:02:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.652 
00:02:07 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.652 00:02:07 -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.652 00:02:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.652 00:02:07 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.652 00:02:07 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.652 00:02:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43611172 kB' 'MemAvailable: 45247788 kB' 'Buffers: 6816 kB' 'Cached: 9255680 kB' 'SwapCached: 248 kB' 'Active: 6704952 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797272 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597432 kB' 'Mapped: 126072 kB' 'Shmem: 7513592 kB' 'KReclaimable: 584512 kB' 'Slab: 1590464 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005952 kB' 'KernelStack: 21904 kB' 'PageTables: 8332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10043692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:42.652 ... (per-key scan of /proc/meminfo for HugePages_Rsvd; non-matching keys elided) ... 00:03:42.653 00:02:08 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.653 00:02:08 -- setup/common.sh@33 -- # echo 0 00:03:42.653 00:02:08 -- setup/common.sh@33 -- # return 0 00:03:42.653 00:02:08 -- setup/hugepages.sh@100
-- # resv=0 00:03:42.653 00:02:08 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:42.653 nr_hugepages=1024 00:03:42.653 00:02:08 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:42.653 resv_hugepages=0 00:03:42.653 00:02:08 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:42.653 surplus_hugepages=0 00:03:42.653 00:02:08 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:42.653 anon_hugepages=0 00:03:42.653 00:02:08 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:42.653 00:02:08 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:42.653 00:02:08 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:42.653 00:02:08 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:42.653 00:02:08 -- setup/common.sh@18 -- # local node= 00:03:42.653 00:02:08 -- setup/common.sh@19 -- # local var val 00:03:42.653 00:02:08 -- setup/common.sh@20 -- # local mem_f mem 00:03:42.653 00:02:08 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.653 00:02:08 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.653 00:02:08 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.653 00:02:08 -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.653 00:02:08 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.653 00:02:08 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43611504 kB' 'MemAvailable: 45248120 kB' 'Buffers: 6816 kB' 'Cached: 9255704 kB' 'SwapCached: 248 kB' 'Active: 6704640 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796960 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597060 kB' 'Mapped: 126072 kB' 'Shmem: 7513616 kB' 'KReclaimable: 584512 kB' 'Slab: 1590464 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005952 kB' 
'KernelStack: 21888 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10043708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:42.653 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.653 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.653 00:02:08 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] ... (per-key scan of /proc/meminfo for HugePages_Total continues; non-matching keys elided) ...
setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- 
setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.654 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.654 00:02:08 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.654 00:02:08 -- setup/common.sh@33 -- # echo 1024 00:03:42.654 00:02:08 -- setup/common.sh@33 -- # return 0 00:03:42.654 00:02:08 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:42.654 00:02:08 -- setup/hugepages.sh@112 -- # get_nodes 00:03:42.654 00:02:08 -- setup/hugepages.sh@27 -- # local node 00:03:42.654 00:02:08 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:42.654 00:02:08 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:42.654 00:02:08 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:42.655 00:02:08 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:42.655 00:02:08 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:42.655 00:02:08 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:42.655 00:02:08 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:42.655 00:02:08 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:42.655 00:02:08 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:42.655 00:02:08 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.655 00:02:08 -- setup/common.sh@18 -- # local node=0 00:03:42.655 00:02:08 -- setup/common.sh@19 -- # local var val 00:03:42.655 00:02:08 -- setup/common.sh@20 -- # local mem_f mem 00:03:42.655 00:02:08 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.655 00:02:08 -- 
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:42.655 00:02:08 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:42.655 00:02:08 -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.655 00:02:08 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24279668 kB' 'MemUsed: 8354768 kB' 'SwapCached: 148 kB' 'Active: 4400740 kB' 'Inactive: 535724 kB' 'Active(anon): 3622976 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4673584 kB' 'Mapped: 69196 kB' 'AnonPages: 265504 kB' 'Shmem: 3360468 kB' 'KernelStack: 10728 kB' 'PageTables: 4336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 401388 kB' 'Slab: 886716 kB' 'SReclaimable: 401388 kB' 'SUnreclaim: 485328 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- 
setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # 
continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 
00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.655 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.655 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 
-- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@33 -- # echo 0 00:03:42.656 00:02:08 -- setup/common.sh@33 -- # return 0 00:03:42.656 00:02:08 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:42.656 00:02:08 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:42.656 00:02:08 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:42.656 00:02:08 -- setup/hugepages.sh@117 -- # get_meminfo 
HugePages_Surp 1 00:03:42.656 00:02:08 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.656 00:02:08 -- setup/common.sh@18 -- # local node=1 00:03:42.656 00:02:08 -- setup/common.sh@19 -- # local var val 00:03:42.656 00:02:08 -- setup/common.sh@20 -- # local mem_f mem 00:03:42.656 00:02:08 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.656 00:02:08 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:42.656 00:02:08 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:42.656 00:02:08 -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.656 00:02:08 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19332040 kB' 'MemUsed: 8317320 kB' 'SwapCached: 100 kB' 'Active: 2304312 kB' 'Inactive: 2616140 kB' 'Active(anon): 2174396 kB' 'Inactive(anon): 2310120 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4589184 kB' 'Mapped: 56876 kB' 'AnonPages: 331468 kB' 'Shmem: 4153148 kB' 'KernelStack: 11160 kB' 'PageTables: 3948 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 183124 kB' 'Slab: 703748 kB' 'SReclaimable: 183124 kB' 'SUnreclaim: 520624 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- 
setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ 
Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.656 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.656 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- 
setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # continue 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.657 00:02:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.657 00:02:08 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.657 00:02:08 -- 
setup/common.sh@33 -- # echo 0 00:03:42.657 00:02:08 -- setup/common.sh@33 -- # return 0 00:03:42.657 00:02:08 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:42.657 00:02:08 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:42.657 00:02:08 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:42.657 00:02:08 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:42.657 00:02:08 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:42.657 node0=512 expecting 512 00:03:42.657 00:02:08 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:42.657 00:02:08 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:42.657 00:02:08 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:42.657 00:02:08 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:42.657 node1=512 expecting 512 00:03:42.657 00:02:08 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:42.657 00:03:42.657 real 0m3.603s 00:03:42.657 user 0m1.371s 00:03:42.657 sys 0m2.301s 00:03:42.657 00:02:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:42.657 00:02:08 -- common/autotest_common.sh@10 -- # set +x 00:03:42.657 ************************************ 00:03:42.657 END TEST even_2G_alloc 00:03:42.657 ************************************ 00:03:42.657 00:02:08 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:42.657 00:02:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:42.657 00:02:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:42.657 00:02:08 -- common/autotest_common.sh@10 -- # set +x 00:03:42.657 ************************************ 00:03:42.657 START TEST odd_alloc 00:03:42.657 ************************************ 00:03:42.657 00:02:08 -- common/autotest_common.sh@1114 -- # odd_alloc 00:03:42.657 00:02:08 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:42.657 00:02:08 -- setup/hugepages.sh@49 -- # local 
size=2098176 00:03:42.657 00:02:08 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:42.657 00:02:08 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:42.657 00:02:08 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:42.657 00:02:08 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:42.657 00:02:08 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:42.657 00:02:08 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:42.657 00:02:08 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:42.657 00:02:08 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:42.657 00:02:08 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:42.657 00:02:08 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:42.657 00:02:08 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:42.657 00:02:08 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:42.657 00:02:08 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.657 00:02:08 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:42.657 00:02:08 -- setup/hugepages.sh@83 -- # : 513 00:03:42.657 00:02:08 -- setup/hugepages.sh@84 -- # : 1 00:03:42.657 00:02:08 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.657 00:02:08 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:42.657 00:02:08 -- setup/hugepages.sh@83 -- # : 0 00:03:42.657 00:02:08 -- setup/hugepages.sh@84 -- # : 0 00:03:42.657 00:02:08 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.657 00:02:08 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:42.657 00:02:08 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:42.657 00:02:08 -- setup/hugepages.sh@160 -- # setup output 00:03:42.657 00:02:08 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:42.657 00:02:08 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:45.953 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:00:04.6 (8086 2021): Already using the 
vfio-pci driver 00:03:45.953 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:45.953 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:45.953 00:02:11 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:45.953 00:02:11 -- setup/hugepages.sh@89 -- # local node 00:03:45.953 00:02:11 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:45.953 00:02:11 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:45.953 00:02:11 -- setup/hugepages.sh@92 -- # local surp 00:03:45.953 00:02:11 -- setup/hugepages.sh@93 -- # local resv 00:03:45.953 00:02:11 -- setup/hugepages.sh@94 -- # local anon 00:03:45.953 00:02:11 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:45.953 00:02:11 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:45.953 00:02:11 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:45.953 00:02:11 -- setup/common.sh@18 -- # local node= 00:03:45.953 00:02:11 -- setup/common.sh@19 -- # local var val 00:03:45.953 00:02:11 -- setup/common.sh@20 -- # 
local mem_f mem 00:03:45.953 00:02:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.953 00:02:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.953 00:02:11 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.953 00:02:11 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.953 00:02:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.953 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.953 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.953 00:02:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43655420 kB' 'MemAvailable: 45292036 kB' 'Buffers: 6816 kB' 'Cached: 9255900 kB' 'SwapCached: 248 kB' 'Active: 6703972 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796292 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595600 kB' 'Mapped: 126112 kB' 'Shmem: 7513812 kB' 'KReclaimable: 584512 kB' 'Slab: 1589432 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1004920 kB' 'KernelStack: 21888 kB' 'PageTables: 8292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10044424 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:45.953 00:02:11 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.953 00:02:11 -- setup/common.sh@32 -- # 
continue 00:03:45.953 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.953 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.953 00:02:11 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.953 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:45.954 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.954 00:02:11 -- setup/common.sh@33 -- # echo 0 00:03:45.954 00:02:11 -- setup/common.sh@33 -- # return 0 00:03:45.954 00:02:11 -- setup/hugepages.sh@97 -- # anon=0 00:03:45.954 00:02:11 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:45.954 00:02:11 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.954 00:02:11 -- setup/common.sh@18 -- # local node= 00:03:45.954 00:02:11 -- setup/common.sh@19 -- # local var val 00:03:45.954 00:02:11 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.954 00:02:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.954 00:02:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.954 00:02:11 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.954 00:02:11 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.954 00:02:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.954 00:02:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43657124 kB' 'MemAvailable: 45293740 kB' 'Buffers: 6816 kB' 'Cached: 9255908 kB' 'SwapCached: 248 kB' 'Active: 6704156 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796476 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595844 kB' 'Mapped: 126156 kB' 'Shmem: 7513820 kB' 'KReclaimable: 584512 kB' 'Slab: 1589468 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1004956 kB' 'KernelStack: 21856 kB' 'PageTables: 8244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10044436 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.954 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.954 00:02:11 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.954 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.956 00:02:11 -- setup/common.sh@33
-- # echo 0 00:03:45.956 00:02:11 -- setup/common.sh@33 -- # return 0 00:03:45.956 00:02:11 -- setup/hugepages.sh@99 -- # surp=0 00:03:45.956 00:02:11 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:45.956 00:02:11 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:45.956 00:02:11 -- setup/common.sh@18 -- # local node= 00:03:45.956 00:02:11 -- setup/common.sh@19 -- # local var val 00:03:45.956 00:02:11 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.956 00:02:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.956 00:02:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.956 00:02:11 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.956 00:02:11 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.956 00:02:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.956 00:02:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43656756 kB' 'MemAvailable: 45293372 kB' 'Buffers: 6816 kB' 'Cached: 9255920 kB' 'SwapCached: 248 kB' 'Active: 6703792 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796112 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595956 kB' 'Mapped: 126080 kB' 'Shmem: 7513832 kB' 'KReclaimable: 584512 kB' 'Slab: 1589436 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1004924 kB' 'KernelStack: 21888 kB' 'PageTables: 8328 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10044448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 
'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 
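The trace above records setup/common.sh walking /proc/meminfo one line at a time: split each line on `': '` into a key and a value, `continue` past every key that is not the one requested, then echo the matching value (or 0). A standalone sketch of that pattern, with an illustrative function name rather than the exact setup/common.sh source, looks like:

```shell
# Sketch of the scan visible in the trace (function name is hypothetical):
# look up one key in a meminfo-style file and print its numeric value,
# defaulting to 0 when the key is absent.
get_meminfo_value() {
  local get=$1 file=${2:-/proc/meminfo}
  local var val _
  while IFS=': ' read -r var val _; do
    # Skip non-matching keys, exactly as the long run of
    # "continue" branches in the trace does.
    [[ $var == "$get" ]] || continue
    echo "${val:-0}"
    return 0
  done < "$file"
  echo 0
}
```

The per-node variant seen later in the log only swaps the file argument for /sys/devices/system/node/node0/meminfo; the scan itself is unchanged.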
00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.956 00:02:11 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:45.956 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.956 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- 
setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- 
setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- 
# continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.957 00:02:11 -- setup/common.sh@33 -- # echo 0 00:03:45.957 00:02:11 -- setup/common.sh@33 -- # return 0 00:03:45.957 00:02:11 -- setup/hugepages.sh@100 -- # resv=0 00:03:45.957 00:02:11 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:45.957 nr_hugepages=1025 00:03:45.957 00:02:11 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:45.957 resv_hugepages=0 00:03:45.957 00:02:11 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:45.957 surplus_hugepages=0 00:03:45.957 00:02:11 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:45.957 anon_hugepages=0 00:03:45.957 00:02:11 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:45.957 00:02:11 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:45.957 00:02:11 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:45.957 00:02:11 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:45.957 00:02:11 -- setup/common.sh@18 -- # local node= 00:03:45.957 00:02:11 -- setup/common.sh@19 -- # local var val 00:03:45.957 00:02:11 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.957 00:02:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.957 00:02:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.957 00:02:11 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.957 00:02:11 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.957 00:02:11 -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:03:45.957 00:02:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43656756 kB' 'MemAvailable: 45293372 kB' 'Buffers: 6816 kB' 'Cached: 9255936 kB' 'SwapCached: 248 kB' 'Active: 6703388 kB' 'Inactive: 3151864 kB' 'Active(anon): 5795708 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595520 kB' 'Mapped: 126080 kB' 'Shmem: 7513848 kB' 'KReclaimable: 584512 kB' 'Slab: 1589436 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1004924 kB' 'KernelStack: 21872 kB' 'PageTables: 8280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10044464 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 
00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.957 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.957 00:02:11 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 
00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 
00:02:11 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- 
setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.958 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.958 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- 
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.959 00:02:11 -- setup/common.sh@33 -- # echo 1025 00:03:45.959 00:02:11 -- setup/common.sh@33 -- # return 0 00:03:45.959 00:02:11 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:45.959 00:02:11 -- setup/hugepages.sh@112 -- # get_nodes 00:03:45.959 00:02:11 -- setup/hugepages.sh@27 -- # local node 00:03:45.959 00:02:11 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:45.959 00:02:11 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:45.959 00:02:11 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:45.959 00:02:11 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:45.959 00:02:11 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:45.959 00:02:11 
-- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:45.959 00:02:11 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:45.959 00:02:11 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:45.959 00:02:11 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:45.959 00:02:11 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.959 00:02:11 -- setup/common.sh@18 -- # local node=0 00:03:45.959 00:02:11 -- setup/common.sh@19 -- # local var val 00:03:45.959 00:02:11 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.959 00:02:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.959 00:02:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:45.959 00:02:11 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:45.959 00:02:11 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.959 00:02:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24307440 kB' 'MemUsed: 8326996 kB' 'SwapCached: 148 kB' 'Active: 4398848 kB' 'Inactive: 535724 kB' 'Active(anon): 3621084 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4673784 kB' 'Mapped: 69204 kB' 'AnonPages: 263992 kB' 'Shmem: 3360668 kB' 'KernelStack: 10744 kB' 'PageTables: 4380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 401388 kB' 'Slab: 886036 kB' 'SReclaimable: 401388 kB' 'SUnreclaim: 484648 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:45.959 00:02:11 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- 
setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.959 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.959 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 
00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 
00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@33 -- # echo 0 00:03:45.960 00:02:11 -- setup/common.sh@33 -- # return 0 00:03:45.960 00:02:11 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:45.960 00:02:11 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:45.960 00:02:11 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:45.960 00:02:11 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:45.960 00:02:11 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.960 00:02:11 -- setup/common.sh@18 -- # local node=1 00:03:45.960 00:02:11 -- setup/common.sh@19 -- # local var val 00:03:45.960 00:02:11 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.960 00:02:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.960 00:02:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:45.960 00:02:11 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:45.960 00:02:11 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.960 00:02:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19349668 kB' 'MemUsed: 8299692 kB' 'SwapCached: 100 kB' 'Active: 2304584 kB' 'Inactive: 2616140 kB' 'Active(anon): 2174668 kB' 'Inactive(anon): 2310120 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4589228 kB' 'Mapped: 56876 kB' 'AnonPages: 331560 kB' 'Shmem: 4153192 kB' 'KernelStack: 11128 kB' 'PageTables: 3900 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 183124 kB' 'Slab: 703400 kB' 'SReclaimable: 183124 kB' 'SUnreclaim: 520276 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- 
# read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 
00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 
-- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.960 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.960 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Total == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue
00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': '
00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _
00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:45.961 00:02:11 -- setup/common.sh@32 -- # continue
00:03:45.961 00:02:11 -- setup/common.sh@31 -- # IFS=': '
00:03:45.961 00:02:11 -- setup/common.sh@31 -- # read -r var val _
00:03:45.961 00:02:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:45.961 00:02:11 -- setup/common.sh@33 -- # echo 0
00:03:45.961 00:02:11 -- setup/common.sh@33 -- # return 0
00:03:45.961 00:02:11 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:45.961 00:02:11 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:45.961 00:02:11 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:45.961 00:02:11 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:45.961 00:02:11 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:03:45.961 node0=512 expecting 513
00:03:45.961 00:02:11 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:45.961 00:02:11 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:45.961 00:02:11 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:45.961 00:02:11 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:03:45.961 node1=513 expecting 512
00:03:45.961 00:02:11 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:03:45.961
00:03:45.961 real 0m3.337s
00:03:45.961 user 0m1.207s
00:03:45.961 sys 0m2.154s
00:03:45.961 00:02:11 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:45.961 00:02:11 -- common/autotest_common.sh@10 -- # set +x
00:03:45.961 ************************************
00:03:45.961 END TEST odd_alloc
00:03:45.961 ************************************
00:03:46.219
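The odd_alloc pass that ends above reduces to two checks: the global `HugePages_Total` (1025) must equal the sum of the per-node counts plus surplus and reserved pages, and the sorted per-node counts must match the expected odd split. A sketch with the values from this run — hardcoded here, not read from a live system; on real hardware they would come from `/proc/meminfo` and the per-node sysfs files:

```shell
#!/usr/bin/env bash
# Values taken from the run above (node0=512, node1=513, total=1025).
total=1025 surp=0 resv=0
nodes_test=(512 513)   # node0, node1
# hugepages.sh@110: global total must account for every node's pages
(( total == nodes_test[0] + nodes_test[1] + surp + resv )) || exit 1
# hugepages.sh@130 compares the sorted per-node counts as one string
sorted=$(printf '%s\n' "${nodes_test[@]}" | sort -n | xargs)
[[ $sorted == "512 513" ]] && echo "odd_alloc OK"
```

Sorting before comparing is what lets the test pass even though the kernel may place the odd extra page on either node — the log above shows node0 holding 512 while expecting 513, and vice versa for node1, which is why only the sorted sets are required to match.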
00:02:11 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:46.219 00:02:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:46.219 00:02:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:46.219 00:02:11 -- common/autotest_common.sh@10 -- # set +x 00:03:46.219 ************************************ 00:03:46.219 START TEST custom_alloc 00:03:46.219 ************************************ 00:03:46.219 00:02:11 -- common/autotest_common.sh@1114 -- # custom_alloc 00:03:46.219 00:02:11 -- setup/hugepages.sh@167 -- # local IFS=, 00:03:46.219 00:02:11 -- setup/hugepages.sh@169 -- # local node 00:03:46.219 00:02:11 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:46.219 00:02:11 -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:46.219 00:02:11 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:46.219 00:02:11 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:46.219 00:02:11 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:46.219 00:02:11 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:46.219 00:02:11 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:46.219 00:02:11 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:46.219 00:02:11 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:46.219 00:02:11 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:46.219 00:02:11 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:46.219 00:02:11 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:46.219 00:02:11 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:46.219 00:02:11 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@82 -- # 
nodes_test[_no_nodes - 1]=256 00:03:46.219 00:02:11 -- setup/hugepages.sh@83 -- # : 256 00:03:46.219 00:02:11 -- setup/hugepages.sh@84 -- # : 1 00:03:46.219 00:02:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:46.219 00:02:11 -- setup/hugepages.sh@83 -- # : 0 00:03:46.219 00:02:11 -- setup/hugepages.sh@84 -- # : 0 00:03:46.219 00:02:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:46.219 00:02:11 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:46.219 00:02:11 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:46.219 00:02:11 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:46.219 00:02:11 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:46.219 00:02:11 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:46.219 00:02:11 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:46.219 00:02:11 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:46.219 00:02:11 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:46.219 00:02:11 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:46.219 00:02:11 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:46.219 00:02:11 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:46.219 00:02:11 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:46.219 00:02:11 -- setup/hugepages.sh@78 -- # return 0 00:03:46.219 00:02:11 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:46.219 00:02:11 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 
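The two `get_test_nr_hugepages` calls in the trace (sizes 1048576 and 2097152) convert a size in kB into a page count by dividing by the default hugepage size — 2048 kB on this machine, per the `Hugepagesize: 2048 kB` line reported elsewhere in this log. A sketch of just that arithmetic (an assumption about the script's internals; the real helper also distributes the resulting count across NUMA nodes):

```shell
#!/usr/bin/env bash
# default_hugepages mirrors the 2048 kB Hugepagesize reported in this log
default_hugepages=2048   # kB
for size in 1048576 2097152; do
    echo "size=${size} kB -> nr_hugepages=$(( size / default_hugepages ))"
done
# -> nr_hugepages=512 for 1 GiB, 1024 for 2 GiB, matching the trace
```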
00:03:46.219 00:02:11 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:46.219 00:02:11 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:46.219 00:02:11 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:46.219 00:02:11 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:46.219 00:02:11 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:46.219 00:02:11 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:46.219 00:02:11 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:46.219 00:02:11 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:46.219 00:02:11 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:46.219 00:02:11 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:46.219 00:02:11 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:46.219 00:02:11 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:46.219 00:02:11 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:46.219 00:02:11 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:46.219 00:02:11 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:46.219 00:02:11 -- setup/hugepages.sh@78 -- # return 0 00:03:46.219 00:02:11 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:46.219 00:02:11 -- setup/hugepages.sh@187 -- # setup output 00:03:46.219 00:02:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:46.219 00:02:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:49.511 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:49.511 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 
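The `HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'` string handed to `setup output` above is built by joining each `nodes_hp[]` entry with commas. A standalone sketch of that assembly, using the values from this run:

```shell
#!/usr/bin/env bash
# nodes_hp values match hugepages.sh@175/@178 in the trace above
nodes_hp=(512 1024)
parts=()
for node in "${!nodes_hp[@]}"; do
    parts+=("nodes_hp[$node]=${nodes_hp[node]}")
done
# join with commas, as hugepages.sh@187 does with a local IFS=,
HUGENODE=$(IFS=,; printf '%s' "${parts[*]}")
echo "$HUGENODE"   # nodes_hp[0]=512,nodes_hp[1]=1024
```

Setting `IFS` inside the command substitution keeps the comma-joining local: `"${parts[*]}"` expands with the first character of `IFS` between elements, and the caller's `IFS` is untouched.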
00:03:49.511 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:49.511 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:49.511 00:02:14 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:03:49.511 00:02:14 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:03:49.511 00:02:14 -- setup/hugepages.sh@89 -- # local node
00:03:49.511 00:02:14 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:49.511 00:02:14 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:49.511 00:02:14 -- setup/hugepages.sh@92 -- # local surp
00:03:49.511 00:02:14 -- setup/hugepages.sh@93 -- # local resv
00:03:49.511 00:02:14 -- setup/hugepages.sh@94 -- # local anon
00:03:49.511 00:02:14 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:49.511 00:02:14 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:49.511 00:02:14 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:49.511 00:02:14 -- setup/common.sh@18 -- # local node=
00:03:49.511 00:02:14 -- setup/common.sh@19 -- # local var
val 00:03:49.511 00:02:14 -- setup/common.sh@20 -- # local mem_f mem 00:03:49.511 00:02:14 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.511 00:02:14 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.511 00:02:14 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.511 00:02:14 -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.511 00:02:14 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.511 00:02:14 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42596988 kB' 'MemAvailable: 44233604 kB' 'Buffers: 6816 kB' 'Cached: 9256044 kB' 'SwapCached: 248 kB' 'Active: 6705588 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797908 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597020 kB' 'Mapped: 126108 kB' 'Shmem: 7513956 kB' 'KReclaimable: 584512 kB' 'Slab: 1589464 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1004952 kB' 'KernelStack: 21920 kB' 'PageTables: 8372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10045212 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.511 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.511 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 
00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 
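The `get_meminfo` helper being traced here reads /proc/meminfo (or a per-node copy under /sys/devices/system/node/) into an array, strips any `Node N ` prefix, and scans field by field until the requested key matches; that scan is why the log shows one `continue` per non-matching field. A simplified reconstruction, assuming bash with extglob; the `MEM_F` override is an added hook for exercising it against a fixture file and is not in the original script:

```shell
#!/usr/bin/env bash
shopt -s extglob  # needed for the "Node +([0-9]) " prefix strip below

get_meminfo() {
    local get=$1 node=${2:-}
    local var val
    local mem_f=${MEM_F:-/proc/meminfo}   # MEM_F: test-only override
    local -a mem
    # Per-node statistics live under sysfs when a node is requested.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    # Scan "field: value [kB]" lines until the requested field matches.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}
```

For example, `get_meminfo HugePages_Total` prints the current hugepage count, which is what the surrounding verification pass compares against the requested total.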
00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- 
setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.512 00:02:14 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.512 00:02:14 -- setup/common.sh@33 -- # echo 0 00:03:49.512 00:02:14 -- setup/common.sh@33 -- # return 0 00:03:49.512 00:02:14 -- setup/hugepages.sh@97 -- # anon=0 00:03:49.512 00:02:14 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:49.512 00:02:14 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:49.512 00:02:14 -- setup/common.sh@18 -- # local node= 00:03:49.512 00:02:14 -- setup/common.sh@19 -- # local var val 00:03:49.512 00:02:14 -- setup/common.sh@20 -- # local mem_f mem 00:03:49.512 00:02:14 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.512 00:02:14 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.512 00:02:14 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.512 00:02:14 -- setup/common.sh@28 -- # mapfile -t 
mem 00:03:49.512 00:02:14 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.512 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42600068 kB' 'MemAvailable: 44236684 kB' 'Buffers: 6816 kB' 'Cached: 9256048 kB' 'SwapCached: 248 kB' 'Active: 6704564 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796884 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596528 kB' 'Mapped: 126088 kB' 'Shmem: 7513960 kB' 'KReclaimable: 584512 kB' 'Slab: 1589480 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1004968 kB' 'KernelStack: 21904 kB' 'PageTables: 8316 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10045224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 
00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 
00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 
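The `verify_nr_hugepages` pass running here ultimately checks that the `HugePages_*` counters read back from meminfo match the 1536 pages just requested, with no surplus pages. A hedged sketch of that check; the function name and file parameter are illustrative, not the exact setup/hugepages.sh code:

```shell
#!/usr/bin/env bash
# check_hugepages FILE EXPECTED
# Succeeds when the meminfo-format FILE reports EXPECTED total
# hugepages and zero surplus pages, mirroring the verification
# traced in the log above.
check_hugepages() {
    local meminfo=$1 expected=$2
    local total surp
    total=$(awk '/^HugePages_Total:/ {print $2}' "$meminfo")
    surp=$(awk '/^HugePages_Surp:/ {print $2}' "$meminfo")
    [ "$total" = "$expected" ] && [ "$surp" = 0 ]
}
```

On the test node above, `check_hugepages /proc/meminfo 1536` would pass once setup.sh has applied the HUGENODE configuration.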
00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.513 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # [[ 
VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.513 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ FilePmdMapped 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ HugePages_Surp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.514 00:02:14 -- setup/common.sh@33 -- # echo 0 00:03:49.514 00:02:14 -- setup/common.sh@33 -- # return 0 00:03:49.514 00:02:14 -- setup/hugepages.sh@99 -- # surp=0 00:03:49.514 00:02:14 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:49.514 00:02:14 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:49.514 00:02:14 -- setup/common.sh@18 -- # local node= 00:03:49.514 00:02:14 -- setup/common.sh@19 -- # local var val 00:03:49.514 00:02:14 -- setup/common.sh@20 -- # local mem_f mem 00:03:49.514 00:02:14 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.514 00:02:14 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.514 00:02:14 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.514 00:02:14 -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.514 00:02:14 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42600804 kB' 'MemAvailable: 44237420 kB' 'Buffers: 6816 kB' 'Cached: 9256048 kB' 'SwapCached: 248 kB' 'Active: 6704600 kB' 'Inactive: 3151864 kB' 'Active(anon): 5796920 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596588 kB' 'Mapped: 126088 kB' 'Shmem: 7513960 kB' 'KReclaimable: 584512 kB' 'Slab: 1589480 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1004968 kB' 'KernelStack: 21920 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10045240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 
-- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 
00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.514 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.514 00:02:14 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- 
setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ HugePages_Total == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.515 00:02:14 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.515 00:02:14 -- setup/common.sh@33 -- # echo 0 00:03:49.515 00:02:14 -- setup/common.sh@33 -- # return 0 00:03:49.515 00:02:14 -- setup/hugepages.sh@100 -- # resv=0 00:03:49.515 00:02:14 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:49.515 nr_hugepages=1536 00:03:49.515 00:02:14 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:49.515 resv_hugepages=0 00:03:49.515 00:02:14 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:49.515 surplus_hugepages=0 00:03:49.515 00:02:14 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:49.515 anon_hugepages=0 00:03:49.515 00:02:14 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:49.515 00:02:14 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:49.515 00:02:14 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:49.515 00:02:14 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:49.515 00:02:14 -- setup/common.sh@18 -- # local node= 00:03:49.515 00:02:14 -- setup/common.sh@19 -- # local var val 00:03:49.515 00:02:14 -- setup/common.sh@20 -- # local mem_f mem 00:03:49.515 00:02:14 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.515 00:02:14 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.515 00:02:14 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.515 00:02:14 -- setup/common.sh@28 -- # 
mapfile -t mem 00:03:49.515 00:02:14 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.515 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42601756 kB' 'MemAvailable: 44238372 kB' 'Buffers: 6816 kB' 'Cached: 9256048 kB' 'SwapCached: 248 kB' 'Active: 6705200 kB' 'Inactive: 3151864 kB' 'Active(anon): 5797520 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597084 kB' 'Mapped: 126088 kB' 'Shmem: 7513960 kB' 'KReclaimable: 584512 kB' 'Slab: 1589472 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1004960 kB' 'KernelStack: 22000 kB' 'PageTables: 8612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10048428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 
00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 
00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 
00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.516 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.516 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.517 00:02:14 -- setup/common.sh@33 -- # echo 1536 00:03:49.517 00:02:14 -- setup/common.sh@33 -- # return 0 00:03:49.517 00:02:14 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:49.517 00:02:14 -- setup/hugepages.sh@112 -- # get_nodes 00:03:49.517 00:02:14 -- setup/hugepages.sh@27 -- # local node 00:03:49.517 00:02:14 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:49.517 00:02:14 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:49.517 00:02:14 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:49.517 00:02:14 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:49.517 
00:02:14 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:49.517 00:02:14 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:49.517 00:02:14 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:49.517 00:02:14 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:49.517 00:02:14 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:49.517 00:02:14 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:49.517 00:02:14 -- setup/common.sh@18 -- # local node=0 00:03:49.517 00:02:14 -- setup/common.sh@19 -- # local var val 00:03:49.517 00:02:14 -- setup/common.sh@20 -- # local mem_f mem 00:03:49.517 00:02:14 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.517 00:02:14 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:49.517 00:02:14 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:49.517 00:02:14 -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.517 00:02:14 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24306984 kB' 'MemUsed: 8327452 kB' 'SwapCached: 148 kB' 'Active: 4399660 kB' 'Inactive: 535724 kB' 'Active(anon): 3621896 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4673828 kB' 'Mapped: 69212 kB' 'AnonPages: 264820 kB' 'Shmem: 3360712 kB' 'KernelStack: 10776 kB' 'PageTables: 4452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 401388 kB' 'Slab: 886096 kB' 'SReclaimable: 401388 kB' 'SUnreclaim: 484708 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 
'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 
00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- 
setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.517 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.517 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ NFS_Unstable 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:14 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:14 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@33 -- # echo 0 00:03:49.518 00:02:15 -- setup/common.sh@33 -- # return 0 00:03:49.518 00:02:15 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:49.518 00:02:15 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:49.518 00:02:15 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:49.518 00:02:15 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:49.518 00:02:15 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:49.518 00:02:15 -- setup/common.sh@18 -- # local node=1 00:03:49.518 00:02:15 -- setup/common.sh@19 -- # local var val 00:03:49.518 00:02:15 -- setup/common.sh@20 -- # local mem_f mem 00:03:49.518 00:02:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.518 00:02:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:49.518 00:02:15 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:49.518 00:02:15 -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.518 00:02:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18298780 kB' 'MemUsed: 9350580 kB' 'SwapCached: 100 kB' 'Active: 2305712 kB' 'Inactive: 2616140 kB' 'Active(anon): 2175796 kB' 'Inactive(anon): 2310120 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4589340 kB' 'Mapped: 56876 kB' 'AnonPages: 332644 kB' 'Shmem: 4153304 kB' 
'KernelStack: 11144 kB' 'PageTables: 3940 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 183124 kB' 'Slab: 703368 kB' 'SReclaimable: 183124 kB' 'SUnreclaim: 520244 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.518 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:49.518 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r 
var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 
00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # continue 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:49.519 00:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:49.519 00:02:15 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.519 00:02:15 -- setup/common.sh@33 -- # echo 0 00:03:49.519 00:02:15 -- setup/common.sh@33 -- # return 0 00:03:49.519 00:02:15 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:49.519 00:02:15 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:49.519 00:02:15 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:49.519 00:02:15 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:49.519 00:02:15 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:49.519 node0=512 expecting 512 00:03:49.519 00:02:15 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:49.519 00:02:15 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:49.519 00:02:15 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:49.519 00:02:15 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:49.519 node1=1024 expecting 1024 00:03:49.519 00:02:15 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:49.519 00:03:49.519 real 0m3.486s 00:03:49.519 user 0m1.302s 00:03:49.519 sys 0m2.236s 00:03:49.519 00:02:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:49.519 00:02:15 -- common/autotest_common.sh@10 -- # set +x 00:03:49.519 ************************************ 00:03:49.519 END 
TEST custom_alloc 00:03:49.519 ************************************ 00:03:49.778 00:02:15 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:49.778 00:02:15 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:49.778 00:02:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:49.778 00:02:15 -- common/autotest_common.sh@10 -- # set +x 00:03:49.778 ************************************ 00:03:49.778 START TEST no_shrink_alloc 00:03:49.778 ************************************ 00:03:49.778 00:02:15 -- common/autotest_common.sh@1114 -- # no_shrink_alloc 00:03:49.778 00:02:15 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:49.778 00:02:15 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:49.778 00:02:15 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:49.778 00:02:15 -- setup/hugepages.sh@51 -- # shift 00:03:49.778 00:02:15 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:49.778 00:02:15 -- setup/hugepages.sh@52 -- # local node_ids 00:03:49.778 00:02:15 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:49.778 00:02:15 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:49.778 00:02:15 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:49.778 00:02:15 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:49.778 00:02:15 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:49.778 00:02:15 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:49.778 00:02:15 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:49.778 00:02:15 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:49.778 00:02:15 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:49.778 00:02:15 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:49.778 00:02:15 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:49.778 00:02:15 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:49.778 00:02:15 -- setup/hugepages.sh@73 -- # return 0 00:03:49.778 00:02:15 -- 
setup/hugepages.sh@198 -- # setup output 00:03:49.778 00:02:15 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:49.778 00:02:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:52.313 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:52.313 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:52.314 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:52.314 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:52.314 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:52.314 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:52.575 00:02:17 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:52.576 00:02:17 -- setup/hugepages.sh@89 -- # local node 00:03:52.576 00:02:17 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:52.576 00:02:17 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:52.576 00:02:17 -- setup/hugepages.sh@92 -- # local surp 00:03:52.576 00:02:17 -- setup/hugepages.sh@93 -- # local resv 00:03:52.576 00:02:17 -- setup/hugepages.sh@94 -- # local anon 00:03:52.576 00:02:17 -- setup/hugepages.sh@96 -- # [[ always 
[madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:52.576 00:02:17 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:52.576 00:02:17 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:52.576 00:02:17 -- setup/common.sh@18 -- # local node= 00:03:52.576 00:02:17 -- setup/common.sh@19 -- # local var val 00:03:52.576 00:02:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:52.576 00:02:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.576 00:02:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.576 00:02:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.576 00:02:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.576 00:02:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.576 00:02:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43653908 kB' 'MemAvailable: 45290524 kB' 'Buffers: 6816 kB' 'Cached: 9256172 kB' 'SwapCached: 248 kB' 'Active: 6706180 kB' 'Inactive: 3151864 kB' 'Active(anon): 5798500 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598200 kB' 'Mapped: 126168 kB' 'Shmem: 7514084 kB' 'KReclaimable: 584512 kB' 'Slab: 1589580 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005068 kB' 'KernelStack: 21920 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10045864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 
kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # continue 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # continue 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # continue 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # continue 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # continue 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # continue 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.576 00:02:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.576 00:02:17 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.576 00:02:17 -- 
setup/common.sh@32 -- # continue [... the scan repeats identically for each remaining /proc/meminfo field until AnonHugePages matches ...] 00:03:52.577 00:02:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:52.577 00:02:17 -- setup/common.sh@33 -- # echo 0 00:03:52.577 00:02:17 -- setup/common.sh@33 -- # return 0 00:03:52.577 00:02:17 -- setup/hugepages.sh@97 -- # anon=0 00:03:52.577 00:02:17 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:52.577 00:02:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:52.577 00:02:17 -- setup/common.sh@18 -- # local node= 00:03:52.577 00:02:17 -- setup/common.sh@19 -- # local var val 00:03:52.577 00:02:17 -- setup/common.sh@20 -- # local mem_f mem
00:03:52.577 00:02:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.577 00:02:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.577 00:02:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.577 00:02:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.577 00:02:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.577 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.577 00:02:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.577 00:02:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43653912 kB' 'MemAvailable: 45290528 kB' 'Buffers: 6816 kB' 'Cached: 9256180 kB' 'SwapCached: 248 kB' 'Active: 6707396 kB' 'Inactive: 3151864 kB' 'Active(anon): 5799716 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599860 kB' 'Mapped: 126668 kB' 'Shmem: 7514092 kB' 'KReclaimable: 584512 kB' 'Slab: 1589564 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005052 kB' 'KernelStack: 21904 kB' 'PageTables: 8308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10048428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:52.577 00:02:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.577 00:02:17 -- setup/common.sh@32 -- # continue 
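[editor's note] The trace entries around this point show the parsing pattern `setup/common.sh` uses: read `/proc/meminfo` one line at a time with `IFS=': '`, compare each key against the requested field, and echo the matching value. A minimal sketch of that pattern (not the SPDK helper itself; the optional second argument and the synthetic snippet are illustrative assumptions):

```shell
#!/bin/bash
# Sketch of the meminfo scan seen in the trace: split "Key: value kB" on
# ': ', echo the value for the requested key, return 0 on match, 1 otherwise.
get_meminfo() {
    local get=$1 src=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # The "kB" unit, when present, lands in _ and is discarded.
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$src"
    return 1
}

# Demo against a synthetic snippet (values taken from the dump above).
tmp=$(mktemp)
printf '%s\n' 'MemTotal: 60283796 kB' 'HugePages_Total: 1024' 'HugePages_Surp: 0' > "$tmp"
get_meminfo HugePages_Total "$tmp"   # prints 1024
rm -f "$tmp"
```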
00:03:52.577 00:02:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.577 00:02:17 -- setup/common.sh@31 -- # read -r var val _ [... the scan repeats identically for each /proc/meminfo field until HugePages_Surp matches ...] 00:03:52.578 00:02:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.578 00:02:17 -- setup/common.sh@33 -- # echo 0 00:03:52.578 00:02:17 -- setup/common.sh@33 -- # return 0 00:03:52.578 00:02:17 -- setup/hugepages.sh@99 -- # surp=0 00:03:52.578 00:02:17 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:52.578 00:02:17 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:52.578 00:02:17 -- setup/common.sh@18 -- # local node= 00:03:52.578 00:02:17 -- setup/common.sh@19 -- # local var val 00:03:52.578 00:02:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:52.578 00:02:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.578 00:02:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.578 00:02:18 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.578 00:02:18 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.578 00:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.578 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.578 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.578 00:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43648672 kB' 'MemAvailable: 45285288 kB' 'Buffers: 6816 kB' 'Cached: 9256192 kB' 'SwapCached: 248 kB' 'Active: 6706428 kB' 'Inactive: 3151864 kB' 'Active(anon): 5798748 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598368 kB' 'Mapped: 126084 kB' 'Shmem: 7514104 kB' 'KReclaimable: 584512 kB' 'Slab: 1589548 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005036 kB' 
'KernelStack: 21952 kB' 'PageTables: 8428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10048560 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:52.578 00:02:18 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.578 00:02:18 -- setup/common.sh@32 -- # continue [... the scan repeats identically for each /proc/meminfo field ...] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.580 00:02:18 -- setup/common.sh@33 -- # echo 0 00:03:52.580 00:02:18 -- setup/common.sh@33 -- # return 0 00:03:52.580 00:02:18 -- setup/hugepages.sh@100 -- # resv=0 00:03:52.580 00:02:18 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:52.580 nr_hugepages=1024 00:03:52.580 00:02:18 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:52.580 resv_hugepages=0 00:03:52.580 00:02:18 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:52.580 surplus_hugepages=0 00:03:52.580 00:02:18 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:52.580 anon_hugepages=0 00:03:52.580 00:02:18 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:52.580 00:02:18 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:52.580 00:02:18 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:52.580 00:02:18 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:52.580 00:02:18 -- setup/common.sh@18 -- # local node= 00:03:52.580 00:02:18 -- setup/common.sh@19 -- # local var val 00:03:52.580 00:02:18 -- setup/common.sh@20 -- # local 
mem_f mem 00:03:52.580 00:02:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.580 00:02:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.580 00:02:18 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.580 00:02:18 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.580 00:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43642824 kB' 'MemAvailable: 45279440 kB' 'Buffers: 6816 kB' 'Cached: 9256208 kB' 'SwapCached: 248 kB' 'Active: 6712624 kB' 'Inactive: 3151864 kB' 'Active(anon): 5804944 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604780 kB' 'Mapped: 127092 kB' 'Shmem: 7514120 kB' 'KReclaimable: 584512 kB' 'Slab: 1589548 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005036 kB' 'KernelStack: 22000 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10057616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218088 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # 
continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 
-- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.580 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.580 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 
00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.581 00:02:18 -- setup/common.sh@33 -- # echo 1024 00:03:52.581 00:02:18 -- setup/common.sh@33 -- # return 0 00:03:52.581 00:02:18 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:52.581 00:02:18 -- setup/hugepages.sh@112 -- # get_nodes 00:03:52.581 00:02:18 -- setup/hugepages.sh@27 -- # local node 00:03:52.581 00:02:18 -- setup/hugepages.sh@29 -- # for node in 
/sys/devices/system/node/node+([0-9]) 00:03:52.581 00:02:18 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:52.581 00:02:18 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:52.581 00:02:18 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:52.581 00:02:18 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:52.581 00:02:18 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:52.581 00:02:18 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:52.581 00:02:18 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:52.581 00:02:18 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:52.581 00:02:18 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:52.581 00:02:18 -- setup/common.sh@18 -- # local node=0 00:03:52.581 00:02:18 -- setup/common.sh@19 -- # local var val 00:03:52.581 00:02:18 -- setup/common.sh@20 -- # local mem_f mem 00:03:52.581 00:02:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.581 00:02:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:52.581 00:02:18 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:52.581 00:02:18 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.581 00:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23246408 kB' 'MemUsed: 9388028 kB' 'SwapCached: 148 kB' 'Active: 4400452 kB' 'Inactive: 535724 kB' 'Active(anon): 3622688 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4673856 kB' 'Mapped: 69208 kB' 'AnonPages: 265476 kB' 'Shmem: 3360740 kB' 'KernelStack: 10776 kB' 'PageTables: 4436 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 401388 kB' 'Slab: 886056 kB' 'SReclaimable: 401388 kB' 'SUnreclaim: 484668 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.581 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.581 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 
-- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ 
SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ HugePages_Total 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # continue 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.582 00:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.582 00:02:18 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.582 00:02:18 -- setup/common.sh@33 -- # echo 0 00:03:52.582 00:02:18 -- setup/common.sh@33 -- # return 0 00:03:52.582 00:02:18 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:52.582 00:02:18 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:52.582 00:02:18 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:52.582 00:02:18 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:52.582 00:02:18 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:52.582 node0=1024 expecting 1024 00:03:52.582 00:02:18 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:52.582 00:02:18 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:52.582 00:02:18 -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:52.582 00:02:18 -- setup/hugepages.sh@202 -- # setup output 00:03:52.582 00:02:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:52.582 00:02:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:55.871 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 
00:03:55.871 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:55.871 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:55.871 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:56.133 00:02:21 -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:56.133 00:02:21 -- setup/hugepages.sh@89 -- # local node 00:03:56.133 00:02:21 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:56.133 00:02:21 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:56.133 00:02:21 -- setup/hugepages.sh@92 -- # local surp 00:03:56.133 00:02:21 -- setup/hugepages.sh@93 -- # local resv 00:03:56.133 00:02:21 -- setup/hugepages.sh@94 -- # local anon 00:03:56.133 00:02:21 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:56.133 00:02:21 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:56.133 00:02:21 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:56.134 00:02:21 -- setup/common.sh@18 -- # local node= 00:03:56.134 00:02:21 -- setup/common.sh@19 -- # local var val 00:03:56.134 00:02:21 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.134 00:02:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.134 00:02:21 -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node/meminfo ]] 00:03:56.134 00:02:21 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.134 00:02:21 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.134 00:02:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43636036 kB' 'MemAvailable: 45272652 kB' 'Buffers: 6816 kB' 'Cached: 9256292 kB' 'SwapCached: 248 kB' 'Active: 6712844 kB' 'Inactive: 3151864 kB' 'Active(anon): 5805164 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 605244 kB' 'Mapped: 126652 kB' 'Shmem: 7514204 kB' 'KReclaimable: 584512 kB' 'Slab: 1589624 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005112 kB' 'KernelStack: 22160 kB' 'PageTables: 8356 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10057176 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218248 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 
00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ 
Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ SwapFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- 
setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 
00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.134 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.134 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.135 00:02:21 -- setup/common.sh@33 -- # echo 0 00:03:56.135 00:02:21 -- setup/common.sh@33 -- # return 0 00:03:56.135 00:02:21 -- setup/hugepages.sh@97 -- # anon=0 00:03:56.135 00:02:21 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:56.135 00:02:21 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.135 00:02:21 -- setup/common.sh@18 -- # local node= 00:03:56.135 00:02:21 -- setup/common.sh@19 -- # local var val 00:03:56.135 00:02:21 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.135 00:02:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.135 00:02:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.135 00:02:21 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.135 00:02:21 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.135 00:02:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 
00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43636292 kB' 'MemAvailable: 45272908 kB' 'Buffers: 6816 kB' 'Cached: 9256296 kB' 'SwapCached: 248 kB' 'Active: 6707248 kB' 'Inactive: 3151864 kB' 'Active(anon): 5799568 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599152 kB' 'Mapped: 126636 kB' 'Shmem: 7514208 kB' 'KReclaimable: 584512 kB' 'Slab: 1589800 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005288 kB' 'KernelStack: 22112 kB' 'PageTables: 8924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10051068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- 
# [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # 
continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.135 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.135 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 
-- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 
00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.136 00:02:21 -- setup/common.sh@33 -- # echo 0 00:03:56.136 00:02:21 -- setup/common.sh@33 -- # return 0 00:03:56.136 00:02:21 
-- setup/hugepages.sh@99 -- # surp=0 00:03:56.136 00:02:21 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:56.136 00:02:21 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:56.136 00:02:21 -- setup/common.sh@18 -- # local node= 00:03:56.136 00:02:21 -- setup/common.sh@19 -- # local var val 00:03:56.136 00:02:21 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.136 00:02:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.136 00:02:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.136 00:02:21 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.136 00:02:21 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.136 00:02:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.136 00:02:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43634988 kB' 'MemAvailable: 45271604 kB' 'Buffers: 6816 kB' 'Cached: 9256308 kB' 'SwapCached: 248 kB' 'Active: 6707556 kB' 'Inactive: 3151864 kB' 'Active(anon): 5799876 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599512 kB' 'Mapped: 126100 kB' 'Shmem: 7514220 kB' 'KReclaimable: 584512 kB' 'Slab: 1589720 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005208 kB' 'KernelStack: 22080 kB' 'PageTables: 8560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10051080 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 
'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.136 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.136 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- 
setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 
00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': 
' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.137 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.137 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.137 00:02:21 -- setup/common.sh@31 -- 
# read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.138 00:02:21 -- setup/common.sh@33 -- # echo 0 00:03:56.138 00:02:21 -- setup/common.sh@33 -- # return 0 00:03:56.138 00:02:21 -- setup/hugepages.sh@100 -- # resv=0 00:03:56.138 00:02:21 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:56.138 nr_hugepages=1024 00:03:56.138 00:02:21 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:56.138 resv_hugepages=0 00:03:56.138 00:02:21 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:56.138 surplus_hugepages=0 00:03:56.138 00:02:21 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:56.138 anon_hugepages=0 00:03:56.138 00:02:21 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:56.138 00:02:21 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:56.138 00:02:21 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:56.138 00:02:21 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:56.138 00:02:21 -- setup/common.sh@18 -- # local node= 00:03:56.138 00:02:21 -- setup/common.sh@19 -- # local var val 00:03:56.138 00:02:21 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.138 00:02:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.138 00:02:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.138 00:02:21 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.138 00:02:21 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.138 00:02:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43634104 kB' 'MemAvailable: 45270720 kB' 'Buffers: 6816 kB' 'Cached: 9256324 kB' 'SwapCached: 248 kB' 'Active: 6706796 kB' 'Inactive: 3151864 kB' 'Active(anon): 5799116 kB' 'Inactive(anon): 2310640 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598640 kB' 'Mapped: 126100 kB' 'Shmem: 7514236 kB' 'KReclaimable: 584512 kB' 'Slab: 1589688 kB' 'SReclaimable: 584512 kB' 'SUnreclaim: 1005176 kB' 'KernelStack: 21936 kB' 'PageTables: 8368 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10051096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 120064 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ 
MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Zswap == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.138 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.138 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 
00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- 
setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 
-- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.139 00:02:21 -- setup/common.sh@33 -- # echo 1024 00:03:56.139 00:02:21 -- setup/common.sh@33 -- # return 0 00:03:56.139 00:02:21 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:56.139 00:02:21 -- setup/hugepages.sh@112 -- # get_nodes 00:03:56.139 00:02:21 -- setup/hugepages.sh@27 -- # local node 00:03:56.139 00:02:21 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.139 00:02:21 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:56.139 00:02:21 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.139 00:02:21 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:56.139 00:02:21 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:56.139 00:02:21 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:56.139 00:02:21 -- setup/hugepages.sh@115 -- 
# for node in "${!nodes_test[@]}" 00:03:56.139 00:02:21 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:56.139 00:02:21 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:56.139 00:02:21 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.139 00:02:21 -- setup/common.sh@18 -- # local node=0 00:03:56.139 00:02:21 -- setup/common.sh@19 -- # local var val 00:03:56.139 00:02:21 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.139 00:02:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.139 00:02:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:56.139 00:02:21 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:56.139 00:02:21 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.139 00:02:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23235364 kB' 'MemUsed: 9399072 kB' 'SwapCached: 148 kB' 'Active: 4400156 kB' 'Inactive: 535724 kB' 'Active(anon): 3622392 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4673916 kB' 'Mapped: 69224 kB' 'AnonPages: 265348 kB' 'Shmem: 3360800 kB' 'KernelStack: 10808 kB' 'PageTables: 4488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 401388 kB' 'Slab: 885892 kB' 'SReclaimable: 401388 kB' 'SUnreclaim: 484504 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.139 00:02:21 -- 
setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.139 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.139 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 
00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 
00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # continue 00:03:56.140 00:02:21 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.140 00:02:21 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.140 00:02:21 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.140 00:02:21 -- setup/common.sh@33 -- # echo 0 00:03:56.140 00:02:21 -- setup/common.sh@33 -- # return 0 00:03:56.140 00:02:21 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:56.140 00:02:21 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:56.140 00:02:21 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:56.140 00:02:21 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:56.140 00:02:21 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:56.140 node0=1024 expecting 1024 00:03:56.140 00:02:21 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:56.140 00:03:56.140 real 0m6.556s 00:03:56.140 user 0m2.348s 00:03:56.140 sys 0m4.131s 00:03:56.140 00:02:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:56.140 00:02:21 -- common/autotest_common.sh@10 -- # set +x 00:03:56.140 ************************************ 00:03:56.140 END TEST no_shrink_alloc 00:03:56.140 ************************************ 00:03:56.140 00:02:21 -- setup/hugepages.sh@217 -- # clear_hp 00:03:56.140 00:02:21 -- setup/hugepages.sh@37 -- # local node hp 00:03:56.140 00:02:21 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:56.140 00:02:21 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:56.140 00:02:21 -- setup/hugepages.sh@41 -- # echo 0 00:03:56.140 00:02:21 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:56.140 00:02:21 -- setup/hugepages.sh@41 -- # echo 0 00:03:56.399 00:02:21 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:56.399 00:02:21 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:56.399 00:02:21 -- setup/hugepages.sh@41 -- # echo 0 
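[Editor's note] The long runs of `[[ Field == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] ... continue` above are xtrace from `setup/common.sh`'s `get_meminfo`, which strips the `Node N ` prefix from a per-node meminfo file and scans it field by field until the requested key matches, then echoes the value. A minimal standalone reconstruction of that loop (the file path is parameterized here so it can run against any meminfo-format file; this is a sketch inferred from the trace, not the exact SPDK source):

```shell
#!/usr/bin/env bash
# get_meminfo KEY [FILE] -- print the numeric value of KEY from a
# /proc/meminfo-style file; per-node files prefix each line with "Node N ".
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} line var val _
    shopt -s extglob                      # needed for the +([0-9]) pattern
    while read -r line; do
        line=${line#Node +([0-9]) }       # strip "Node 0 " etc., as the trace does
        IFS=': ' read -r var val _ <<<"$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    return 1
}
```

Usage mirrors the trace, e.g. `get_meminfo HugePages_Surp /sys/devices/system/node/node0/meminfo`.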
00:03:56.399 00:02:21 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:56.399 00:02:21 -- setup/hugepages.sh@41 -- # echo 0 00:03:56.399 00:02:21 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:56.399 00:02:21 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:56.399 00:03:56.399 real 0m25.846s 00:03:56.399 user 0m9.012s 00:03:56.399 sys 0m15.561s 00:03:56.399 00:02:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:56.399 00:02:21 -- common/autotest_common.sh@10 -- # set +x 00:03:56.399 ************************************ 00:03:56.399 END TEST hugepages 00:03:56.399 ************************************ 00:03:56.399 00:02:21 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:56.399 00:02:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:56.399 00:02:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:56.399 00:02:21 -- common/autotest_common.sh@10 -- # set +x 00:03:56.399 ************************************ 00:03:56.399 START TEST driver 00:03:56.399 ************************************ 00:03:56.399 00:02:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:56.399 * Looking for test storage... 
00:03:56.399 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:56.399 00:02:21 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:56.400 00:02:21 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:56.400 00:02:21 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:56.400 00:02:21 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:56.400 00:02:21 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:56.400 00:02:21 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:56.400 00:02:21 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:56.400 00:02:21 -- scripts/common.sh@335 -- # IFS=.-: 00:03:56.400 00:02:21 -- scripts/common.sh@335 -- # read -ra ver1 00:03:56.400 00:02:21 -- scripts/common.sh@336 -- # IFS=.-: 00:03:56.400 00:02:21 -- scripts/common.sh@336 -- # read -ra ver2 00:03:56.400 00:02:21 -- scripts/common.sh@337 -- # local 'op=<' 00:03:56.400 00:02:21 -- scripts/common.sh@339 -- # ver1_l=2 00:03:56.400 00:02:21 -- scripts/common.sh@340 -- # ver2_l=1 00:03:56.400 00:02:21 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:56.400 00:02:21 -- scripts/common.sh@343 -- # case "$op" in 00:03:56.400 00:02:21 -- scripts/common.sh@344 -- # : 1 00:03:56.400 00:02:21 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:56.400 00:02:21 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:56.400 00:02:21 -- scripts/common.sh@364 -- # decimal 1 00:03:56.400 00:02:21 -- scripts/common.sh@352 -- # local d=1 00:03:56.400 00:02:21 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:56.400 00:02:21 -- scripts/common.sh@354 -- # echo 1 00:03:56.400 00:02:21 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:56.400 00:02:21 -- scripts/common.sh@365 -- # decimal 2 00:03:56.400 00:02:21 -- scripts/common.sh@352 -- # local d=2 00:03:56.400 00:02:21 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:56.400 00:02:21 -- scripts/common.sh@354 -- # echo 2 00:03:56.400 00:02:21 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:56.400 00:02:21 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:56.400 00:02:21 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:56.400 00:02:21 -- scripts/common.sh@367 -- # return 0 00:03:56.400 00:02:21 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:56.400 00:02:21 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:56.400 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:56.400 --rc genhtml_branch_coverage=1 00:03:56.400 --rc genhtml_function_coverage=1 00:03:56.400 --rc genhtml_legend=1 00:03:56.400 --rc geninfo_all_blocks=1 00:03:56.400 --rc geninfo_unexecuted_blocks=1 00:03:56.400 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:56.400 ' 00:03:56.400 00:02:21 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:56.400 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:56.400 --rc genhtml_branch_coverage=1 00:03:56.400 --rc genhtml_function_coverage=1 00:03:56.400 --rc genhtml_legend=1 00:03:56.400 --rc geninfo_all_blocks=1 00:03:56.400 --rc geninfo_unexecuted_blocks=1 00:03:56.400 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:56.400 ' 00:03:56.400 00:02:21 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:56.400 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:56.400 --rc genhtml_branch_coverage=1 00:03:56.400 --rc genhtml_function_coverage=1 00:03:56.400 --rc genhtml_legend=1 00:03:56.400 --rc geninfo_all_blocks=1 00:03:56.400 --rc geninfo_unexecuted_blocks=1 00:03:56.400 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:56.400 ' 00:03:56.400 00:02:21 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:56.400 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:56.400 --rc genhtml_branch_coverage=1 00:03:56.400 --rc genhtml_function_coverage=1 00:03:56.400 --rc genhtml_legend=1 00:03:56.400 --rc geninfo_all_blocks=1 00:03:56.400 --rc geninfo_unexecuted_blocks=1 00:03:56.400 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:56.400 ' 00:03:56.400 00:02:21 -- setup/driver.sh@68 -- # setup reset 00:03:56.400 00:02:21 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:56.400 00:02:21 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:01.689 00:02:26 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:01.689 00:02:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:01.689 00:02:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:01.689 00:02:26 -- common/autotest_common.sh@10 -- # set +x 00:04:01.689 ************************************ 00:04:01.689 START TEST guess_driver 00:04:01.689 ************************************ 00:04:01.689 00:02:26 -- common/autotest_common.sh@1114 -- # guess_driver 00:04:01.689 00:02:26 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:01.689 00:02:26 -- setup/driver.sh@47 -- # local fail=0 00:04:01.689 00:02:26 -- setup/driver.sh@49 -- # pick_driver 00:04:01.689 00:02:26 -- setup/driver.sh@36 -- # vfio 00:04:01.689 00:02:26 -- setup/driver.sh@21 
-- # local iommu_grups 00:04:01.689 00:02:26 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:01.689 00:02:26 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:01.689 00:02:26 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:01.689 00:02:26 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:01.689 00:02:26 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:01.689 00:02:26 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:01.689 00:02:26 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:01.689 00:02:26 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:01.689 00:02:26 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:01.689 00:02:26 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:01.689 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:01.689 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:01.689 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:01.689 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:01.689 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:01.689 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:01.689 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:01.689 00:02:26 -- setup/driver.sh@30 -- # return 0 00:04:01.689 00:02:26 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:01.689 00:02:26 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:01.689 00:02:26 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:01.689 00:02:26 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:01.689 Looking for driver=vfio-pci 00:04:01.690 00:02:26 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 
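[Editor's note] The guess_driver trace above picks `vfio-pci` because 176 IOMMU groups exist and `modprobe --show-depends vfio_pci` resolves to real `.ko` modules. A hedged reconstruction of that decision, with the three inputs passed as parameters so the logic can run off-target (the real script reads them from `/sys/kernel/iommu_groups`, `/sys/module/vfio/parameters/enable_unsafe_noiommu_mode`, and modprobe; the fallback naming is simplified here):

```shell
#!/usr/bin/env bash
# pick_driver_for GROUPS UNSAFE_VFIO VFIO_AVAILABLE -- choose a userspace
# PCI driver the way the traced pick_driver/vfio path does: vfio-pci when
# IOMMU groups exist (or unsafe no-IOMMU mode is enabled) and the module
# resolves; otherwise fall back to uio_pci_generic.
pick_driver_for() {
    local iommu_groups=$1 unsafe_vfio=$2 vfio_pci_available=$3
    if [[ $iommu_groups -gt 0 || $unsafe_vfio == Y ]] \
        && [[ $vfio_pci_available == yes ]]; then
        echo vfio-pci
    else
        echo uio_pci_generic
    fi
}
```

With the values from this run (176 groups, `unsafe_vfio=N`, module present) this yields `vfio-pci`, matching the `Looking for driver=vfio-pci` line above.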
00:04:01.690 00:02:26 -- setup/driver.sh@45 -- # setup output config 00:04:01.690 00:02:26 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:01.690 00:02:26 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:04.981 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.981 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.981 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.981 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.981 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.981 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.981 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.981 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.981 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.981 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.981 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.981 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.981 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.981 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.981 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.981 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.981 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.981 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 
00:04:04.982 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:29 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:29 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:29 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:30 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:30 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:30 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:30 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:30 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:30 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:30 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:30 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:30 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.982 00:02:30 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.982 00:02:30 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.982 00:02:30 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:06.375 00:02:31 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:06.375 00:02:31 -- 
setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:06.375 00:02:31 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:06.375 00:02:31 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:06.375 00:02:31 -- setup/driver.sh@65 -- # setup reset 00:04:06.375 00:02:31 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:06.375 00:02:31 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:11.720 00:04:11.720 real 0m9.826s 00:04:11.720 user 0m2.513s 00:04:11.720 sys 0m5.025s 00:04:11.720 00:02:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:11.720 00:02:36 -- common/autotest_common.sh@10 -- # set +x 00:04:11.720 ************************************ 00:04:11.720 END TEST guess_driver 00:04:11.720 ************************************ 00:04:11.720 00:04:11.720 real 0m14.535s 00:04:11.720 user 0m3.807s 00:04:11.720 sys 0m7.629s 00:04:11.720 00:02:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:11.720 00:02:36 -- common/autotest_common.sh@10 -- # set +x 00:04:11.720 ************************************ 00:04:11.720 END TEST driver 00:04:11.720 ************************************ 00:04:11.720 00:02:36 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:11.720 00:02:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:11.720 00:02:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:11.720 00:02:36 -- common/autotest_common.sh@10 -- # set +x 00:04:11.720 ************************************ 00:04:11.720 START TEST devices 00:04:11.720 ************************************ 00:04:11.720 00:02:36 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:11.720 * Looking for test storage... 
00:04:11.720 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:11.720 00:02:36 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:11.720 00:02:36 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:11.720 00:02:36 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:11.720 00:02:36 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:11.720 00:02:36 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:11.720 00:02:36 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:11.720 00:02:36 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:11.720 00:02:36 -- scripts/common.sh@335 -- # IFS=.-: 00:04:11.720 00:02:36 -- scripts/common.sh@335 -- # read -ra ver1 00:04:11.720 00:02:36 -- scripts/common.sh@336 -- # IFS=.-: 00:04:11.720 00:02:36 -- scripts/common.sh@336 -- # read -ra ver2 00:04:11.720 00:02:36 -- scripts/common.sh@337 -- # local 'op=<' 00:04:11.720 00:02:36 -- scripts/common.sh@339 -- # ver1_l=2 00:04:11.720 00:02:36 -- scripts/common.sh@340 -- # ver2_l=1 00:04:11.720 00:02:36 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:11.720 00:02:36 -- scripts/common.sh@343 -- # case "$op" in 00:04:11.720 00:02:36 -- scripts/common.sh@344 -- # : 1 00:04:11.720 00:02:36 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:11.720 00:02:36 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:11.720 00:02:36 -- scripts/common.sh@364 -- # decimal 1 00:04:11.720 00:02:36 -- scripts/common.sh@352 -- # local d=1 00:04:11.720 00:02:36 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:11.720 00:02:36 -- scripts/common.sh@354 -- # echo 1 00:04:11.720 00:02:36 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:11.720 00:02:36 -- scripts/common.sh@365 -- # decimal 2 00:04:11.720 00:02:36 -- scripts/common.sh@352 -- # local d=2 00:04:11.720 00:02:36 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:11.720 00:02:36 -- scripts/common.sh@354 -- # echo 2 00:04:11.720 00:02:36 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:11.720 00:02:36 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:11.720 00:02:36 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:11.720 00:02:36 -- scripts/common.sh@367 -- # return 0 00:04:11.720 00:02:36 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:11.720 00:02:36 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:11.720 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:11.720 --rc genhtml_branch_coverage=1 00:04:11.720 --rc genhtml_function_coverage=1 00:04:11.720 --rc genhtml_legend=1 00:04:11.720 --rc geninfo_all_blocks=1 00:04:11.720 --rc geninfo_unexecuted_blocks=1 00:04:11.720 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:11.720 ' 00:04:11.720 00:02:36 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:11.720 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:11.720 --rc genhtml_branch_coverage=1 00:04:11.720 --rc genhtml_function_coverage=1 00:04:11.720 --rc genhtml_legend=1 00:04:11.720 --rc geninfo_all_blocks=1 00:04:11.720 --rc geninfo_unexecuted_blocks=1 00:04:11.720 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:11.720 ' 00:04:11.720 00:02:36 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:11.720 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:11.720 --rc genhtml_branch_coverage=1 00:04:11.720 --rc genhtml_function_coverage=1 00:04:11.720 --rc genhtml_legend=1 00:04:11.720 --rc geninfo_all_blocks=1 00:04:11.720 --rc geninfo_unexecuted_blocks=1 00:04:11.720 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:11.720 ' 00:04:11.720 00:02:36 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:11.720 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:11.720 --rc genhtml_branch_coverage=1 00:04:11.720 --rc genhtml_function_coverage=1 00:04:11.720 --rc genhtml_legend=1 00:04:11.720 --rc geninfo_all_blocks=1 00:04:11.720 --rc geninfo_unexecuted_blocks=1 00:04:11.720 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:11.720 ' 00:04:11.720 00:02:36 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:11.720 00:02:36 -- setup/devices.sh@192 -- # setup reset 00:04:11.720 00:02:36 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:11.720 00:02:36 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:15.070 00:02:40 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:15.070 00:02:40 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:04:15.070 00:02:40 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:04:15.070 00:02:40 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:04:15.070 00:02:40 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:04:15.070 00:02:40 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:04:15.070 00:02:40 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:04:15.070 00:02:40 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:15.070 00:02:40 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 
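[Editor's note] `get_zoned_devs`, traced just above, walks `/sys/block/nvme*` and reads each device's `queue/zoned` attribute, collecting anything that reports other than `none` (zoned devices are excluded from the generic mount tests; here `nvme0n1` reports `none`, so nothing is collected). A sketch that takes the sysfs root as a parameter so it can be exercised against a fake tree (the real code hardwires `/sys/block`):

```shell
#!/usr/bin/env bash
# list_zoned_devs SYSFS_BLOCK_DIR -- print the names of block devices whose
# queue/zoned attribute is anything other than "none".
list_zoned_devs() {
    local root=$1 dev zoned
    for dev in "$root"/*; do
        [[ -e $dev/queue/zoned ]] || continue
        read -r zoned < "$dev/queue/zoned"
        [[ $zoned != none ]] && echo "${dev##*/}"
    done
    return 0
}
```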
00:04:15.070 00:02:40 -- setup/devices.sh@196 -- # blocks=() 00:04:15.070 00:02:40 -- setup/devices.sh@196 -- # declare -a blocks 00:04:15.070 00:02:40 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:15.070 00:02:40 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:15.070 00:02:40 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:15.070 00:02:40 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:15.070 00:02:40 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:15.070 00:02:40 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:15.070 00:02:40 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:15.070 00:02:40 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:15.070 00:02:40 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:15.070 00:02:40 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:15.070 00:02:40 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:15.070 No valid GPT data, bailing 00:04:15.070 00:02:40 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:15.070 00:02:40 -- scripts/common.sh@393 -- # pt= 00:04:15.070 00:02:40 -- scripts/common.sh@394 -- # return 1 00:04:15.070 00:02:40 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:15.070 00:02:40 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:15.070 00:02:40 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:15.070 00:02:40 -- setup/common.sh@80 -- # echo 1600321314816 00:04:15.070 00:02:40 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:15.070 00:02:40 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:15.070 00:02:40 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:15.070 00:02:40 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:15.070 00:02:40 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:15.070 00:02:40 -- setup/devices.sh@213 -- # run_test 
nvme_mount nvme_mount 00:04:15.070 00:02:40 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:15.070 00:02:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:15.070 00:02:40 -- common/autotest_common.sh@10 -- # set +x 00:04:15.070 ************************************ 00:04:15.070 START TEST nvme_mount 00:04:15.070 ************************************ 00:04:15.070 00:02:40 -- common/autotest_common.sh@1114 -- # nvme_mount 00:04:15.070 00:02:40 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:15.070 00:02:40 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:15.070 00:02:40 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.070 00:02:40 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.070 00:02:40 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:15.070 00:02:40 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:15.070 00:02:40 -- setup/common.sh@40 -- # local part_no=1 00:04:15.070 00:02:40 -- setup/common.sh@41 -- # local size=1073741824 00:04:15.070 00:02:40 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:15.070 00:02:40 -- setup/common.sh@44 -- # parts=() 00:04:15.070 00:02:40 -- setup/common.sh@44 -- # local parts 00:04:15.070 00:02:40 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:15.070 00:02:40 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:15.070 00:02:40 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:15.070 00:02:40 -- setup/common.sh@46 -- # (( part++ )) 00:04:15.070 00:02:40 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:15.070 00:02:40 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:15.070 00:02:40 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:15.070 00:02:40 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition 
nvme0n1p1 00:04:16.014 Creating new GPT entries in memory. 00:04:16.014 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:16.014 other utilities. 00:04:16.015 00:02:41 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:16.015 00:02:41 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:16.015 00:02:41 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:16.015 00:02:41 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:16.015 00:02:41 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:17.396 Creating new GPT entries in memory. 00:04:17.396 The operation has completed successfully. 00:04:17.396 00:02:42 -- setup/common.sh@57 -- # (( part++ )) 00:04:17.396 00:02:42 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:17.396 00:02:42 -- setup/common.sh@62 -- # wait 2680064 00:04:17.396 00:02:42 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.396 00:02:42 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:17.396 00:02:42 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.396 00:02:42 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:17.396 00:02:42 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:17.396 00:02:42 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.396 00:02:42 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.396 00:02:42 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:17.396 00:02:42 -- 
setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:17.396 00:02:42 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.396 00:02:42 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.396 00:02:42 -- setup/devices.sh@53 -- # local found=0 00:04:17.396 00:02:42 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:17.396 00:02:42 -- setup/devices.sh@56 -- # : 00:04:17.396 00:02:42 -- setup/devices.sh@59 -- # local pci status 00:04:17.396 00:02:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.396 00:02:42 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:17.396 00:02:42 -- setup/devices.sh@47 -- # setup output config 00:04:17.396 00:02:42 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:17.396 00:02:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:20.680 00:02:45 -- setup/devices.sh@63 -- # found=1 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # 
read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 
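The long run of `[[ 0000:XX:04.N == \0\0\0\0\:\d\8\:\0\0\.\0 ]]` comparisons above is one loop: `verify` reads each line of `setup.sh config` output as `<bdf> _ _ <status>` and sets `found=1` only when the allowed BDF's status names the expected mount. A condensed sketch under that reading (the sample records below are illustrative, not copied from the real run):

```shell
#!/usr/bin/env bash
# Condensed sketch of the verify scan that produces the repeated PCI
# comparisons in the trace. Sample input records are hypothetical.
allowed=0000:d8:00.0
mounts=nvme0n1:nvme0n1p1
found=0
while read -r pci _ _ status; do
    # Only the allowed device's status line can flip found to 1.
    if [[ $pci == "$allowed" && $status == *"$mounts"* ]]; then
        found=1
    fi
done <<'EOF'
0000:00:04.0 a b idle
0000:80:04.7 a b idle
0000:d8:00.0 a b Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev
EOF
```

Every other BDF fails the first comparison, which is why the trace shows one `read -r pci _ _ status` per device with no further action.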
00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.680 00:02:45 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:20.680 00:02:45 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:20.680 00:02:45 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:20.680 00:02:45 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:20.680 00:02:45 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:20.680 00:02:45 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:20.680 00:02:45 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:20.680 00:02:45 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:20.680 00:02:45 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:20.680 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:20.680 00:02:45 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:20.680 00:02:45 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:20.939 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:20.939 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:20.939 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:20.939 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:20.939 00:02:46 -- setup/devices.sh@113 -- # mkfs 
/dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:20.939 00:02:46 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:20.939 00:02:46 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:20.939 00:02:46 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:20.939 00:02:46 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:20.939 00:02:46 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:20.939 00:02:46 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:20.939 00:02:46 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:20.939 00:02:46 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:20.939 00:02:46 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:20.939 00:02:46 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:20.939 00:02:46 -- setup/devices.sh@53 -- # local found=0 00:04:20.939 00:02:46 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:20.939 00:02:46 -- setup/devices.sh@56 -- # : 00:04:20.939 00:02:46 -- setup/devices.sh@59 -- # local pci status 00:04:20.939 00:02:46 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.939 00:02:46 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:20.939 00:02:46 -- setup/devices.sh@47 -- # setup output config 00:04:20.939 00:02:46 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:20.939 
00:02:46 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:24.228 00:02:49 -- setup/devices.sh@63 -- # found=1 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 
00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:24.228 00:02:49 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:24.228 00:02:49 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:24.228 00:02:49 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:24.228 00:02:49 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:24.228 00:02:49 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 
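The teardown traced here (remove the test file, unmount, then `wipefs` the partition and the whole disk as seen earlier at `cleanup_nvme`) can be sketched as one helper. This is a hedged reconstruction from the trace, and it is destructive by nature, so it is shown for illustration only:

```shell
#!/usr/bin/env bash
# Hedged sketch of the cleanup_nvme sequence visible in the trace:
# unmount the test mount point if mounted, then wipe filesystem/GPT
# signatures from the partition and the disk. Do NOT point this at a
# disk you care about.
cleanup_nvme() {
    local mnt=$1 disk=$2
    mountpoint -q "$mnt" && umount "$mnt"
    # wipefs only runs when the node actually exists as a block device.
    [[ -b ${disk}p1 ]] && wipefs --all "${disk}p1"
    [[ -b $disk ]] && wipefs --all "$disk"
    return 0
}
```

The `wipefs` output in the log (erasing the ext4 magic `53 ef`, the GPT signature `45 46 49 20 50 41 52 54`, and the PMBR `55 aa`) corresponds to the last two steps.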
00:04:24.228 00:02:49 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:24.228 00:02:49 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:24.228 00:02:49 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:24.228 00:02:49 -- setup/devices.sh@50 -- # local mount_point= 00:04:24.228 00:02:49 -- setup/devices.sh@51 -- # local test_file= 00:04:24.228 00:02:49 -- setup/devices.sh@53 -- # local found=0 00:04:24.228 00:02:49 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:24.228 00:02:49 -- setup/devices.sh@59 -- # local pci status 00:04:24.228 00:02:49 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.228 00:02:49 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:24.228 00:02:49 -- setup/devices.sh@47 -- # setup output config 00:04:24.228 00:02:49 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.228 00:02:49 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:27.516 00:02:52 -- setup/devices.sh@63 -- # found=1 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 
00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@62 -- # [[ 
0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.516 00:02:52 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:27.516 00:02:52 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:27.516 00:02:52 -- setup/devices.sh@68 -- # return 0 00:04:27.516 00:02:52 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:27.516 00:02:52 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:27.516 00:02:52 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:27.516 00:02:52 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:27.516 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:27.516 00:04:27.516 real 0m12.225s 00:04:27.516 user 0m3.450s 00:04:27.516 sys 0m6.651s 00:04:27.516 00:02:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:27.517 00:02:52 -- common/autotest_common.sh@10 -- # set +x 00:04:27.517 ************************************ 00:04:27.517 END TEST nvme_mount 00:04:27.517 ************************************ 00:04:27.517 00:02:52 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:27.517 00:02:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:27.517 00:02:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:27.517 00:02:52 -- common/autotest_common.sh@10 -- # set +x 00:04:27.517 ************************************ 00:04:27.517 START TEST dm_mount 00:04:27.517 ************************************ 00:04:27.517 00:02:52 -- common/autotest_common.sh@1114 -- # dm_mount 00:04:27.517 00:02:52 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:27.517 00:02:52 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:27.517 00:02:52 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:27.517 00:02:52 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:27.517 00:02:52 -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:04:27.517 00:02:52 -- setup/common.sh@40 -- # local part_no=2 00:04:27.517 00:02:52 -- setup/common.sh@41 -- # local size=1073741824 00:04:27.517 00:02:52 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:27.517 00:02:52 -- setup/common.sh@44 -- # parts=() 00:04:27.517 00:02:52 -- setup/common.sh@44 -- # local parts 00:04:27.517 00:02:52 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:27.517 00:02:52 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:27.517 00:02:52 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:27.517 00:02:52 -- setup/common.sh@46 -- # (( part++ )) 00:04:27.517 00:02:52 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:27.517 00:02:52 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:27.517 00:02:52 -- setup/common.sh@46 -- # (( part++ )) 00:04:27.517 00:02:52 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:27.517 00:02:52 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:27.517 00:02:52 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:27.517 00:02:52 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:28.459 Creating new GPT entries in memory. 00:04:28.459 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:28.459 other utilities. 00:04:28.459 00:02:53 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:28.459 00:02:53 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:28.459 00:02:53 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:28.459 00:02:53 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:28.459 00:02:53 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:29.398 Creating new GPT entries in memory. 00:04:29.398 The operation has completed successfully. 
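The sgdisk boundaries in this two-partition run follow directly from the arithmetic traced above in `setup/common.sh`: the 1 GiB size is converted to 512-byte sectors, the first partition starts at sector 2048, and each later partition starts one sector past the previous end. A standalone recomputation:

```shell
#!/usr/bin/env bash
# Recomputes the partition boundaries from the arithmetic in the trace:
# size /= 512 turns bytes into sectors; part 1 is anchored at sector 2048.
size=1073741824          # bytes (1 GiB)
(( size /= 512 ))        # -> 2097152 sectors
part_start=0 part_end=0
args=()
for part in 1 2; do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    args+=("--new=$part:$part_start:$part_end")
done
printf '%s\n' "${args[@]}"
# --new=1:2048:2099199
# --new=2:2099200:4196351
```

These are exactly the `--new=1:2048:2099199` and `--new=2:2099200:4196351` arguments passed to `flock /dev/nvme0n1 sgdisk` in the log.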
00:04:29.398 00:02:54 -- setup/common.sh@57 -- # (( part++ )) 00:04:29.398 00:02:54 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:29.398 00:02:54 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:29.398 00:02:54 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:29.398 00:02:54 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:30.336 The operation has completed successfully. 00:04:30.336 00:02:55 -- setup/common.sh@57 -- # (( part++ )) 00:04:30.336 00:02:55 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:30.336 00:02:55 -- setup/common.sh@62 -- # wait 2684571 00:04:30.595 00:02:55 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:30.595 00:02:55 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.595 00:02:55 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:30.596 00:02:55 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:30.596 00:02:55 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:30.596 00:02:55 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:30.596 00:02:55 -- setup/devices.sh@161 -- # break 00:04:30.596 00:02:55 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:30.596 00:02:55 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:30.596 00:02:55 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:30.596 00:02:55 -- setup/devices.sh@166 -- # dm=dm-0 00:04:30.596 00:02:55 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:30.596 00:02:55 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:30.596 00:02:55 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.596 
00:02:55 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:30.596 00:02:55 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.596 00:02:55 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:30.596 00:02:55 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:30.596 00:02:56 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.596 00:02:56 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:30.596 00:02:56 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:30.596 00:02:56 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:30.596 00:02:56 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.596 00:02:56 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:30.596 00:02:56 -- setup/devices.sh@53 -- # local found=0 00:04:30.596 00:02:56 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:30.596 00:02:56 -- setup/devices.sh@56 -- # : 00:04:30.596 00:02:56 -- setup/devices.sh@59 -- # local pci status 00:04:30.596 00:02:56 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.596 00:02:56 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:30.596 00:02:56 -- setup/devices.sh@47 -- # setup output config 00:04:30.596 00:02:56 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:30.596 00:02:56 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:33.888 00:02:59 -- setup/devices.sh@63 -- # found=1 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:33.888 00:02:59 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:33.888 00:02:59 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:33.888 00:02:59 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:33.888 00:02:59 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:33.888 00:02:59 -- setup/devices.sh@182 -- # umount 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:33.888 00:02:59 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:33.888 00:02:59 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:33.888 00:02:59 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:33.888 00:02:59 -- setup/devices.sh@50 -- # local mount_point= 00:04:33.888 00:02:59 -- setup/devices.sh@51 -- # local test_file= 00:04:33.888 00:02:59 -- setup/devices.sh@53 -- # local found=0 00:04:33.888 00:02:59 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:33.888 00:02:59 -- setup/devices.sh@59 -- # local pci status 00:04:33.888 00:02:59 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.888 00:02:59 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:33.888 00:02:59 -- setup/devices.sh@47 -- # setup output config 00:04:33.888 00:02:59 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.888 00:02:59 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:37.191 00:03:02 -- setup/devices.sh@63 -- # found=1 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # 
[[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.191 00:03:02 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:37.191 00:03:02 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:37.191 00:03:02 -- setup/devices.sh@68 -- # return 0 00:04:37.191 00:03:02 -- setup/devices.sh@187 -- # cleanup_dm 00:04:37.191 00:03:02 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:37.191 00:03:02 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:37.191 00:03:02 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:37.191 00:03:02 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:37.191 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:37.191 00:03:02 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:37.191 00:04:37.191 real 0m9.870s 00:04:37.191 user 0m2.362s 00:04:37.191 sys 0m4.541s 00:04:37.191 00:03:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:37.191 00:03:02 -- common/autotest_common.sh@10 -- # set +x 00:04:37.191 ************************************ 00:04:37.191 END TEST dm_mount 00:04:37.191 ************************************ 00:04:37.191 00:03:02 -- setup/devices.sh@1 -- # cleanup 00:04:37.191 00:03:02 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:37.191 00:03:02 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:37.191 00:03:02 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:37.191 
00:03:02 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:37.191 00:03:02 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:37.191 00:03:02 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:37.450 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:37.450 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:37.450 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:37.450 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:37.450 00:03:02 -- setup/devices.sh@12 -- # cleanup_dm 00:04:37.450 00:03:02 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:37.450 00:03:02 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:37.450 00:03:02 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:37.450 00:03:02 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:37.450 00:03:02 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:37.450 00:03:02 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:37.450 00:04:37.450 real 0m26.661s 00:04:37.450 user 0m7.342s 00:04:37.450 sys 0m14.169s 00:04:37.450 00:03:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:37.450 00:03:02 -- common/autotest_common.sh@10 -- # set +x 00:04:37.450 ************************************ 00:04:37.450 END TEST devices 00:04:37.450 ************************************ 00:04:37.710 00:04:37.710 real 1m31.080s 00:04:37.710 user 0m27.494s 00:04:37.710 sys 0m52.206s 00:04:37.710 00:03:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:37.710 00:03:03 -- common/autotest_common.sh@10 -- # set +x 00:04:37.710 ************************************ 00:04:37.710 END TEST setup.sh 00:04:37.710 ************************************ 00:04:37.710 00:03:03 -- spdk/autotest.sh@126 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:41.000 Hugepages 00:04:41.000 node hugesize free / total 00:04:41.000 node0 1048576kB 0 / 0 00:04:41.000 node0 2048kB 2048 / 2048 00:04:41.000 node1 1048576kB 0 / 0 00:04:41.000 node1 2048kB 0 / 0 00:04:41.000 00:04:41.000 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:41.000 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:41.000 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:41.000 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:41.000 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:41.000 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:41.000 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:41.000 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:41.000 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:41.000 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:41.000 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:41.000 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:41.000 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:41.000 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:41.000 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:41.000 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:41.000 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:41.000 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:41.000 00:03:06 -- spdk/autotest.sh@128 -- # uname -s 00:04:41.000 00:03:06 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:04:41.000 00:03:06 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:04:41.000 00:03:06 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:44.308 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:00:04.2 (8086 2021): ioatdma -> 
vfio-pci 00:04:44.308 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:44.308 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:46.214 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:46.214 00:03:11 -- common/autotest_common.sh@1527 -- # sleep 1 00:04:47.152 00:03:12 -- common/autotest_common.sh@1528 -- # bdfs=() 00:04:47.152 00:03:12 -- common/autotest_common.sh@1528 -- # local bdfs 00:04:47.152 00:03:12 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:04:47.152 00:03:12 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:04:47.152 00:03:12 -- common/autotest_common.sh@1508 -- # bdfs=() 00:04:47.152 00:03:12 -- common/autotest_common.sh@1508 -- # local bdfs 00:04:47.152 00:03:12 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:47.152 00:03:12 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:47.152 00:03:12 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:04:47.152 00:03:12 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:04:47.152 00:03:12 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:04:47.152 00:03:12 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:50.448 Waiting for block devices as requested 00:04:50.448 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:50.448 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 
00:04:50.448 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:50.448 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:50.448 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:50.448 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:50.448 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:50.706 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:50.706 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:50.706 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:50.965 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:50.965 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:50.965 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:51.225 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:51.225 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:51.225 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:51.484 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:04:51.484 00:03:16 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:04:51.484 00:03:16 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:04:51.484 00:03:16 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:04:51.484 00:03:16 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:04:51.484 00:03:16 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:51.484 00:03:16 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:04:51.484 00:03:16 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:51.484 00:03:16 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:04:51.484 00:03:17 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:04:51.484 00:03:17 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:04:51.484 00:03:17 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:04:51.484 00:03:17 
-- common/autotest_common.sh@1540 -- # grep oacs 00:04:51.484 00:03:17 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:51.484 00:03:17 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:04:51.484 00:03:17 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:04:51.484 00:03:17 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:04:51.484 00:03:17 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:04:51.484 00:03:17 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:04:51.484 00:03:17 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:04:51.484 00:03:17 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:04:51.484 00:03:17 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:04:51.484 00:03:17 -- common/autotest_common.sh@1552 -- # continue 00:04:51.484 00:03:17 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:04:51.484 00:03:17 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:51.484 00:03:17 -- common/autotest_common.sh@10 -- # set +x 00:04:51.744 00:03:17 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:04:51.744 00:03:17 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:51.744 00:03:17 -- common/autotest_common.sh@10 -- # set +x 00:04:51.744 00:03:17 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:55.033 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 
00:04:55.033 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:55.033 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:56.954 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:56.954 00:03:22 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:04:56.954 00:03:22 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:56.954 00:03:22 -- common/autotest_common.sh@10 -- # set +x 00:04:56.954 00:03:22 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:04:56.954 00:03:22 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:04:56.954 00:03:22 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:04:56.954 00:03:22 -- common/autotest_common.sh@1572 -- # bdfs=() 00:04:56.954 00:03:22 -- common/autotest_common.sh@1572 -- # local bdfs 00:04:56.954 00:03:22 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:04:56.954 00:03:22 -- common/autotest_common.sh@1508 -- # bdfs=() 00:04:56.954 00:03:22 -- common/autotest_common.sh@1508 -- # local bdfs 00:04:56.954 00:03:22 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:56.954 00:03:22 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:56.954 00:03:22 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:04:56.954 00:03:22 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:04:56.954 00:03:22 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:04:56.954 00:03:22 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:04:56.954 00:03:22 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:04:56.954 00:03:22 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:04:56.954 00:03:22 -- 
common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:56.954 00:03:22 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:04:56.954 00:03:22 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:04:56.954 00:03:22 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:04:56.954 00:03:22 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=2694796 00:04:56.954 00:03:22 -- common/autotest_common.sh@1593 -- # waitforlisten 2694796 00:04:56.954 00:03:22 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:56.954 00:03:22 -- common/autotest_common.sh@829 -- # '[' -z 2694796 ']' 00:04:56.954 00:03:22 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:56.954 00:03:22 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:56.954 00:03:22 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:56.954 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:56.954 00:03:22 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:56.954 00:03:22 -- common/autotest_common.sh@10 -- # set +x 00:04:56.954 [2024-11-30 00:03:22.287545] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:04:56.954 [2024-11-30 00:03:22.287612] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2694796 ] 00:04:56.954 EAL: No free 2048 kB hugepages reported on node 1 00:04:56.954 [2024-11-30 00:03:22.355513] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:56.954 [2024-11-30 00:03:22.430117] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:56.954 [2024-11-30 00:03:22.430262] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:57.889 00:03:23 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:57.889 00:03:23 -- common/autotest_common.sh@862 -- # return 0 00:04:57.889 00:03:23 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:04:57.889 00:03:23 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:04:57.889 00:03:23 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:01.185 nvme0n1 00:05:01.185 00:03:26 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:01.185 [2024-11-30 00:03:26.292030] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:01.185 request: 00:05:01.185 { 00:05:01.185 "nvme_ctrlr_name": "nvme0", 00:05:01.185 "password": "test", 00:05:01.185 "method": "bdev_nvme_opal_revert", 00:05:01.185 "req_id": 1 00:05:01.185 } 00:05:01.185 Got JSON-RPC error response 00:05:01.185 response: 00:05:01.185 { 00:05:01.185 "code": -32602, 00:05:01.185 "message": "Invalid parameters" 00:05:01.185 } 00:05:01.185 00:03:26 -- common/autotest_common.sh@1599 -- # true 00:05:01.185 00:03:26 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 
00:05:01.185 00:03:26 -- common/autotest_common.sh@1603 -- # killprocess 2694796 00:05:01.185 00:03:26 -- common/autotest_common.sh@936 -- # '[' -z 2694796 ']' 00:05:01.185 00:03:26 -- common/autotest_common.sh@940 -- # kill -0 2694796 00:05:01.185 00:03:26 -- common/autotest_common.sh@941 -- # uname 00:05:01.185 00:03:26 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:01.185 00:03:26 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2694796 00:05:01.185 00:03:26 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:01.185 00:03:26 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:01.185 00:03:26 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2694796' 00:05:01.185 killing process with pid 2694796 00:05:01.185 00:03:26 -- common/autotest_common.sh@955 -- # kill 2694796 00:05:01.185 00:03:26 -- common/autotest_common.sh@960 -- # wait 2694796 00:05:03.087 00:03:28 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:05:03.087 00:03:28 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:05:03.087 00:03:28 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:03.087 00:03:28 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:03.087 00:03:28 -- spdk/autotest.sh@160 -- # timing_enter lib 00:05:03.087 00:03:28 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:03.087 00:03:28 -- common/autotest_common.sh@10 -- # set +x 00:05:03.087 00:03:28 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:03.087 00:03:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:03.087 00:03:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:03.087 00:03:28 -- common/autotest_common.sh@10 -- # set +x 00:05:03.087 ************************************ 00:05:03.087 START TEST env 00:05:03.087 ************************************ 00:05:03.087 00:03:28 -- common/autotest_common.sh@1114 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:03.346 * Looking for test storage... 00:05:03.346 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:03.346 00:03:28 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:03.346 00:03:28 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:03.346 00:03:28 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:03.346 00:03:28 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:03.346 00:03:28 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:03.346 00:03:28 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:03.346 00:03:28 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:03.346 00:03:28 -- scripts/common.sh@335 -- # IFS=.-: 00:05:03.346 00:03:28 -- scripts/common.sh@335 -- # read -ra ver1 00:05:03.347 00:03:28 -- scripts/common.sh@336 -- # IFS=.-: 00:05:03.347 00:03:28 -- scripts/common.sh@336 -- # read -ra ver2 00:05:03.347 00:03:28 -- scripts/common.sh@337 -- # local 'op=<' 00:05:03.347 00:03:28 -- scripts/common.sh@339 -- # ver1_l=2 00:05:03.347 00:03:28 -- scripts/common.sh@340 -- # ver2_l=1 00:05:03.347 00:03:28 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:03.347 00:03:28 -- scripts/common.sh@343 -- # case "$op" in 00:05:03.347 00:03:28 -- scripts/common.sh@344 -- # : 1 00:05:03.347 00:03:28 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:03.347 00:03:28 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:03.347 00:03:28 -- scripts/common.sh@364 -- # decimal 1 00:05:03.347 00:03:28 -- scripts/common.sh@352 -- # local d=1 00:05:03.347 00:03:28 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:03.347 00:03:28 -- scripts/common.sh@354 -- # echo 1 00:05:03.347 00:03:28 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:03.347 00:03:28 -- scripts/common.sh@365 -- # decimal 2 00:05:03.347 00:03:28 -- scripts/common.sh@352 -- # local d=2 00:05:03.347 00:03:28 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:03.347 00:03:28 -- scripts/common.sh@354 -- # echo 2 00:05:03.347 00:03:28 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:03.347 00:03:28 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:03.347 00:03:28 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:03.347 00:03:28 -- scripts/common.sh@367 -- # return 0 00:05:03.347 00:03:28 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:03.347 00:03:28 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:03.347 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.347 --rc genhtml_branch_coverage=1 00:05:03.347 --rc genhtml_function_coverage=1 00:05:03.347 --rc genhtml_legend=1 00:05:03.347 --rc geninfo_all_blocks=1 00:05:03.347 --rc geninfo_unexecuted_blocks=1 00:05:03.347 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.347 ' 00:05:03.347 00:03:28 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:03.347 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.347 --rc genhtml_branch_coverage=1 00:05:03.347 --rc genhtml_function_coverage=1 00:05:03.347 --rc genhtml_legend=1 00:05:03.347 --rc geninfo_all_blocks=1 00:05:03.347 --rc geninfo_unexecuted_blocks=1 00:05:03.347 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.347 ' 00:05:03.347 00:03:28 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:03.347 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.347 --rc genhtml_branch_coverage=1 00:05:03.347 --rc genhtml_function_coverage=1 00:05:03.347 --rc genhtml_legend=1 00:05:03.347 --rc geninfo_all_blocks=1 00:05:03.347 --rc geninfo_unexecuted_blocks=1 00:05:03.347 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.347 ' 00:05:03.347 00:03:28 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:03.347 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:03.347 --rc genhtml_branch_coverage=1 00:05:03.347 --rc genhtml_function_coverage=1 00:05:03.347 --rc genhtml_legend=1 00:05:03.347 --rc geninfo_all_blocks=1 00:05:03.347 --rc geninfo_unexecuted_blocks=1 00:05:03.347 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:03.347 ' 00:05:03.347 00:03:28 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:03.347 00:03:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:03.347 00:03:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:03.347 00:03:28 -- common/autotest_common.sh@10 -- # set +x 00:05:03.347 ************************************ 00:05:03.347 START TEST env_memory 00:05:03.347 ************************************ 00:05:03.347 00:03:28 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:03.347 00:05:03.347 00:05:03.347 CUnit - A unit testing framework for C - Version 2.1-3 00:05:03.347 http://cunit.sourceforge.net/ 00:05:03.347 00:05:03.347 00:05:03.347 Suite: memory 00:05:03.347 Test: alloc and free memory map ...[2024-11-30 00:03:28.825396] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:03.347 passed 
00:05:03.347 Test: mem map translation ...[2024-11-30 00:03:28.838528] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:03.347 [2024-11-30 00:03:28.838550] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:03.347 [2024-11-30 00:03:28.838579] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:03.347 [2024-11-30 00:03:28.838588] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:03.347 passed 00:05:03.347 Test: mem map registration ...[2024-11-30 00:03:28.859237] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:03.347 [2024-11-30 00:03:28.859253] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:03.347 passed 00:05:03.347 Test: mem map adjacent registrations ...passed 00:05:03.347 00:05:03.347 Run Summary: Type Total Ran Passed Failed Inactive 00:05:03.347 suites 1 1 n/a 0 0 00:05:03.347 tests 4 4 4 0 0 00:05:03.347 asserts 152 152 152 0 n/a 00:05:03.347 00:05:03.347 Elapsed time = 0.083 seconds 00:05:03.347 00:05:03.347 real 0m0.097s 00:05:03.347 user 0m0.083s 00:05:03.347 sys 0m0.013s 00:05:03.347 00:03:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:03.347 00:03:28 -- common/autotest_common.sh@10 -- # set +x 00:05:03.347 ************************************ 00:05:03.347 END TEST env_memory 
00:05:03.347 ************************************ 00:05:03.607 00:03:28 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:03.607 00:03:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:03.607 00:03:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:03.607 00:03:28 -- common/autotest_common.sh@10 -- # set +x 00:05:03.607 ************************************ 00:05:03.607 START TEST env_vtophys 00:05:03.607 ************************************ 00:05:03.607 00:03:28 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:03.608 EAL: lib.eal log level changed from notice to debug 00:05:03.608 EAL: Detected lcore 0 as core 0 on socket 0 00:05:03.608 EAL: Detected lcore 1 as core 1 on socket 0 00:05:03.608 EAL: Detected lcore 2 as core 2 on socket 0 00:05:03.608 EAL: Detected lcore 3 as core 3 on socket 0 00:05:03.608 EAL: Detected lcore 4 as core 4 on socket 0 00:05:03.608 EAL: Detected lcore 5 as core 5 on socket 0 00:05:03.608 EAL: Detected lcore 6 as core 6 on socket 0 00:05:03.608 EAL: Detected lcore 7 as core 8 on socket 0 00:05:03.608 EAL: Detected lcore 8 as core 9 on socket 0 00:05:03.608 EAL: Detected lcore 9 as core 10 on socket 0 00:05:03.608 EAL: Detected lcore 10 as core 11 on socket 0 00:05:03.608 EAL: Detected lcore 11 as core 12 on socket 0 00:05:03.608 EAL: Detected lcore 12 as core 13 on socket 0 00:05:03.608 EAL: Detected lcore 13 as core 14 on socket 0 00:05:03.608 EAL: Detected lcore 14 as core 16 on socket 0 00:05:03.608 EAL: Detected lcore 15 as core 17 on socket 0 00:05:03.608 EAL: Detected lcore 16 as core 18 on socket 0 00:05:03.608 EAL: Detected lcore 17 as core 19 on socket 0 00:05:03.608 EAL: Detected lcore 18 as core 20 on socket 0 00:05:03.608 EAL: Detected lcore 19 as core 21 on socket 0 00:05:03.608 EAL: Detected lcore 20 as core 22 on socket 0 00:05:03.608 EAL: Detected 
lcore 21 as core 24 on socket 0 00:05:03.608 EAL: Detected lcore 22 as core 25 on socket 0 00:05:03.608 EAL: Detected lcore 23 as core 26 on socket 0 00:05:03.608 EAL: Detected lcore 24 as core 27 on socket 0 00:05:03.608 EAL: Detected lcore 25 as core 28 on socket 0 00:05:03.608 EAL: Detected lcore 26 as core 29 on socket 0 00:05:03.608 EAL: Detected lcore 27 as core 30 on socket 0 00:05:03.608 EAL: Detected lcore 28 as core 0 on socket 1 00:05:03.608 EAL: Detected lcore 29 as core 1 on socket 1 00:05:03.608 EAL: Detected lcore 30 as core 2 on socket 1 00:05:03.608 EAL: Detected lcore 31 as core 3 on socket 1 00:05:03.608 EAL: Detected lcore 32 as core 4 on socket 1 00:05:03.608 EAL: Detected lcore 33 as core 5 on socket 1 00:05:03.608 EAL: Detected lcore 34 as core 6 on socket 1 00:05:03.608 EAL: Detected lcore 35 as core 8 on socket 1 00:05:03.608 EAL: Detected lcore 36 as core 9 on socket 1 00:05:03.608 EAL: Detected lcore 37 as core 10 on socket 1 00:05:03.608 EAL: Detected lcore 38 as core 11 on socket 1 00:05:03.608 EAL: Detected lcore 39 as core 12 on socket 1 00:05:03.608 EAL: Detected lcore 40 as core 13 on socket 1 00:05:03.608 EAL: Detected lcore 41 as core 14 on socket 1 00:05:03.608 EAL: Detected lcore 42 as core 16 on socket 1 00:05:03.608 EAL: Detected lcore 43 as core 17 on socket 1 00:05:03.608 EAL: Detected lcore 44 as core 18 on socket 1 00:05:03.608 EAL: Detected lcore 45 as core 19 on socket 1 00:05:03.608 EAL: Detected lcore 46 as core 20 on socket 1 00:05:03.608 EAL: Detected lcore 47 as core 21 on socket 1 00:05:03.608 EAL: Detected lcore 48 as core 22 on socket 1 00:05:03.608 EAL: Detected lcore 49 as core 24 on socket 1 00:05:03.608 EAL: Detected lcore 50 as core 25 on socket 1 00:05:03.608 EAL: Detected lcore 51 as core 26 on socket 1 00:05:03.608 EAL: Detected lcore 52 as core 27 on socket 1 00:05:03.608 EAL: Detected lcore 53 as core 28 on socket 1 00:05:03.608 EAL: Detected lcore 54 as core 29 on socket 1 00:05:03.608 EAL: Detected 
lcore 55 as core 30 on socket 1 00:05:03.608 EAL: Detected lcore 56 as core 0 on socket 0 00:05:03.608 EAL: Detected lcore 57 as core 1 on socket 0 00:05:03.608 EAL: Detected lcore 58 as core 2 on socket 0 00:05:03.608 EAL: Detected lcore 59 as core 3 on socket 0 00:05:03.608 EAL: Detected lcore 60 as core 4 on socket 0 00:05:03.608 EAL: Detected lcore 61 as core 5 on socket 0 00:05:03.608 EAL: Detected lcore 62 as core 6 on socket 0 00:05:03.608 EAL: Detected lcore 63 as core 8 on socket 0 00:05:03.608 EAL: Detected lcore 64 as core 9 on socket 0 00:05:03.608 EAL: Detected lcore 65 as core 10 on socket 0 00:05:03.608 EAL: Detected lcore 66 as core 11 on socket 0 00:05:03.608 EAL: Detected lcore 67 as core 12 on socket 0 00:05:03.608 EAL: Detected lcore 68 as core 13 on socket 0 00:05:03.608 EAL: Detected lcore 69 as core 14 on socket 0 00:05:03.608 EAL: Detected lcore 70 as core 16 on socket 0 00:05:03.608 EAL: Detected lcore 71 as core 17 on socket 0 00:05:03.608 EAL: Detected lcore 72 as core 18 on socket 0 00:05:03.608 EAL: Detected lcore 73 as core 19 on socket 0 00:05:03.608 EAL: Detected lcore 74 as core 20 on socket 0 00:05:03.608 EAL: Detected lcore 75 as core 21 on socket 0 00:05:03.608 EAL: Detected lcore 76 as core 22 on socket 0 00:05:03.608 EAL: Detected lcore 77 as core 24 on socket 0 00:05:03.608 EAL: Detected lcore 78 as core 25 on socket 0 00:05:03.608 EAL: Detected lcore 79 as core 26 on socket 0 00:05:03.608 EAL: Detected lcore 80 as core 27 on socket 0 00:05:03.608 EAL: Detected lcore 81 as core 28 on socket 0 00:05:03.608 EAL: Detected lcore 82 as core 29 on socket 0 00:05:03.608 EAL: Detected lcore 83 as core 30 on socket 0 00:05:03.608 EAL: Detected lcore 84 as core 0 on socket 1 00:05:03.608 EAL: Detected lcore 85 as core 1 on socket 1 00:05:03.608 EAL: Detected lcore 86 as core 2 on socket 1 00:05:03.608 EAL: Detected lcore 87 as core 3 on socket 1 00:05:03.608 EAL: Detected lcore 88 as core 4 on socket 1 00:05:03.608 EAL: Detected lcore 
89 as core 5 on socket 1 00:05:03.608 EAL: Detected lcore 90 as core 6 on socket 1 00:05:03.608 EAL: Detected lcore 91 as core 8 on socket 1 00:05:03.608 EAL: Detected lcore 92 as core 9 on socket 1 00:05:03.608 EAL: Detected lcore 93 as core 10 on socket 1 00:05:03.608 EAL: Detected lcore 94 as core 11 on socket 1 00:05:03.608 EAL: Detected lcore 95 as core 12 on socket 1 00:05:03.608 EAL: Detected lcore 96 as core 13 on socket 1 00:05:03.608 EAL: Detected lcore 97 as core 14 on socket 1 00:05:03.608 EAL: Detected lcore 98 as core 16 on socket 1 00:05:03.608 EAL: Detected lcore 99 as core 17 on socket 1 00:05:03.608 EAL: Detected lcore 100 as core 18 on socket 1 00:05:03.608 EAL: Detected lcore 101 as core 19 on socket 1 00:05:03.608 EAL: Detected lcore 102 as core 20 on socket 1 00:05:03.608 EAL: Detected lcore 103 as core 21 on socket 1 00:05:03.608 EAL: Detected lcore 104 as core 22 on socket 1 00:05:03.608 EAL: Detected lcore 105 as core 24 on socket 1 00:05:03.608 EAL: Detected lcore 106 as core 25 on socket 1 00:05:03.608 EAL: Detected lcore 107 as core 26 on socket 1 00:05:03.608 EAL: Detected lcore 108 as core 27 on socket 1 00:05:03.608 EAL: Detected lcore 109 as core 28 on socket 1 00:05:03.608 EAL: Detected lcore 110 as core 29 on socket 1 00:05:03.608 EAL: Detected lcore 111 as core 30 on socket 1 00:05:03.608 EAL: Maximum logical cores by configuration: 128 00:05:03.608 EAL: Detected CPU lcores: 112 00:05:03.608 EAL: Detected NUMA nodes: 2 00:05:03.608 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:03.608 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:03.608 EAL: Checking presence of .so 'librte_eal.so' 00:05:03.608 EAL: Detected static linkage of DPDK 00:05:03.608 EAL: No shared files mode enabled, IPC will be disabled 00:05:03.608 EAL: Bus pci wants IOVA as 'DC' 00:05:03.608 EAL: Buses did not request a specific IOVA mode. 00:05:03.608 EAL: IOMMU is available, selecting IOVA as VA mode. 
00:05:03.608 EAL: Selected IOVA mode 'VA' 00:05:03.608 EAL: No free 2048 kB hugepages reported on node 1 00:05:03.608 EAL: Probing VFIO support... 00:05:03.608 EAL: IOMMU type 1 (Type 1) is supported 00:05:03.608 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:03.608 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:03.608 EAL: VFIO support initialized 00:05:03.608 EAL: Ask a virtual area of 0x2e000 bytes 00:05:03.608 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:03.608 EAL: Setting up physically contiguous memory... 00:05:03.608 EAL: Setting maximum number of open files to 524288 00:05:03.608 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:03.608 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:03.608 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:03.608 EAL: Ask a virtual area of 0x61000 bytes 00:05:03.608 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:03.608 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:03.608 EAL: Ask a virtual area of 0x400000000 bytes 00:05:03.608 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:03.608 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:03.608 EAL: Ask a virtual area of 0x61000 bytes 00:05:03.608 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:03.608 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:03.608 EAL: Ask a virtual area of 0x400000000 bytes 00:05:03.608 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:03.608 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:03.608 EAL: Ask a virtual area of 0x61000 bytes 00:05:03.608 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:03.608 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:03.608 EAL: Ask a virtual area of 0x400000000 bytes 00:05:03.608 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 
00:05:03.608 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:03.608 EAL: Ask a virtual area of 0x61000 bytes 00:05:03.608 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:03.608 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:03.608 EAL: Ask a virtual area of 0x400000000 bytes 00:05:03.608 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:03.608 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:03.608 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:03.608 EAL: Ask a virtual area of 0x61000 bytes 00:05:03.608 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:03.608 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:03.608 EAL: Ask a virtual area of 0x400000000 bytes 00:05:03.608 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:03.608 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:03.608 EAL: Ask a virtual area of 0x61000 bytes 00:05:03.608 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:03.608 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:03.608 EAL: Ask a virtual area of 0x400000000 bytes 00:05:03.608 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:03.608 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:03.608 EAL: Ask a virtual area of 0x61000 bytes 00:05:03.608 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:03.608 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:03.608 EAL: Ask a virtual area of 0x400000000 bytes 00:05:03.608 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:03.608 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:03.608 EAL: Ask a virtual area of 0x61000 bytes 00:05:03.608 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:03.608 EAL: Memseg list allocated at socket 1, page 
size 0x800kB 00:05:03.608 EAL: Ask a virtual area of 0x400000000 bytes 00:05:03.609 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:03.609 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:03.609 EAL: Hugepages will be freed exactly as allocated. 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: TSC frequency is ~2500000 KHz 00:05:03.609 EAL: Main lcore 0 is ready (tid=7fbaf773aa00;cpuset=[0]) 00:05:03.609 EAL: Trying to obtain current memory policy. 00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.609 EAL: Restoring previous memory policy: 0 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was expanded by 2MB 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Mem event callback 'spdk:(nil)' registered 00:05:03.609 00:05:03.609 00:05:03.609 CUnit - A unit testing framework for C - Version 2.1-3 00:05:03.609 http://cunit.sourceforge.net/ 00:05:03.609 00:05:03.609 00:05:03.609 Suite: components_suite 00:05:03.609 Test: vtophys_malloc_test ...passed 00:05:03.609 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.609 EAL: Restoring previous memory policy: 4 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was expanded by 4MB 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was shrunk by 4MB 00:05:03.609 EAL: Trying to obtain current memory policy. 
00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.609 EAL: Restoring previous memory policy: 4 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was expanded by 6MB 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was shrunk by 6MB 00:05:03.609 EAL: Trying to obtain current memory policy. 00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.609 EAL: Restoring previous memory policy: 4 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was expanded by 10MB 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was shrunk by 10MB 00:05:03.609 EAL: Trying to obtain current memory policy. 00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.609 EAL: Restoring previous memory policy: 4 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was expanded by 18MB 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was shrunk by 18MB 00:05:03.609 EAL: Trying to obtain current memory policy. 
00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.609 EAL: Restoring previous memory policy: 4 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was expanded by 34MB 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was shrunk by 34MB 00:05:03.609 EAL: Trying to obtain current memory policy. 00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.609 EAL: Restoring previous memory policy: 4 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was expanded by 66MB 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was shrunk by 66MB 00:05:03.609 EAL: Trying to obtain current memory policy. 00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.609 EAL: Restoring previous memory policy: 4 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was expanded by 130MB 00:05:03.609 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.609 EAL: request: mp_malloc_sync 00:05:03.609 EAL: No shared files mode enabled, IPC is disabled 00:05:03.609 EAL: Heap on socket 0 was shrunk by 130MB 00:05:03.609 EAL: Trying to obtain current memory policy. 
00:05:03.609 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.868 EAL: Restoring previous memory policy: 4 00:05:03.868 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.868 EAL: request: mp_malloc_sync 00:05:03.868 EAL: No shared files mode enabled, IPC is disabled 00:05:03.868 EAL: Heap on socket 0 was expanded by 258MB 00:05:03.868 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.868 EAL: request: mp_malloc_sync 00:05:03.868 EAL: No shared files mode enabled, IPC is disabled 00:05:03.868 EAL: Heap on socket 0 was shrunk by 258MB 00:05:03.868 EAL: Trying to obtain current memory policy. 00:05:03.868 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:03.868 EAL: Restoring previous memory policy: 4 00:05:03.868 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.868 EAL: request: mp_malloc_sync 00:05:03.868 EAL: No shared files mode enabled, IPC is disabled 00:05:03.868 EAL: Heap on socket 0 was expanded by 514MB 00:05:04.127 EAL: Calling mem event callback 'spdk:(nil)' 00:05:04.127 EAL: request: mp_malloc_sync 00:05:04.127 EAL: No shared files mode enabled, IPC is disabled 00:05:04.127 EAL: Heap on socket 0 was shrunk by 514MB 00:05:04.127 EAL: Trying to obtain current memory policy. 
00:05:04.127 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:04.387 EAL: Restoring previous memory policy: 4 00:05:04.387 EAL: Calling mem event callback 'spdk:(nil)' 00:05:04.387 EAL: request: mp_malloc_sync 00:05:04.387 EAL: No shared files mode enabled, IPC is disabled 00:05:04.387 EAL: Heap on socket 0 was expanded by 1026MB 00:05:04.387 EAL: Calling mem event callback 'spdk:(nil)' 00:05:04.647 EAL: request: mp_malloc_sync 00:05:04.647 EAL: No shared files mode enabled, IPC is disabled 00:05:04.647 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:04.647 passed 00:05:04.647 00:05:04.647 Run Summary: Type Total Ran Passed Failed Inactive 00:05:04.647 suites 1 1 n/a 0 0 00:05:04.647 tests 2 2 2 0 0 00:05:04.647 asserts 497 497 497 0 n/a 00:05:04.647 00:05:04.647 Elapsed time = 0.963 seconds 00:05:04.647 EAL: Calling mem event callback 'spdk:(nil)' 00:05:04.647 EAL: request: mp_malloc_sync 00:05:04.647 EAL: No shared files mode enabled, IPC is disabled 00:05:04.647 EAL: Heap on socket 0 was shrunk by 2MB 00:05:04.647 EAL: No shared files mode enabled, IPC is disabled 00:05:04.647 EAL: No shared files mode enabled, IPC is disabled 00:05:04.647 EAL: No shared files mode enabled, IPC is disabled 00:05:04.647 00:05:04.647 real 0m1.088s 00:05:04.647 user 0m0.631s 00:05:04.647 sys 0m0.424s 00:05:04.647 00:03:30 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:04.647 00:03:30 -- common/autotest_common.sh@10 -- # set +x 00:05:04.647 ************************************ 00:05:04.647 END TEST env_vtophys 00:05:04.647 ************************************ 00:05:04.647 00:03:30 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:04.647 00:03:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:04.647 00:03:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:04.647 00:03:30 -- common/autotest_common.sh@10 -- # set +x 00:05:04.647 ************************************ 
00:05:04.647 START TEST env_pci 00:05:04.647 ************************************ 00:05:04.647 00:03:30 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:04.647 00:05:04.647 00:05:04.647 CUnit - A unit testing framework for C - Version 2.1-3 00:05:04.647 http://cunit.sourceforge.net/ 00:05:04.647 00:05:04.647 00:05:04.647 Suite: pci 00:05:04.647 Test: pci_hook ...[2024-11-30 00:03:30.088159] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2696357 has claimed it 00:05:04.647 EAL: Cannot find device (10000:00:01.0) 00:05:04.647 EAL: Failed to attach device on primary process 00:05:04.647 passed 00:05:04.647 00:05:04.647 Run Summary: Type Total Ran Passed Failed Inactive 00:05:04.647 suites 1 1 n/a 0 0 00:05:04.647 tests 1 1 1 0 0 00:05:04.647 asserts 25 25 25 0 n/a 00:05:04.647 00:05:04.647 Elapsed time = 0.035 seconds 00:05:04.647 00:05:04.647 real 0m0.054s 00:05:04.647 user 0m0.015s 00:05:04.647 sys 0m0.039s 00:05:04.647 00:03:30 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:04.647 00:03:30 -- common/autotest_common.sh@10 -- # set +x 00:05:04.647 ************************************ 00:05:04.647 END TEST env_pci 00:05:04.647 ************************************ 00:05:04.647 00:03:30 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:04.647 00:03:30 -- env/env.sh@15 -- # uname 00:05:04.647 00:03:30 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:04.647 00:03:30 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:04.647 00:03:30 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:04.647 00:03:30 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:04.647 00:03:30 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:05:04.647 00:03:30 -- common/autotest_common.sh@10 -- # set +x 00:05:04.647 ************************************ 00:05:04.647 START TEST env_dpdk_post_init 00:05:04.647 ************************************ 00:05:04.647 00:03:30 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:04.907 EAL: Detected CPU lcores: 112 00:05:04.907 EAL: Detected NUMA nodes: 2 00:05:04.907 EAL: Detected static linkage of DPDK 00:05:04.907 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:04.907 EAL: Selected IOVA mode 'VA' 00:05:04.907 EAL: No free 2048 kB hugepages reported on node 1 00:05:04.907 EAL: VFIO support initialized 00:05:04.907 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:04.907 EAL: Using IOMMU type 1 (Type 1) 00:05:05.846 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:09.133 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:09.133 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:09.392 Starting DPDK initialization... 00:05:09.392 Starting SPDK post initialization... 00:05:09.392 SPDK NVMe probe 00:05:09.392 Attaching to 0000:d8:00.0 00:05:09.392 Attached to 0000:d8:00.0 00:05:09.392 Cleaning up... 
00:05:09.392 00:05:09.392 real 0m4.699s 00:05:09.392 user 0m3.558s 00:05:09.392 sys 0m0.387s 00:05:09.392 00:03:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:09.392 00:03:34 -- common/autotest_common.sh@10 -- # set +x 00:05:09.392 ************************************ 00:05:09.392 END TEST env_dpdk_post_init 00:05:09.392 ************************************ 00:05:09.392 00:03:34 -- env/env.sh@26 -- # uname 00:05:09.392 00:03:34 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:09.392 00:03:34 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:09.392 00:03:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:09.392 00:03:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:09.392 00:03:34 -- common/autotest_common.sh@10 -- # set +x 00:05:09.392 ************************************ 00:05:09.392 START TEST env_mem_callbacks 00:05:09.392 ************************************ 00:05:09.392 00:03:34 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:09.658 EAL: Detected CPU lcores: 112 00:05:09.658 EAL: Detected NUMA nodes: 2 00:05:09.658 EAL: Detected static linkage of DPDK 00:05:09.658 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:09.658 EAL: Selected IOVA mode 'VA' 00:05:09.658 EAL: No free 2048 kB hugepages reported on node 1 00:05:09.658 EAL: VFIO support initialized 00:05:09.658 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:09.658 00:05:09.658 00:05:09.658 CUnit - A unit testing framework for C - Version 2.1-3 00:05:09.658 http://cunit.sourceforge.net/ 00:05:09.658 00:05:09.658 00:05:09.658 Suite: memory 00:05:09.658 Test: test ... 
00:05:09.658 register 0x200000200000 2097152 00:05:09.658 malloc 3145728 00:05:09.658 register 0x200000400000 4194304 00:05:09.658 buf 0x200000500000 len 3145728 PASSED 00:05:09.658 malloc 64 00:05:09.658 buf 0x2000004fff40 len 64 PASSED 00:05:09.658 malloc 4194304 00:05:09.658 register 0x200000800000 6291456 00:05:09.658 buf 0x200000a00000 len 4194304 PASSED 00:05:09.658 free 0x200000500000 3145728 00:05:09.658 free 0x2000004fff40 64 00:05:09.658 unregister 0x200000400000 4194304 PASSED 00:05:09.658 free 0x200000a00000 4194304 00:05:09.658 unregister 0x200000800000 6291456 PASSED 00:05:09.658 malloc 8388608 00:05:09.658 register 0x200000400000 10485760 00:05:09.658 buf 0x200000600000 len 8388608 PASSED 00:05:09.658 free 0x200000600000 8388608 00:05:09.658 unregister 0x200000400000 10485760 PASSED 00:05:09.658 passed 00:05:09.658 00:05:09.658 Run Summary: Type Total Ran Passed Failed Inactive 00:05:09.658 suites 1 1 n/a 0 0 00:05:09.658 tests 1 1 1 0 0 00:05:09.658 asserts 15 15 15 0 n/a 00:05:09.658 00:05:09.658 Elapsed time = 0.005 seconds 00:05:09.658 00:05:09.658 real 0m0.065s 00:05:09.658 user 0m0.021s 00:05:09.658 sys 0m0.044s 00:05:09.658 00:03:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:09.658 00:03:34 -- common/autotest_common.sh@10 -- # set +x 00:05:09.658 ************************************ 00:05:09.658 END TEST env_mem_callbacks 00:05:09.658 ************************************ 00:05:09.658 00:05:09.658 real 0m6.442s 00:05:09.658 user 0m4.484s 00:05:09.658 sys 0m1.228s 00:05:09.658 00:03:35 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:09.658 00:03:35 -- common/autotest_common.sh@10 -- # set +x 00:05:09.658 ************************************ 00:05:09.658 END TEST env 00:05:09.658 ************************************ 00:05:09.658 00:03:35 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:09.658 00:03:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 
']' 00:05:09.658 00:03:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:09.658 00:03:35 -- common/autotest_common.sh@10 -- # set +x 00:05:09.658 ************************************ 00:05:09.658 START TEST rpc 00:05:09.658 ************************************ 00:05:09.658 00:03:35 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:09.658 * Looking for test storage... 00:05:09.658 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:09.658 00:03:35 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:09.658 00:03:35 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:09.658 00:03:35 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:09.917 00:03:35 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:09.917 00:03:35 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:09.917 00:03:35 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:09.917 00:03:35 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:09.917 00:03:35 -- scripts/common.sh@335 -- # IFS=.-: 00:05:09.917 00:03:35 -- scripts/common.sh@335 -- # read -ra ver1 00:05:09.917 00:03:35 -- scripts/common.sh@336 -- # IFS=.-: 00:05:09.917 00:03:35 -- scripts/common.sh@336 -- # read -ra ver2 00:05:09.917 00:03:35 -- scripts/common.sh@337 -- # local 'op=<' 00:05:09.917 00:03:35 -- scripts/common.sh@339 -- # ver1_l=2 00:05:09.917 00:03:35 -- scripts/common.sh@340 -- # ver2_l=1 00:05:09.917 00:03:35 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:09.917 00:03:35 -- scripts/common.sh@343 -- # case "$op" in 00:05:09.917 00:03:35 -- scripts/common.sh@344 -- # : 1 00:05:09.917 00:03:35 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:09.917 00:03:35 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:09.917 00:03:35 -- scripts/common.sh@364 -- # decimal 1 00:05:09.917 00:03:35 -- scripts/common.sh@352 -- # local d=1 00:05:09.917 00:03:35 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:09.917 00:03:35 -- scripts/common.sh@354 -- # echo 1 00:05:09.917 00:03:35 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:09.917 00:03:35 -- scripts/common.sh@365 -- # decimal 2 00:05:09.917 00:03:35 -- scripts/common.sh@352 -- # local d=2 00:05:09.917 00:03:35 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:09.917 00:03:35 -- scripts/common.sh@354 -- # echo 2 00:05:09.917 00:03:35 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:09.917 00:03:35 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:09.917 00:03:35 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:09.917 00:03:35 -- scripts/common.sh@367 -- # return 0 00:05:09.917 00:03:35 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:09.917 00:03:35 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:09.917 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.917 --rc genhtml_branch_coverage=1 00:05:09.917 --rc genhtml_function_coverage=1 00:05:09.917 --rc genhtml_legend=1 00:05:09.917 --rc geninfo_all_blocks=1 00:05:09.917 --rc geninfo_unexecuted_blocks=1 00:05:09.917 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.917 ' 00:05:09.917 00:03:35 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:09.917 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.917 --rc genhtml_branch_coverage=1 00:05:09.917 --rc genhtml_function_coverage=1 00:05:09.917 --rc genhtml_legend=1 00:05:09.917 --rc geninfo_all_blocks=1 00:05:09.917 --rc geninfo_unexecuted_blocks=1 00:05:09.917 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.917 ' 00:05:09.917 00:03:35 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:09.917 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.917 --rc genhtml_branch_coverage=1 00:05:09.917 --rc genhtml_function_coverage=1 00:05:09.917 --rc genhtml_legend=1 00:05:09.917 --rc geninfo_all_blocks=1 00:05:09.917 --rc geninfo_unexecuted_blocks=1 00:05:09.917 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.917 ' 00:05:09.917 00:03:35 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:09.917 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.917 --rc genhtml_branch_coverage=1 00:05:09.917 --rc genhtml_function_coverage=1 00:05:09.917 --rc genhtml_legend=1 00:05:09.917 --rc geninfo_all_blocks=1 00:05:09.917 --rc geninfo_unexecuted_blocks=1 00:05:09.917 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.917 ' 00:05:09.917 00:03:35 -- rpc/rpc.sh@65 -- # spdk_pid=2697284 00:05:09.917 00:03:35 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:09.917 00:03:35 -- rpc/rpc.sh@67 -- # waitforlisten 2697284 00:05:09.917 00:03:35 -- common/autotest_common.sh@829 -- # '[' -z 2697284 ']' 00:05:09.917 00:03:35 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:09.917 00:03:35 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:09.917 00:03:35 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:09.917 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:09.917 00:03:35 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:09.917 00:03:35 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:09.917 00:03:35 -- common/autotest_common.sh@10 -- # set +x 00:05:09.917 [2024-11-30 00:03:35.279776] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:09.917 [2024-11-30 00:03:35.279840] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2697284 ] 00:05:09.917 EAL: No free 2048 kB hugepages reported on node 1 00:05:09.917 [2024-11-30 00:03:35.346187] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:09.917 [2024-11-30 00:03:35.420787] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:09.917 [2024-11-30 00:03:35.420925] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:09.917 [2024-11-30 00:03:35.420942] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2697284' to capture a snapshot of events at runtime. 00:05:09.917 [2024-11-30 00:03:35.420956] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2697284 for offline analysis/debug. 
00:05:09.917 [2024-11-30 00:03:35.420980] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:10.542 00:03:36 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:10.542 00:03:36 -- common/autotest_common.sh@862 -- # return 0 00:05:10.542 00:03:36 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:10.542 00:03:36 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:10.542 00:03:36 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:10.857 00:03:36 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:10.857 00:03:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:10.857 00:03:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:10.857 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.857 ************************************ 00:05:10.857 START TEST rpc_integrity 00:05:10.857 ************************************ 00:05:10.857 00:03:36 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:10.857 00:03:36 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:10.857 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:10.857 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.857 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:10.857 00:03:36 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:10.857 00:03:36 -- rpc/rpc.sh@13 -- # jq length 00:05:10.857 00:03:36 -- rpc/rpc.sh@13 
-- # '[' 0 == 0 ']' 00:05:10.857 00:03:36 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:10.857 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:10.857 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.857 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:10.857 00:03:36 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:10.857 00:03:36 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:10.857 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:10.857 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.857 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:10.857 00:03:36 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:10.857 { 00:05:10.857 "name": "Malloc0", 00:05:10.857 "aliases": [ 00:05:10.857 "80f559ac-5589-447e-9b03-f3dba0199233" 00:05:10.857 ], 00:05:10.857 "product_name": "Malloc disk", 00:05:10.857 "block_size": 512, 00:05:10.857 "num_blocks": 16384, 00:05:10.857 "uuid": "80f559ac-5589-447e-9b03-f3dba0199233", 00:05:10.857 "assigned_rate_limits": { 00:05:10.857 "rw_ios_per_sec": 0, 00:05:10.857 "rw_mbytes_per_sec": 0, 00:05:10.857 "r_mbytes_per_sec": 0, 00:05:10.857 "w_mbytes_per_sec": 0 00:05:10.857 }, 00:05:10.857 "claimed": false, 00:05:10.857 "zoned": false, 00:05:10.857 "supported_io_types": { 00:05:10.857 "read": true, 00:05:10.857 "write": true, 00:05:10.857 "unmap": true, 00:05:10.857 "write_zeroes": true, 00:05:10.857 "flush": true, 00:05:10.857 "reset": true, 00:05:10.857 "compare": false, 00:05:10.857 "compare_and_write": false, 00:05:10.857 "abort": true, 00:05:10.857 "nvme_admin": false, 00:05:10.857 "nvme_io": false 00:05:10.857 }, 00:05:10.857 "memory_domains": [ 00:05:10.857 { 00:05:10.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:10.857 "dma_device_type": 2 00:05:10.857 } 00:05:10.857 ], 00:05:10.857 "driver_specific": {} 00:05:10.857 } 00:05:10.857 ]' 00:05:10.857 00:03:36 -- rpc/rpc.sh@17 -- # jq length 00:05:10.857 00:03:36 -- rpc/rpc.sh@17 -- # '[' 
1 == 1 ']' 00:05:10.857 00:03:36 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:10.857 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:10.857 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.857 [2024-11-30 00:03:36.223834] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:10.857 [2024-11-30 00:03:36.223873] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:10.857 [2024-11-30 00:03:36.223903] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4cf3030 00:05:10.857 [2024-11-30 00:03:36.223917] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:10.857 [2024-11-30 00:03:36.224792] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:10.857 [2024-11-30 00:03:36.224817] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:10.857 Passthru0 00:05:10.857 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:10.857 00:03:36 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:10.857 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:10.857 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.857 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:10.857 00:03:36 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:10.857 { 00:05:10.857 "name": "Malloc0", 00:05:10.857 "aliases": [ 00:05:10.857 "80f559ac-5589-447e-9b03-f3dba0199233" 00:05:10.857 ], 00:05:10.857 "product_name": "Malloc disk", 00:05:10.857 "block_size": 512, 00:05:10.857 "num_blocks": 16384, 00:05:10.857 "uuid": "80f559ac-5589-447e-9b03-f3dba0199233", 00:05:10.857 "assigned_rate_limits": { 00:05:10.857 "rw_ios_per_sec": 0, 00:05:10.857 "rw_mbytes_per_sec": 0, 00:05:10.857 "r_mbytes_per_sec": 0, 00:05:10.857 "w_mbytes_per_sec": 0 00:05:10.857 }, 00:05:10.857 "claimed": true, 00:05:10.857 "claim_type": "exclusive_write", 
00:05:10.857 "zoned": false, 00:05:10.857 "supported_io_types": { 00:05:10.857 "read": true, 00:05:10.857 "write": true, 00:05:10.857 "unmap": true, 00:05:10.857 "write_zeroes": true, 00:05:10.857 "flush": true, 00:05:10.857 "reset": true, 00:05:10.857 "compare": false, 00:05:10.857 "compare_and_write": false, 00:05:10.857 "abort": true, 00:05:10.857 "nvme_admin": false, 00:05:10.857 "nvme_io": false 00:05:10.857 }, 00:05:10.857 "memory_domains": [ 00:05:10.857 { 00:05:10.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:10.857 "dma_device_type": 2 00:05:10.857 } 00:05:10.857 ], 00:05:10.857 "driver_specific": {} 00:05:10.857 }, 00:05:10.857 { 00:05:10.857 "name": "Passthru0", 00:05:10.857 "aliases": [ 00:05:10.857 "b45f7c1b-e409-5347-88ed-645356592107" 00:05:10.857 ], 00:05:10.857 "product_name": "passthru", 00:05:10.857 "block_size": 512, 00:05:10.857 "num_blocks": 16384, 00:05:10.857 "uuid": "b45f7c1b-e409-5347-88ed-645356592107", 00:05:10.857 "assigned_rate_limits": { 00:05:10.857 "rw_ios_per_sec": 0, 00:05:10.857 "rw_mbytes_per_sec": 0, 00:05:10.857 "r_mbytes_per_sec": 0, 00:05:10.857 "w_mbytes_per_sec": 0 00:05:10.857 }, 00:05:10.857 "claimed": false, 00:05:10.857 "zoned": false, 00:05:10.857 "supported_io_types": { 00:05:10.857 "read": true, 00:05:10.857 "write": true, 00:05:10.857 "unmap": true, 00:05:10.857 "write_zeroes": true, 00:05:10.857 "flush": true, 00:05:10.857 "reset": true, 00:05:10.857 "compare": false, 00:05:10.857 "compare_and_write": false, 00:05:10.857 "abort": true, 00:05:10.857 "nvme_admin": false, 00:05:10.857 "nvme_io": false 00:05:10.857 }, 00:05:10.857 "memory_domains": [ 00:05:10.857 { 00:05:10.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:10.857 "dma_device_type": 2 00:05:10.857 } 00:05:10.857 ], 00:05:10.857 "driver_specific": { 00:05:10.858 "passthru": { 00:05:10.858 "name": "Passthru0", 00:05:10.858 "base_bdev_name": "Malloc0" 00:05:10.858 } 00:05:10.858 } 00:05:10.858 } 00:05:10.858 ]' 00:05:10.858 00:03:36 -- 
rpc/rpc.sh@21 -- # jq length 00:05:10.858 00:03:36 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:10.858 00:03:36 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:10.858 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:10.858 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.858 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:10.858 00:03:36 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:10.858 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:10.858 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.858 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:10.858 00:03:36 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:10.858 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:10.858 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.858 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:10.858 00:03:36 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:10.858 00:03:36 -- rpc/rpc.sh@26 -- # jq length 00:05:10.858 00:03:36 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:10.858 00:05:10.858 real 0m0.246s 00:05:10.858 user 0m0.154s 00:05:10.858 sys 0m0.031s 00:05:10.858 00:03:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:10.858 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:10.858 ************************************ 00:05:10.858 END TEST rpc_integrity 00:05:10.858 ************************************ 00:05:10.858 00:03:36 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:10.858 00:03:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:10.858 00:03:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:10.858 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.168 ************************************ 00:05:11.168 START TEST rpc_plugins 00:05:11.168 ************************************ 00:05:11.168 00:03:36 -- common/autotest_common.sh@1114 -- # 
rpc_plugins 00:05:11.168 00:03:36 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:11.168 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.168 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.168 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.168 00:03:36 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:11.168 00:03:36 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:11.168 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.168 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.168 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.168 00:03:36 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:11.168 { 00:05:11.168 "name": "Malloc1", 00:05:11.168 "aliases": [ 00:05:11.168 "9c0bb24c-8168-43c3-8f27-4cb349b9c71d" 00:05:11.168 ], 00:05:11.168 "product_name": "Malloc disk", 00:05:11.168 "block_size": 4096, 00:05:11.168 "num_blocks": 256, 00:05:11.168 "uuid": "9c0bb24c-8168-43c3-8f27-4cb349b9c71d", 00:05:11.168 "assigned_rate_limits": { 00:05:11.168 "rw_ios_per_sec": 0, 00:05:11.168 "rw_mbytes_per_sec": 0, 00:05:11.168 "r_mbytes_per_sec": 0, 00:05:11.168 "w_mbytes_per_sec": 0 00:05:11.168 }, 00:05:11.168 "claimed": false, 00:05:11.168 "zoned": false, 00:05:11.168 "supported_io_types": { 00:05:11.168 "read": true, 00:05:11.168 "write": true, 00:05:11.168 "unmap": true, 00:05:11.168 "write_zeroes": true, 00:05:11.168 "flush": true, 00:05:11.168 "reset": true, 00:05:11.168 "compare": false, 00:05:11.169 "compare_and_write": false, 00:05:11.169 "abort": true, 00:05:11.169 "nvme_admin": false, 00:05:11.169 "nvme_io": false 00:05:11.169 }, 00:05:11.169 "memory_domains": [ 00:05:11.169 { 00:05:11.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:11.169 "dma_device_type": 2 00:05:11.169 } 00:05:11.169 ], 00:05:11.169 "driver_specific": {} 00:05:11.169 } 00:05:11.169 ]' 00:05:11.169 00:03:36 -- rpc/rpc.sh@32 -- # jq length 00:05:11.169 00:03:36 -- rpc/rpc.sh@32 -- # '[' 
1 == 1 ']' 00:05:11.169 00:03:36 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:11.169 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.169 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.169 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.169 00:03:36 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:11.169 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.169 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.169 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.169 00:03:36 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:11.169 00:03:36 -- rpc/rpc.sh@36 -- # jq length 00:05:11.169 00:03:36 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:11.169 00:05:11.169 real 0m0.131s 00:05:11.169 user 0m0.081s 00:05:11.169 sys 0m0.020s 00:05:11.169 00:03:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:11.169 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.169 ************************************ 00:05:11.169 END TEST rpc_plugins 00:05:11.169 ************************************ 00:05:11.169 00:03:36 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:11.169 00:03:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:11.169 00:03:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:11.169 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.169 ************************************ 00:05:11.169 START TEST rpc_trace_cmd_test 00:05:11.169 ************************************ 00:05:11.169 00:03:36 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:05:11.169 00:03:36 -- rpc/rpc.sh@40 -- # local info 00:05:11.169 00:03:36 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:11.169 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.169 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.169 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 
== 0 ]] 00:05:11.169 00:03:36 -- rpc/rpc.sh@42 -- # info='{ 00:05:11.169 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2697284", 00:05:11.169 "tpoint_group_mask": "0x8", 00:05:11.169 "iscsi_conn": { 00:05:11.169 "mask": "0x2", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "scsi": { 00:05:11.169 "mask": "0x4", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "bdev": { 00:05:11.169 "mask": "0x8", 00:05:11.169 "tpoint_mask": "0xffffffffffffffff" 00:05:11.169 }, 00:05:11.169 "nvmf_rdma": { 00:05:11.169 "mask": "0x10", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "nvmf_tcp": { 00:05:11.169 "mask": "0x20", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "ftl": { 00:05:11.169 "mask": "0x40", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "blobfs": { 00:05:11.169 "mask": "0x80", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "dsa": { 00:05:11.169 "mask": "0x200", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "thread": { 00:05:11.169 "mask": "0x400", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "nvme_pcie": { 00:05:11.169 "mask": "0x800", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "iaa": { 00:05:11.169 "mask": "0x1000", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "nvme_tcp": { 00:05:11.169 "mask": "0x2000", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 }, 00:05:11.169 "bdev_nvme": { 00:05:11.169 "mask": "0x4000", 00:05:11.169 "tpoint_mask": "0x0" 00:05:11.169 } 00:05:11.169 }' 00:05:11.169 00:03:36 -- rpc/rpc.sh@43 -- # jq length 00:05:11.169 00:03:36 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:11.169 00:03:36 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:11.169 00:03:36 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:11.169 00:03:36 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:11.491 00:03:36 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:11.491 00:03:36 -- rpc/rpc.sh@46 -- # jq 
'has("bdev")' 00:05:11.491 00:03:36 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:11.491 00:03:36 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:11.491 00:03:36 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:11.491 00:05:11.491 real 0m0.236s 00:05:11.491 user 0m0.188s 00:05:11.491 sys 0m0.040s 00:05:11.491 00:03:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:11.491 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.491 ************************************ 00:05:11.491 END TEST rpc_trace_cmd_test 00:05:11.491 ************************************ 00:05:11.491 00:03:36 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:11.491 00:03:36 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:11.491 00:03:36 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:11.491 00:03:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:11.491 00:03:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:11.491 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.491 ************************************ 00:05:11.491 START TEST rpc_daemon_integrity 00:05:11.491 ************************************ 00:05:11.491 00:03:36 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:11.491 00:03:36 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:11.491 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.491 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.491 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.491 00:03:36 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:11.491 00:03:36 -- rpc/rpc.sh@13 -- # jq length 00:05:11.491 00:03:36 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:11.491 00:03:36 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:11.491 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.491 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.491 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.491 
00:03:36 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:11.491 00:03:36 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:11.491 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.491 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.491 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.491 00:03:36 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:11.491 { 00:05:11.491 "name": "Malloc2", 00:05:11.491 "aliases": [ 00:05:11.491 "4e16fd5b-bb91-421e-9f64-93dae7b555a0" 00:05:11.491 ], 00:05:11.491 "product_name": "Malloc disk", 00:05:11.491 "block_size": 512, 00:05:11.491 "num_blocks": 16384, 00:05:11.491 "uuid": "4e16fd5b-bb91-421e-9f64-93dae7b555a0", 00:05:11.491 "assigned_rate_limits": { 00:05:11.491 "rw_ios_per_sec": 0, 00:05:11.491 "rw_mbytes_per_sec": 0, 00:05:11.491 "r_mbytes_per_sec": 0, 00:05:11.491 "w_mbytes_per_sec": 0 00:05:11.491 }, 00:05:11.491 "claimed": false, 00:05:11.491 "zoned": false, 00:05:11.491 "supported_io_types": { 00:05:11.491 "read": true, 00:05:11.491 "write": true, 00:05:11.491 "unmap": true, 00:05:11.491 "write_zeroes": true, 00:05:11.491 "flush": true, 00:05:11.491 "reset": true, 00:05:11.491 "compare": false, 00:05:11.491 "compare_and_write": false, 00:05:11.491 "abort": true, 00:05:11.491 "nvme_admin": false, 00:05:11.491 "nvme_io": false 00:05:11.491 }, 00:05:11.491 "memory_domains": [ 00:05:11.491 { 00:05:11.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:11.491 "dma_device_type": 2 00:05:11.491 } 00:05:11.491 ], 00:05:11.491 "driver_specific": {} 00:05:11.491 } 00:05:11.491 ]' 00:05:11.491 00:03:36 -- rpc/rpc.sh@17 -- # jq length 00:05:11.491 00:03:36 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:11.491 00:03:36 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:11.491 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.491 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.491 [2024-11-30 00:03:36.973782] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:11.491 [2024-11-30 00:03:36.973817] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:11.491 [2024-11-30 00:03:36.973843] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4e7c980 00:05:11.491 [2024-11-30 00:03:36.973858] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:11.491 [2024-11-30 00:03:36.975099] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:11.491 [2024-11-30 00:03:36.975124] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:11.491 Passthru0 00:05:11.491 00:03:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.491 00:03:36 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:11.491 00:03:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.491 00:03:36 -- common/autotest_common.sh@10 -- # set +x 00:05:11.491 00:03:37 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.491 00:03:37 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:11.491 { 00:05:11.491 "name": "Malloc2", 00:05:11.491 "aliases": [ 00:05:11.491 "4e16fd5b-bb91-421e-9f64-93dae7b555a0" 00:05:11.491 ], 00:05:11.491 "product_name": "Malloc disk", 00:05:11.491 "block_size": 512, 00:05:11.491 "num_blocks": 16384, 00:05:11.491 "uuid": "4e16fd5b-bb91-421e-9f64-93dae7b555a0", 00:05:11.491 "assigned_rate_limits": { 00:05:11.491 "rw_ios_per_sec": 0, 00:05:11.491 "rw_mbytes_per_sec": 0, 00:05:11.491 "r_mbytes_per_sec": 0, 00:05:11.491 "w_mbytes_per_sec": 0 00:05:11.491 }, 00:05:11.491 "claimed": true, 00:05:11.491 "claim_type": "exclusive_write", 00:05:11.491 "zoned": false, 00:05:11.491 "supported_io_types": { 00:05:11.491 "read": true, 00:05:11.491 "write": true, 00:05:11.491 "unmap": true, 00:05:11.491 "write_zeroes": true, 00:05:11.491 "flush": true, 00:05:11.491 "reset": true, 00:05:11.491 "compare": false, 00:05:11.491 "compare_and_write": false, 
00:05:11.491 "abort": true, 00:05:11.491 "nvme_admin": false, 00:05:11.491 "nvme_io": false 00:05:11.491 }, 00:05:11.491 "memory_domains": [ 00:05:11.491 { 00:05:11.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:11.491 "dma_device_type": 2 00:05:11.491 } 00:05:11.491 ], 00:05:11.491 "driver_specific": {} 00:05:11.491 }, 00:05:11.491 { 00:05:11.491 "name": "Passthru0", 00:05:11.491 "aliases": [ 00:05:11.491 "1dbf29f4-4d0c-5e8c-9613-cf7e6f6d7edb" 00:05:11.491 ], 00:05:11.491 "product_name": "passthru", 00:05:11.491 "block_size": 512, 00:05:11.491 "num_blocks": 16384, 00:05:11.491 "uuid": "1dbf29f4-4d0c-5e8c-9613-cf7e6f6d7edb", 00:05:11.491 "assigned_rate_limits": { 00:05:11.491 "rw_ios_per_sec": 0, 00:05:11.491 "rw_mbytes_per_sec": 0, 00:05:11.491 "r_mbytes_per_sec": 0, 00:05:11.491 "w_mbytes_per_sec": 0 00:05:11.491 }, 00:05:11.491 "claimed": false, 00:05:11.491 "zoned": false, 00:05:11.491 "supported_io_types": { 00:05:11.491 "read": true, 00:05:11.491 "write": true, 00:05:11.491 "unmap": true, 00:05:11.491 "write_zeroes": true, 00:05:11.491 "flush": true, 00:05:11.491 "reset": true, 00:05:11.491 "compare": false, 00:05:11.491 "compare_and_write": false, 00:05:11.491 "abort": true, 00:05:11.491 "nvme_admin": false, 00:05:11.491 "nvme_io": false 00:05:11.491 }, 00:05:11.491 "memory_domains": [ 00:05:11.491 { 00:05:11.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:11.491 "dma_device_type": 2 00:05:11.491 } 00:05:11.491 ], 00:05:11.491 "driver_specific": { 00:05:11.491 "passthru": { 00:05:11.491 "name": "Passthru0", 00:05:11.491 "base_bdev_name": "Malloc2" 00:05:11.491 } 00:05:11.491 } 00:05:11.491 } 00:05:11.491 ]' 00:05:11.491 00:03:37 -- rpc/rpc.sh@21 -- # jq length 00:05:11.491 00:03:37 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:11.491 00:03:37 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:11.491 00:03:37 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.491 00:03:37 -- common/autotest_common.sh@10 -- # set +x 
00:05:11.750 00:03:37 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.750 00:03:37 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:11.750 00:03:37 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.750 00:03:37 -- common/autotest_common.sh@10 -- # set +x 00:05:11.750 00:03:37 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.750 00:03:37 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:11.750 00:03:37 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:11.750 00:03:37 -- common/autotest_common.sh@10 -- # set +x 00:05:11.750 00:03:37 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:11.750 00:03:37 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:11.750 00:03:37 -- rpc/rpc.sh@26 -- # jq length 00:05:11.750 00:03:37 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:11.750 00:05:11.750 real 0m0.251s 00:05:11.750 user 0m0.160s 00:05:11.750 sys 0m0.041s 00:05:11.750 00:03:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:11.750 00:03:37 -- common/autotest_common.sh@10 -- # set +x 00:05:11.750 ************************************ 00:05:11.750 END TEST rpc_daemon_integrity 00:05:11.750 ************************************ 00:05:11.750 00:03:37 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:11.750 00:03:37 -- rpc/rpc.sh@84 -- # killprocess 2697284 00:05:11.750 00:03:37 -- common/autotest_common.sh@936 -- # '[' -z 2697284 ']' 00:05:11.750 00:03:37 -- common/autotest_common.sh@940 -- # kill -0 2697284 00:05:11.750 00:03:37 -- common/autotest_common.sh@941 -- # uname 00:05:11.750 00:03:37 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:11.750 00:03:37 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2697284 00:05:11.750 00:03:37 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:11.750 00:03:37 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:11.750 00:03:37 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2697284' 
00:05:11.750 killing process with pid 2697284 00:05:11.750 00:03:37 -- common/autotest_common.sh@955 -- # kill 2697284 00:05:11.750 00:03:37 -- common/autotest_common.sh@960 -- # wait 2697284 00:05:12.009 00:05:12.009 real 0m2.422s 00:05:12.009 user 0m3.001s 00:05:12.009 sys 0m0.747s 00:05:12.009 00:03:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:12.009 00:03:37 -- common/autotest_common.sh@10 -- # set +x 00:05:12.009 ************************************ 00:05:12.009 END TEST rpc 00:05:12.009 ************************************ 00:05:12.009 00:03:37 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:12.009 00:03:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:12.009 00:03:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:12.009 00:03:37 -- common/autotest_common.sh@10 -- # set +x 00:05:12.009 ************************************ 00:05:12.009 START TEST rpc_client 00:05:12.009 ************************************ 00:05:12.009 00:03:37 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:12.268 * Looking for test storage... 
00:05:12.268 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:12.268 00:03:37 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:12.268 00:03:37 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:12.268 00:03:37 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:12.268 00:03:37 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:12.268 00:03:37 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:12.268 00:03:37 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:12.268 00:03:37 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:12.268 00:03:37 -- scripts/common.sh@335 -- # IFS=.-: 00:05:12.269 00:03:37 -- scripts/common.sh@335 -- # read -ra ver1 00:05:12.269 00:03:37 -- scripts/common.sh@336 -- # IFS=.-: 00:05:12.269 00:03:37 -- scripts/common.sh@336 -- # read -ra ver2 00:05:12.269 00:03:37 -- scripts/common.sh@337 -- # local 'op=<' 00:05:12.269 00:03:37 -- scripts/common.sh@339 -- # ver1_l=2 00:05:12.269 00:03:37 -- scripts/common.sh@340 -- # ver2_l=1 00:05:12.269 00:03:37 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:12.269 00:03:37 -- scripts/common.sh@343 -- # case "$op" in 00:05:12.269 00:03:37 -- scripts/common.sh@344 -- # : 1 00:05:12.269 00:03:37 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:12.269 00:03:37 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:12.269 00:03:37 -- scripts/common.sh@364 -- # decimal 1 00:05:12.269 00:03:37 -- scripts/common.sh@352 -- # local d=1 00:05:12.269 00:03:37 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:12.269 00:03:37 -- scripts/common.sh@354 -- # echo 1 00:05:12.269 00:03:37 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:12.269 00:03:37 -- scripts/common.sh@365 -- # decimal 2 00:05:12.269 00:03:37 -- scripts/common.sh@352 -- # local d=2 00:05:12.269 00:03:37 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:12.269 00:03:37 -- scripts/common.sh@354 -- # echo 2 00:05:12.269 00:03:37 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:12.269 00:03:37 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:12.269 00:03:37 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:12.269 00:03:37 -- scripts/common.sh@367 -- # return 0 00:05:12.269 00:03:37 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:12.269 00:03:37 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:12.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:12.269 --rc genhtml_branch_coverage=1 00:05:12.269 --rc genhtml_function_coverage=1 00:05:12.269 --rc genhtml_legend=1 00:05:12.269 --rc geninfo_all_blocks=1 00:05:12.269 --rc geninfo_unexecuted_blocks=1 00:05:12.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:12.269 ' 00:05:12.269 00:03:37 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:12.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:12.269 --rc genhtml_branch_coverage=1 00:05:12.269 --rc genhtml_function_coverage=1 00:05:12.269 --rc genhtml_legend=1 00:05:12.269 --rc geninfo_all_blocks=1 00:05:12.269 --rc geninfo_unexecuted_blocks=1 00:05:12.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:12.269 ' 00:05:12.269 00:03:37 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:12.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.269 --rc genhtml_branch_coverage=1
00:05:12.269 --rc genhtml_function_coverage=1
00:05:12.269 --rc genhtml_legend=1
00:05:12.269 --rc geninfo_all_blocks=1
00:05:12.269 --rc geninfo_unexecuted_blocks=1
00:05:12.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.269 '
00:05:12.269 00:03:37 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:12.269 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.269 --rc genhtml_branch_coverage=1
00:05:12.269 --rc genhtml_function_coverage=1
00:05:12.269 --rc genhtml_legend=1
00:05:12.269 --rc geninfo_all_blocks=1
00:05:12.269 --rc geninfo_unexecuted_blocks=1
00:05:12.269 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.269 '
00:05:12.269 00:03:37 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test
00:05:12.269 OK
00:05:12.269 00:03:37 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT
00:05:12.269
00:05:12.269 real 0m0.211s
00:05:12.269 user 0m0.112s
00:05:12.269 sys 0m0.115s
00:05:12.269 00:03:37 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:12.269 00:03:37 -- common/autotest_common.sh@10 -- # set +x
00:05:12.269 ************************************
00:05:12.269 END TEST rpc_client
00:05:12.269 ************************************
00:05:12.269 00:03:37 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh
00:05:12.269 00:03:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:12.269 00:03:37 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:12.269 00:03:37 -- common/autotest_common.sh@10 -- # set +x
00:05:12.269 ************************************
00:05:12.269 START TEST json_config
00:05:12.269 ************************************
00:05:12.528 00:03:37 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh
00:05:12.528 00:03:37 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:12.528 00:03:37 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:12.528 00:03:37 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:12.528 00:03:37 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:12.528 00:03:37 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:12.528 00:03:37 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:12.528 00:03:37 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:12.528 00:03:37 -- scripts/common.sh@335 -- # IFS=.-:
00:05:12.528 00:03:37 -- scripts/common.sh@335 -- # read -ra ver1
00:05:12.528 00:03:37 -- scripts/common.sh@336 -- # IFS=.-:
00:05:12.528 00:03:37 -- scripts/common.sh@336 -- # read -ra ver2
00:05:12.528 00:03:37 -- scripts/common.sh@337 -- # local 'op=<'
00:05:12.528 00:03:37 -- scripts/common.sh@339 -- # ver1_l=2
00:05:12.528 00:03:37 -- scripts/common.sh@340 -- # ver2_l=1
00:05:12.528 00:03:37 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:12.528 00:03:37 -- scripts/common.sh@343 -- # case "$op" in
00:05:12.528 00:03:37 -- scripts/common.sh@344 -- # : 1
00:05:12.528 00:03:37 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:12.528 00:03:37 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:12.528 00:03:37 -- scripts/common.sh@364 -- # decimal 1
00:05:12.528 00:03:37 -- scripts/common.sh@352 -- # local d=1
00:05:12.529 00:03:37 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:12.529 00:03:37 -- scripts/common.sh@354 -- # echo 1
00:05:12.529 00:03:37 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:12.529 00:03:37 -- scripts/common.sh@365 -- # decimal 2
00:05:12.529 00:03:37 -- scripts/common.sh@352 -- # local d=2
00:05:12.529 00:03:37 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:12.529 00:03:37 -- scripts/common.sh@354 -- # echo 2
00:05:12.529 00:03:37 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:12.529 00:03:37 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:12.529 00:03:37 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:12.529 00:03:37 -- scripts/common.sh@367 -- # return 0
00:05:12.529 00:03:37 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:12.529 00:03:37 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:12.529 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.529 --rc genhtml_branch_coverage=1
00:05:12.529 --rc genhtml_function_coverage=1
00:05:12.529 --rc genhtml_legend=1
00:05:12.529 --rc geninfo_all_blocks=1
00:05:12.529 --rc geninfo_unexecuted_blocks=1
00:05:12.529 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.529 '
00:05:12.529 00:03:37 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:12.529 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.529 --rc genhtml_branch_coverage=1
00:05:12.529 --rc genhtml_function_coverage=1
00:05:12.529 --rc genhtml_legend=1
00:05:12.529 --rc geninfo_all_blocks=1
00:05:12.529 --rc geninfo_unexecuted_blocks=1
00:05:12.529 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.529 '
00:05:12.529 00:03:37 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:12.529 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.529 --rc genhtml_branch_coverage=1
00:05:12.529 --rc genhtml_function_coverage=1
00:05:12.529 --rc genhtml_legend=1
00:05:12.529 --rc geninfo_all_blocks=1
00:05:12.529 --rc geninfo_unexecuted_blocks=1
00:05:12.529 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.529 '
00:05:12.529 00:03:37 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:12.529 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.529 --rc genhtml_branch_coverage=1
00:05:12.529 --rc genhtml_function_coverage=1
00:05:12.529 --rc genhtml_legend=1
00:05:12.529 --rc geninfo_all_blocks=1
00:05:12.529 --rc geninfo_unexecuted_blocks=1
00:05:12.529 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.529 '
00:05:12.529 00:03:37 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh
00:05:12.529 00:03:37 -- nvmf/common.sh@7 -- # uname -s
00:05:12.529 00:03:37 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:05:12.529 00:03:37 -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:05:12.529 00:03:37 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:05:12.529 00:03:37 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:05:12.529 00:03:37 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:05:12.529 00:03:37 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:05:12.529 00:03:37 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:05:12.529 00:03:37 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:05:12.529 00:03:37 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:05:12.529 00:03:37 -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:05:12.529 00:03:37 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562
00:05:12.529 00:03:37 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562
00:05:12.529 00:03:37 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:05:12.529 00:03:37 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:05:12.529 00:03:37 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:05:12.529 00:03:37 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:05:12.529 00:03:38 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]]
00:05:12.529 00:03:38 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:05:12.529 00:03:38 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:05:12.529 00:03:38 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:12.529 00:03:38 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:12.529 00:03:38 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:12.529 00:03:38 -- paths/export.sh@5 -- # export PATH
00:05:12.529 00:03:38 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:12.529 00:03:38 -- nvmf/common.sh@46 -- # : 0
00:05:12.529 00:03:38 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID
00:05:12.529 00:03:38 -- nvmf/common.sh@48 -- # build_nvmf_app_args
00:05:12.529 00:03:38 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']'
00:05:12.529 00:03:38 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:05:12.529 00:03:38 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:05:12.529 00:03:38 -- nvmf/common.sh@32 -- # '[' -n '' ']'
00:05:12.529 00:03:38 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']'
00:05:12.529 00:03:38 -- nvmf/common.sh@50 -- # have_pci_nics=0
00:05:12.529 00:03:38 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]]
00:05:12.529 00:03:38 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]]
00:05:12.529 00:03:38 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]]
00:05:12.529 00:03:38 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 ))
00:05:12.529 00:03:38 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests'
00:05:12.529 WARNING: No tests are enabled so not running JSON configuration tests
00:05:12.529 00:03:38 -- json_config/json_config.sh@27 -- # exit 0
00:05:12.529
00:05:12.529 real 0m0.188s
00:05:12.529 user 0m0.121s
00:05:12.529 sys 0m0.076s
00:05:12.529 00:03:38 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:12.529 00:03:38 -- common/autotest_common.sh@10 -- # set +x
00:05:12.529 ************************************
00:05:12.529 END TEST json_config
00:05:12.529 ************************************
00:05:12.529 00:03:38 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh
00:05:12.529 00:03:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:12.529 00:03:38 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:12.529 00:03:38 -- common/autotest_common.sh@10 -- # set +x
00:05:12.529 ************************************
00:05:12.529 START TEST json_config_extra_key
00:05:12.529 ************************************
00:05:12.529 00:03:38 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh
00:05:12.788 00:03:38 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:12.788 00:03:38 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:12.788 00:03:38 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:12.788 00:03:38 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:12.788 00:03:38 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:12.788 00:03:38 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:12.788 00:03:38 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:12.788 00:03:38 -- scripts/common.sh@335 -- # IFS=.-:
00:05:12.788 00:03:38 -- scripts/common.sh@335 -- # read -ra ver1
00:05:12.788 00:03:38 -- scripts/common.sh@336 -- # IFS=.-:
00:05:12.788 00:03:38 -- scripts/common.sh@336 -- # read -ra ver2
00:05:12.788 00:03:38 -- scripts/common.sh@337 -- # local 'op=<'
00:05:12.788 00:03:38 -- scripts/common.sh@339 -- # ver1_l=2
00:05:12.788 00:03:38 -- scripts/common.sh@340 -- # ver2_l=1
00:05:12.788 00:03:38 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:12.788 00:03:38 -- scripts/common.sh@343 -- # case "$op" in
00:05:12.788 00:03:38 -- scripts/common.sh@344 -- # : 1
00:05:12.788 00:03:38 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:12.788 00:03:38 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:12.788 00:03:38 -- scripts/common.sh@364 -- # decimal 1
00:05:12.788 00:03:38 -- scripts/common.sh@352 -- # local d=1
00:05:12.788 00:03:38 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:12.788 00:03:38 -- scripts/common.sh@354 -- # echo 1
00:05:12.788 00:03:38 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:12.788 00:03:38 -- scripts/common.sh@365 -- # decimal 2
00:05:12.788 00:03:38 -- scripts/common.sh@352 -- # local d=2
00:05:12.788 00:03:38 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:12.788 00:03:38 -- scripts/common.sh@354 -- # echo 2
00:05:12.788 00:03:38 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:12.788 00:03:38 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:12.788 00:03:38 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:12.788 00:03:38 -- scripts/common.sh@367 -- # return 0
00:05:12.788 00:03:38 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:12.788 00:03:38 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:12.788 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.788 --rc genhtml_branch_coverage=1
00:05:12.788 --rc genhtml_function_coverage=1
00:05:12.788 --rc genhtml_legend=1
00:05:12.788 --rc geninfo_all_blocks=1
00:05:12.788 --rc geninfo_unexecuted_blocks=1
00:05:12.788 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.788 '
00:05:12.788 00:03:38 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:12.788 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.788 --rc genhtml_branch_coverage=1
00:05:12.788 --rc genhtml_function_coverage=1
00:05:12.788 --rc genhtml_legend=1
00:05:12.788 --rc geninfo_all_blocks=1
00:05:12.788 --rc geninfo_unexecuted_blocks=1
00:05:12.788 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.788 '
00:05:12.788 00:03:38 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:12.788 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.788 --rc genhtml_branch_coverage=1
00:05:12.788 --rc genhtml_function_coverage=1
00:05:12.788 --rc genhtml_legend=1
00:05:12.788 --rc geninfo_all_blocks=1
00:05:12.788 --rc geninfo_unexecuted_blocks=1
00:05:12.788 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.788 '
00:05:12.788 00:03:38 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:12.788 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:12.788 --rc genhtml_branch_coverage=1
00:05:12.788 --rc genhtml_function_coverage=1
00:05:12.788 --rc genhtml_legend=1
00:05:12.788 --rc geninfo_all_blocks=1
00:05:12.788 --rc geninfo_unexecuted_blocks=1
00:05:12.788 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:12.788 '
00:05:12.788 00:03:38 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh
00:05:12.788 00:03:38 -- nvmf/common.sh@7 -- # uname -s
00:05:12.788 00:03:38 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:05:12.788 00:03:38 -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:05:12.788 00:03:38 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:05:12.788 00:03:38 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:05:12.788 00:03:38 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:05:12.788 00:03:38 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:05:12.788 00:03:38 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:05:12.788 00:03:38 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:05:12.788 00:03:38 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:05:12.788 00:03:38 -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:05:12.788 00:03:38 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562
00:05:12.788 00:03:38 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562
00:05:12.788 00:03:38 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:05:12.788 00:03:38 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:05:12.788 00:03:38 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:05:12.788 00:03:38 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:05:12.788 00:03:38 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]]
00:05:12.788 00:03:38 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:05:12.788 00:03:38 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:05:12.788 00:03:38 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:12.788 00:03:38 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:12.788 00:03:38 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:12.788 00:03:38 -- paths/export.sh@5 -- # export PATH
00:05:12.788 00:03:38 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:12.789 00:03:38 -- nvmf/common.sh@46 -- # : 0
00:05:12.789 00:03:38 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID
00:05:12.789 00:03:38 -- nvmf/common.sh@48 -- # build_nvmf_app_args
00:05:12.789 00:03:38 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']'
00:05:12.789 00:03:38 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:05:12.789 00:03:38 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:05:12.789 00:03:38 -- nvmf/common.sh@32 -- # '[' -n '' ']'
00:05:12.789 00:03:38 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']'
00:05:12.789 00:03:38 -- nvmf/common.sh@50 -- # have_pci_nics=0
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='')
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock')
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024')
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json')
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...'
00:05:12.789 INFO: launching applications...
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@24 -- # local app=target
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@25 -- # shift
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]]
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]]
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=2698093
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...'
00:05:12.789 Waiting for target to run...
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 2698093 /var/tmp/spdk_tgt.sock
00:05:12.789 00:03:38 -- common/autotest_common.sh@829 -- # '[' -z 2698093 ']'
00:05:12.789 00:03:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock
00:05:12.789 00:03:38 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json
00:05:12.789 00:03:38 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:12.789 00:03:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...'
00:05:12.789 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...
00:05:12.789 00:03:38 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:12.789 00:03:38 -- common/autotest_common.sh@10 -- # set +x
00:05:12.789 [2024-11-30 00:03:38.276521] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:12.789 [2024-11-30 00:03:38.276612] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2698093 ]
00:05:12.789 EAL: No free 2048 kB hugepages reported on node 1
00:05:13.356 [2024-11-30 00:03:38.712629] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:13.356 [2024-11-30 00:03:38.802150] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:05:13.356 [2024-11-30 00:03:38.802279] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:13.615 00:03:39 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:13.615 00:03:39 -- common/autotest_common.sh@862 -- # return 0
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@35 -- # echo ''
00:05:13.615
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...'
00:05:13.615 INFO: shutting down applications...
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@40 -- # local app=target
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]]
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 2698093 ]]
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 2698093
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 ))
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 ))
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2698093
00:05:13.615 00:03:39 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5
00:05:14.182 00:03:39 -- json_config/json_config_extra_key.sh@49 -- # (( i++ ))
00:05:14.182 00:03:39 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 ))
00:05:14.182 00:03:39 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2698093
00:05:14.182 00:03:39 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]=
00:05:14.182 00:03:39 -- json_config/json_config_extra_key.sh@52 -- # break
00:05:14.182 00:03:39 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]]
00:05:14.182 00:03:39 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done'
00:05:14.182 SPDK target shutdown done
00:05:14.182 00:03:39 -- json_config/json_config_extra_key.sh@82 -- # echo Success
00:05:14.182 Success
00:05:14.182
00:05:14.182 real 0m1.557s
00:05:14.182 user 0m1.166s
00:05:14.182 sys 0m0.555s
00:05:14.182 00:03:39 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:14.182 00:03:39 -- common/autotest_common.sh@10 -- # set +x
00:05:14.182 ************************************
00:05:14.182 END TEST json_config_extra_key
00:05:14.182 ************************************
00:05:14.182 00:03:39 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh
00:05:14.182 00:03:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:14.182 00:03:39 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:14.182 00:03:39 -- common/autotest_common.sh@10 -- # set +x
00:05:14.182 ************************************
00:05:14.182 START TEST alias_rpc
00:05:14.182 ************************************
00:05:14.182 00:03:39 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh
00:05:14.441 * Looking for test storage...
00:05:14.441 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc
00:05:14.441 00:03:39 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:14.441 00:03:39 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:14.441 00:03:39 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:14.441 00:03:39 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:14.441 00:03:39 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:14.441 00:03:39 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:14.441 00:03:39 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:14.441 00:03:39 -- scripts/common.sh@335 -- # IFS=.-:
00:05:14.441 00:03:39 -- scripts/common.sh@335 -- # read -ra ver1
00:05:14.441 00:03:39 -- scripts/common.sh@336 -- # IFS=.-:
00:05:14.441 00:03:39 -- scripts/common.sh@336 -- # read -ra ver2
00:05:14.441 00:03:39 -- scripts/common.sh@337 -- # local 'op=<'
00:05:14.441 00:03:39 -- scripts/common.sh@339 -- # ver1_l=2
00:05:14.441 00:03:39 -- scripts/common.sh@340 -- # ver2_l=1
00:05:14.441 00:03:39 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:14.441 00:03:39 -- scripts/common.sh@343 -- # case "$op" in
00:05:14.441 00:03:39 -- scripts/common.sh@344 -- # : 1
00:05:14.441 00:03:39 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:14.441 00:03:39 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:14.441 00:03:39 -- scripts/common.sh@364 -- # decimal 1
00:05:14.441 00:03:39 -- scripts/common.sh@352 -- # local d=1
00:05:14.441 00:03:39 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:14.441 00:03:39 -- scripts/common.sh@354 -- # echo 1
00:05:14.441 00:03:39 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:14.441 00:03:39 -- scripts/common.sh@365 -- # decimal 2
00:05:14.441 00:03:39 -- scripts/common.sh@352 -- # local d=2
00:05:14.441 00:03:39 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:14.441 00:03:39 -- scripts/common.sh@354 -- # echo 2
00:05:14.441 00:03:39 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:14.441 00:03:39 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:14.441 00:03:39 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:14.441 00:03:39 -- scripts/common.sh@367 -- # return 0
00:05:14.441 00:03:39 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:14.441 00:03:39 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:14.441 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:14.441 --rc genhtml_branch_coverage=1
00:05:14.441 --rc genhtml_function_coverage=1
00:05:14.441 --rc genhtml_legend=1
00:05:14.441 --rc geninfo_all_blocks=1
00:05:14.441 --rc geninfo_unexecuted_blocks=1
00:05:14.441 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:14.441 '
00:05:14.441 00:03:39 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:14.441 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:14.441 --rc genhtml_branch_coverage=1
00:05:14.441 --rc genhtml_function_coverage=1
00:05:14.441 --rc genhtml_legend=1
00:05:14.441 --rc geninfo_all_blocks=1
00:05:14.441 --rc geninfo_unexecuted_blocks=1
00:05:14.441 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:14.441 '
00:05:14.441 00:03:39 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:14.441 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:14.441 --rc genhtml_branch_coverage=1
00:05:14.441 --rc genhtml_function_coverage=1
00:05:14.441 --rc genhtml_legend=1
00:05:14.441 --rc geninfo_all_blocks=1
00:05:14.441 --rc geninfo_unexecuted_blocks=1
00:05:14.441 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:14.441 '
00:05:14.441 00:03:39 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:14.441 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:14.441 --rc genhtml_branch_coverage=1
00:05:14.441 --rc genhtml_function_coverage=1
00:05:14.441 --rc genhtml_legend=1
00:05:14.441 --rc geninfo_all_blocks=1
00:05:14.441 --rc geninfo_unexecuted_blocks=1
00:05:14.441 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:14.441 '
00:05:14.441 00:03:39 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:05:14.441 00:03:39 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2698419
00:05:14.442 00:03:39 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:05:14.442 00:03:39 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2698419
00:05:14.442 00:03:39 -- common/autotest_common.sh@829 -- # '[' -z 2698419 ']'
00:05:14.442 00:03:39 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:14.442 00:03:39 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:14.442 00:03:39 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:05:14.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:05:14.442 00:03:39 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:14.442 00:03:39 -- common/autotest_common.sh@10 -- # set +x
00:05:14.442 [2024-11-30 00:03:39.876685] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:14.442 [2024-11-30 00:03:39.876773] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2698419 ]
00:05:14.442 EAL: No free 2048 kB hugepages reported on node 1
00:05:14.442 [2024-11-30 00:03:39.943961] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:14.701 [2024-11-30 00:03:40.020224] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:05:14.701 [2024-11-30 00:03:40.020372] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:15.268 00:03:40 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:15.268 00:03:40 -- common/autotest_common.sh@862 -- # return 0
00:05:15.268 00:03:40 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i
00:05:15.527 00:03:40 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2698419
00:05:15.527 00:03:40 -- common/autotest_common.sh@936 -- # '[' -z 2698419 ']'
00:05:15.527 00:03:40 -- common/autotest_common.sh@940 -- # kill -0 2698419
00:05:15.527 00:03:40 -- common/autotest_common.sh@941 -- # uname
00:05:15.527 00:03:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:15.527 00:03:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2698419
00:05:15.527 00:03:40 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:05:15.527 00:03:40 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:05:15.527 00:03:40 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2698419'
00:05:15.527 killing process with pid 2698419
00:05:15.527 00:03:40 -- common/autotest_common.sh@955 -- # kill 2698419
00:05:15.527 00:03:40 -- common/autotest_common.sh@960 -- # wait 2698419
00:05:15.787
00:05:15.787 real 0m1.612s
00:05:15.787 user 0m1.713s
00:05:15.787 sys 0m0.479s
00:05:15.787 00:03:41 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:15.787 00:03:41 -- common/autotest_common.sh@10 -- # set +x
00:05:15.787 ************************************
00:05:15.787 END TEST alias_rpc
00:05:15.787 ************************************
00:05:15.787 00:03:41 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]]
00:05:15.787 00:03:41 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh
00:05:15.787 00:03:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:15.787 00:03:41 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:15.787 00:03:41 -- common/autotest_common.sh@10 -- # set +x
00:05:15.787 ************************************
00:05:15.787 START TEST spdkcli_tcp
00:05:15.787 ************************************
00:05:15.787 00:03:41 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh
00:05:16.047 * Looking for test storage...
00:05:16.047 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli
00:05:16.047 00:03:41 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:16.047 00:03:41 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:16.047 00:03:41 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:16.047 00:03:41 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:16.047 00:03:41 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:16.047 00:03:41 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:16.047 00:03:41 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:16.047 00:03:41 -- scripts/common.sh@335 -- # IFS=.-:
00:05:16.047 00:03:41 -- scripts/common.sh@335 -- # read -ra ver1
00:05:16.047 00:03:41 -- scripts/common.sh@336 -- # IFS=.-:
00:05:16.047 00:03:41 -- scripts/common.sh@336 -- # read -ra ver2
00:05:16.047 00:03:41 -- scripts/common.sh@337 -- # local 'op=<'
00:05:16.047 00:03:41 -- scripts/common.sh@339 -- # ver1_l=2
00:05:16.047 00:03:41 -- scripts/common.sh@340 -- # ver2_l=1
00:05:16.047 00:03:41 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:16.047 00:03:41 -- scripts/common.sh@343 -- # case "$op" in
00:05:16.047 00:03:41 -- scripts/common.sh@344 -- # : 1
00:05:16.047 00:03:41 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:16.047 00:03:41 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ?
ver1_l : ver2_l) )) 00:05:16.047 00:03:41 -- scripts/common.sh@364 -- # decimal 1 00:05:16.047 00:03:41 -- scripts/common.sh@352 -- # local d=1 00:05:16.047 00:03:41 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:16.047 00:03:41 -- scripts/common.sh@354 -- # echo 1 00:05:16.047 00:03:41 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:16.047 00:03:41 -- scripts/common.sh@365 -- # decimal 2 00:05:16.047 00:03:41 -- scripts/common.sh@352 -- # local d=2 00:05:16.047 00:03:41 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:16.047 00:03:41 -- scripts/common.sh@354 -- # echo 2 00:05:16.047 00:03:41 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:16.047 00:03:41 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:16.047 00:03:41 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:16.047 00:03:41 -- scripts/common.sh@367 -- # return 0 00:05:16.047 00:03:41 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:16.047 00:03:41 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:16.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.047 --rc genhtml_branch_coverage=1 00:05:16.047 --rc genhtml_function_coverage=1 00:05:16.047 --rc genhtml_legend=1 00:05:16.047 --rc geninfo_all_blocks=1 00:05:16.047 --rc geninfo_unexecuted_blocks=1 00:05:16.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:16.047 ' 00:05:16.047 00:03:41 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:16.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.047 --rc genhtml_branch_coverage=1 00:05:16.047 --rc genhtml_function_coverage=1 00:05:16.047 --rc genhtml_legend=1 00:05:16.047 --rc geninfo_all_blocks=1 00:05:16.047 --rc geninfo_unexecuted_blocks=1 00:05:16.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:16.047 ' 00:05:16.047 00:03:41 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:16.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.047 --rc genhtml_branch_coverage=1 00:05:16.047 --rc genhtml_function_coverage=1 00:05:16.047 --rc genhtml_legend=1 00:05:16.047 --rc geninfo_all_blocks=1 00:05:16.047 --rc geninfo_unexecuted_blocks=1 00:05:16.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:16.047 ' 00:05:16.047 00:03:41 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:16.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.047 --rc genhtml_branch_coverage=1 00:05:16.047 --rc genhtml_function_coverage=1 00:05:16.047 --rc genhtml_legend=1 00:05:16.047 --rc geninfo_all_blocks=1 00:05:16.047 --rc geninfo_unexecuted_blocks=1 00:05:16.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:16.047 ' 00:05:16.047 00:03:41 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:16.047 00:03:41 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:16.047 00:03:41 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:16.047 00:03:41 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:16.047 00:03:41 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:16.047 00:03:41 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:16.047 00:03:41 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:16.047 00:03:41 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:16.047 00:03:41 -- common/autotest_common.sh@10 -- # set +x 00:05:16.047 00:03:41 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2698754 00:05:16.047 00:03:41 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 
00:05:16.047 00:03:41 -- spdkcli/tcp.sh@27 -- # waitforlisten 2698754 00:05:16.047 00:03:41 -- common/autotest_common.sh@829 -- # '[' -z 2698754 ']' 00:05:16.047 00:03:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:16.047 00:03:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:16.047 00:03:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:16.047 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:16.047 00:03:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:16.047 00:03:41 -- common/autotest_common.sh@10 -- # set +x 00:05:16.047 [2024-11-30 00:03:41.520106] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:16.047 [2024-11-30 00:03:41.520173] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2698754 ] 00:05:16.048 EAL: No free 2048 kB hugepages reported on node 1 00:05:16.048 [2024-11-30 00:03:41.588263] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:16.307 [2024-11-30 00:03:41.663958] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:16.307 [2024-11-30 00:03:41.664138] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:16.307 [2024-11-30 00:03:41.664141] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:16.875 00:03:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:16.875 00:03:42 -- common/autotest_common.sh@862 -- # return 0 00:05:16.875 00:03:42 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:16.875 00:03:42 -- spdkcli/tcp.sh@31 -- # socat_pid=2699015 00:05:16.875 00:03:42 -- spdkcli/tcp.sh@33 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:17.135 [ 00:05:17.135 "spdk_get_version", 00:05:17.135 "rpc_get_methods", 00:05:17.135 "trace_get_info", 00:05:17.135 "trace_get_tpoint_group_mask", 00:05:17.135 "trace_disable_tpoint_group", 00:05:17.135 "trace_enable_tpoint_group", 00:05:17.135 "trace_clear_tpoint_mask", 00:05:17.135 "trace_set_tpoint_mask", 00:05:17.135 "vfu_tgt_set_base_path", 00:05:17.135 "framework_get_pci_devices", 00:05:17.135 "framework_get_config", 00:05:17.135 "framework_get_subsystems", 00:05:17.135 "iobuf_get_stats", 00:05:17.135 "iobuf_set_options", 00:05:17.135 "sock_set_default_impl", 00:05:17.135 "sock_impl_set_options", 00:05:17.135 "sock_impl_get_options", 00:05:17.135 "vmd_rescan", 00:05:17.135 "vmd_remove_device", 00:05:17.135 "vmd_enable", 00:05:17.135 "accel_get_stats", 00:05:17.135 "accel_set_options", 00:05:17.135 "accel_set_driver", 00:05:17.135 "accel_crypto_key_destroy", 00:05:17.135 "accel_crypto_keys_get", 00:05:17.135 "accel_crypto_key_create", 00:05:17.135 "accel_assign_opc", 00:05:17.135 "accel_get_module_info", 00:05:17.135 "accel_get_opc_assignments", 00:05:17.135 "notify_get_notifications", 00:05:17.135 "notify_get_types", 00:05:17.135 "bdev_get_histogram", 00:05:17.135 "bdev_enable_histogram", 00:05:17.135 "bdev_set_qos_limit", 00:05:17.135 "bdev_set_qd_sampling_period", 00:05:17.135 "bdev_get_bdevs", 00:05:17.135 "bdev_reset_iostat", 00:05:17.135 "bdev_get_iostat", 00:05:17.135 "bdev_examine", 00:05:17.135 "bdev_wait_for_examine", 00:05:17.135 "bdev_set_options", 00:05:17.135 "scsi_get_devices", 00:05:17.135 "thread_set_cpumask", 00:05:17.135 "framework_get_scheduler", 00:05:17.135 "framework_set_scheduler", 00:05:17.135 "framework_get_reactors", 00:05:17.135 "thread_get_io_channels", 00:05:17.135 "thread_get_pollers", 00:05:17.135 "thread_get_stats", 00:05:17.135 "framework_monitor_context_switch", 00:05:17.135 "spdk_kill_instance", 
00:05:17.135 "log_enable_timestamps", 00:05:17.135 "log_get_flags", 00:05:17.135 "log_clear_flag", 00:05:17.135 "log_set_flag", 00:05:17.135 "log_get_level", 00:05:17.135 "log_set_level", 00:05:17.135 "log_get_print_level", 00:05:17.135 "log_set_print_level", 00:05:17.135 "framework_enable_cpumask_locks", 00:05:17.135 "framework_disable_cpumask_locks", 00:05:17.135 "framework_wait_init", 00:05:17.135 "framework_start_init", 00:05:17.135 "virtio_blk_create_transport", 00:05:17.135 "virtio_blk_get_transports", 00:05:17.135 "vhost_controller_set_coalescing", 00:05:17.135 "vhost_get_controllers", 00:05:17.135 "vhost_delete_controller", 00:05:17.135 "vhost_create_blk_controller", 00:05:17.135 "vhost_scsi_controller_remove_target", 00:05:17.135 "vhost_scsi_controller_add_target", 00:05:17.135 "vhost_start_scsi_controller", 00:05:17.135 "vhost_create_scsi_controller", 00:05:17.135 "ublk_recover_disk", 00:05:17.135 "ublk_get_disks", 00:05:17.135 "ublk_stop_disk", 00:05:17.135 "ublk_start_disk", 00:05:17.135 "ublk_destroy_target", 00:05:17.135 "ublk_create_target", 00:05:17.135 "nbd_get_disks", 00:05:17.135 "nbd_stop_disk", 00:05:17.135 "nbd_start_disk", 00:05:17.135 "env_dpdk_get_mem_stats", 00:05:17.135 "nvmf_subsystem_get_listeners", 00:05:17.135 "nvmf_subsystem_get_qpairs", 00:05:17.135 "nvmf_subsystem_get_controllers", 00:05:17.135 "nvmf_get_stats", 00:05:17.135 "nvmf_get_transports", 00:05:17.135 "nvmf_create_transport", 00:05:17.135 "nvmf_get_targets", 00:05:17.135 "nvmf_delete_target", 00:05:17.135 "nvmf_create_target", 00:05:17.135 "nvmf_subsystem_allow_any_host", 00:05:17.135 "nvmf_subsystem_remove_host", 00:05:17.135 "nvmf_subsystem_add_host", 00:05:17.135 "nvmf_subsystem_remove_ns", 00:05:17.135 "nvmf_subsystem_add_ns", 00:05:17.135 "nvmf_subsystem_listener_set_ana_state", 00:05:17.135 "nvmf_discovery_get_referrals", 00:05:17.135 "nvmf_discovery_remove_referral", 00:05:17.135 "nvmf_discovery_add_referral", 00:05:17.135 "nvmf_subsystem_remove_listener", 
00:05:17.135 "nvmf_subsystem_add_listener", 00:05:17.135 "nvmf_delete_subsystem", 00:05:17.135 "nvmf_create_subsystem", 00:05:17.135 "nvmf_get_subsystems", 00:05:17.135 "nvmf_set_crdt", 00:05:17.135 "nvmf_set_config", 00:05:17.135 "nvmf_set_max_subsystems", 00:05:17.135 "iscsi_set_options", 00:05:17.135 "iscsi_get_auth_groups", 00:05:17.135 "iscsi_auth_group_remove_secret", 00:05:17.135 "iscsi_auth_group_add_secret", 00:05:17.135 "iscsi_delete_auth_group", 00:05:17.135 "iscsi_create_auth_group", 00:05:17.135 "iscsi_set_discovery_auth", 00:05:17.135 "iscsi_get_options", 00:05:17.135 "iscsi_target_node_request_logout", 00:05:17.135 "iscsi_target_node_set_redirect", 00:05:17.135 "iscsi_target_node_set_auth", 00:05:17.135 "iscsi_target_node_add_lun", 00:05:17.135 "iscsi_get_connections", 00:05:17.135 "iscsi_portal_group_set_auth", 00:05:17.135 "iscsi_start_portal_group", 00:05:17.135 "iscsi_delete_portal_group", 00:05:17.135 "iscsi_create_portal_group", 00:05:17.135 "iscsi_get_portal_groups", 00:05:17.135 "iscsi_delete_target_node", 00:05:17.135 "iscsi_target_node_remove_pg_ig_maps", 00:05:17.135 "iscsi_target_node_add_pg_ig_maps", 00:05:17.135 "iscsi_create_target_node", 00:05:17.135 "iscsi_get_target_nodes", 00:05:17.135 "iscsi_delete_initiator_group", 00:05:17.135 "iscsi_initiator_group_remove_initiators", 00:05:17.135 "iscsi_initiator_group_add_initiators", 00:05:17.135 "iscsi_create_initiator_group", 00:05:17.135 "iscsi_get_initiator_groups", 00:05:17.135 "vfu_virtio_create_scsi_endpoint", 00:05:17.135 "vfu_virtio_scsi_remove_target", 00:05:17.135 "vfu_virtio_scsi_add_target", 00:05:17.135 "vfu_virtio_create_blk_endpoint", 00:05:17.135 "vfu_virtio_delete_endpoint", 00:05:17.135 "iaa_scan_accel_module", 00:05:17.135 "dsa_scan_accel_module", 00:05:17.135 "ioat_scan_accel_module", 00:05:17.135 "accel_error_inject_error", 00:05:17.135 "bdev_iscsi_delete", 00:05:17.135 "bdev_iscsi_create", 00:05:17.135 "bdev_iscsi_set_options", 00:05:17.135 
"bdev_virtio_attach_controller", 00:05:17.135 "bdev_virtio_scsi_get_devices", 00:05:17.135 "bdev_virtio_detach_controller", 00:05:17.135 "bdev_virtio_blk_set_hotplug", 00:05:17.135 "bdev_ftl_set_property", 00:05:17.135 "bdev_ftl_get_properties", 00:05:17.135 "bdev_ftl_get_stats", 00:05:17.135 "bdev_ftl_unmap", 00:05:17.135 "bdev_ftl_unload", 00:05:17.135 "bdev_ftl_delete", 00:05:17.135 "bdev_ftl_load", 00:05:17.135 "bdev_ftl_create", 00:05:17.135 "bdev_aio_delete", 00:05:17.135 "bdev_aio_rescan", 00:05:17.135 "bdev_aio_create", 00:05:17.135 "blobfs_create", 00:05:17.135 "blobfs_detect", 00:05:17.135 "blobfs_set_cache_size", 00:05:17.135 "bdev_zone_block_delete", 00:05:17.135 "bdev_zone_block_create", 00:05:17.135 "bdev_delay_delete", 00:05:17.135 "bdev_delay_create", 00:05:17.135 "bdev_delay_update_latency", 00:05:17.135 "bdev_split_delete", 00:05:17.135 "bdev_split_create", 00:05:17.135 "bdev_error_inject_error", 00:05:17.135 "bdev_error_delete", 00:05:17.135 "bdev_error_create", 00:05:17.135 "bdev_raid_set_options", 00:05:17.135 "bdev_raid_remove_base_bdev", 00:05:17.135 "bdev_raid_add_base_bdev", 00:05:17.135 "bdev_raid_delete", 00:05:17.135 "bdev_raid_create", 00:05:17.135 "bdev_raid_get_bdevs", 00:05:17.135 "bdev_lvol_grow_lvstore", 00:05:17.135 "bdev_lvol_get_lvols", 00:05:17.135 "bdev_lvol_get_lvstores", 00:05:17.135 "bdev_lvol_delete", 00:05:17.135 "bdev_lvol_set_read_only", 00:05:17.135 "bdev_lvol_resize", 00:05:17.135 "bdev_lvol_decouple_parent", 00:05:17.135 "bdev_lvol_inflate", 00:05:17.135 "bdev_lvol_rename", 00:05:17.135 "bdev_lvol_clone_bdev", 00:05:17.135 "bdev_lvol_clone", 00:05:17.135 "bdev_lvol_snapshot", 00:05:17.135 "bdev_lvol_create", 00:05:17.135 "bdev_lvol_delete_lvstore", 00:05:17.135 "bdev_lvol_rename_lvstore", 00:05:17.135 "bdev_lvol_create_lvstore", 00:05:17.135 "bdev_passthru_delete", 00:05:17.135 "bdev_passthru_create", 00:05:17.135 "bdev_nvme_cuse_unregister", 00:05:17.135 "bdev_nvme_cuse_register", 00:05:17.135 "bdev_opal_new_user", 
00:05:17.135 "bdev_opal_set_lock_state", 00:05:17.135 "bdev_opal_delete", 00:05:17.135 "bdev_opal_get_info", 00:05:17.135 "bdev_opal_create", 00:05:17.135 "bdev_nvme_opal_revert", 00:05:17.135 "bdev_nvme_opal_init", 00:05:17.135 "bdev_nvme_send_cmd", 00:05:17.135 "bdev_nvme_get_path_iostat", 00:05:17.135 "bdev_nvme_get_mdns_discovery_info", 00:05:17.135 "bdev_nvme_stop_mdns_discovery", 00:05:17.135 "bdev_nvme_start_mdns_discovery", 00:05:17.135 "bdev_nvme_set_multipath_policy", 00:05:17.135 "bdev_nvme_set_preferred_path", 00:05:17.135 "bdev_nvme_get_io_paths", 00:05:17.135 "bdev_nvme_remove_error_injection", 00:05:17.135 "bdev_nvme_add_error_injection", 00:05:17.135 "bdev_nvme_get_discovery_info", 00:05:17.135 "bdev_nvme_stop_discovery", 00:05:17.135 "bdev_nvme_start_discovery", 00:05:17.135 "bdev_nvme_get_controller_health_info", 00:05:17.135 "bdev_nvme_disable_controller", 00:05:17.135 "bdev_nvme_enable_controller", 00:05:17.135 "bdev_nvme_reset_controller", 00:05:17.135 "bdev_nvme_get_transport_statistics", 00:05:17.135 "bdev_nvme_apply_firmware", 00:05:17.135 "bdev_nvme_detach_controller", 00:05:17.135 "bdev_nvme_get_controllers", 00:05:17.135 "bdev_nvme_attach_controller", 00:05:17.135 "bdev_nvme_set_hotplug", 00:05:17.135 "bdev_nvme_set_options", 00:05:17.135 "bdev_null_resize", 00:05:17.135 "bdev_null_delete", 00:05:17.135 "bdev_null_create", 00:05:17.135 "bdev_malloc_delete", 00:05:17.135 "bdev_malloc_create" 00:05:17.135 ] 00:05:17.135 00:03:42 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:05:17.135 00:03:42 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:17.135 00:03:42 -- common/autotest_common.sh@10 -- # set +x 00:05:17.135 00:03:42 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:17.135 00:03:42 -- spdkcli/tcp.sh@38 -- # killprocess 2698754 00:05:17.135 00:03:42 -- common/autotest_common.sh@936 -- # '[' -z 2698754 ']' 00:05:17.135 00:03:42 -- common/autotest_common.sh@940 -- # kill -0 2698754 00:05:17.135 00:03:42 -- 
common/autotest_common.sh@941 -- # uname 00:05:17.135 00:03:42 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:17.135 00:03:42 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2698754 00:05:17.135 00:03:42 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:17.135 00:03:42 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:17.135 00:03:42 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2698754' 00:05:17.135 killing process with pid 2698754 00:05:17.135 00:03:42 -- common/autotest_common.sh@955 -- # kill 2698754 00:05:17.135 00:03:42 -- common/autotest_common.sh@960 -- # wait 2698754 00:05:17.394 00:05:17.394 real 0m1.616s 00:05:17.394 user 0m2.954s 00:05:17.394 sys 0m0.480s 00:05:17.394 00:03:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:17.394 00:03:42 -- common/autotest_common.sh@10 -- # set +x 00:05:17.394 ************************************ 00:05:17.394 END TEST spdkcli_tcp 00:05:17.394 ************************************ 00:05:17.653 00:03:42 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:17.653 00:03:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:17.653 00:03:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:17.653 00:03:42 -- common/autotest_common.sh@10 -- # set +x 00:05:17.653 ************************************ 00:05:17.653 START TEST dpdk_mem_utility 00:05:17.653 ************************************ 00:05:17.653 00:03:42 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:17.653 * Looking for test storage... 
00:05:17.653 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:17.653 00:03:43 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:17.653 00:03:43 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:17.653 00:03:43 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:17.653 00:03:43 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:17.653 00:03:43 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:17.653 00:03:43 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:17.653 00:03:43 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:17.653 00:03:43 -- scripts/common.sh@335 -- # IFS=.-: 00:05:17.653 00:03:43 -- scripts/common.sh@335 -- # read -ra ver1 00:05:17.653 00:03:43 -- scripts/common.sh@336 -- # IFS=.-: 00:05:17.653 00:03:43 -- scripts/common.sh@336 -- # read -ra ver2 00:05:17.653 00:03:43 -- scripts/common.sh@337 -- # local 'op=<' 00:05:17.653 00:03:43 -- scripts/common.sh@339 -- # ver1_l=2 00:05:17.653 00:03:43 -- scripts/common.sh@340 -- # ver2_l=1 00:05:17.653 00:03:43 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:17.653 00:03:43 -- scripts/common.sh@343 -- # case "$op" in 00:05:17.653 00:03:43 -- scripts/common.sh@344 -- # : 1 00:05:17.653 00:03:43 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:17.653 00:03:43 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:17.653 00:03:43 -- scripts/common.sh@364 -- # decimal 1 00:05:17.653 00:03:43 -- scripts/common.sh@352 -- # local d=1 00:05:17.653 00:03:43 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:17.653 00:03:43 -- scripts/common.sh@354 -- # echo 1 00:05:17.653 00:03:43 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:17.653 00:03:43 -- scripts/common.sh@365 -- # decimal 2 00:05:17.653 00:03:43 -- scripts/common.sh@352 -- # local d=2 00:05:17.653 00:03:43 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:17.653 00:03:43 -- scripts/common.sh@354 -- # echo 2 00:05:17.653 00:03:43 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:17.653 00:03:43 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:17.653 00:03:43 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:17.653 00:03:43 -- scripts/common.sh@367 -- # return 0 00:05:17.653 00:03:43 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:17.653 00:03:43 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:17.653 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:17.653 --rc genhtml_branch_coverage=1 00:05:17.653 --rc genhtml_function_coverage=1 00:05:17.653 --rc genhtml_legend=1 00:05:17.653 --rc geninfo_all_blocks=1 00:05:17.653 --rc geninfo_unexecuted_blocks=1 00:05:17.653 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:17.653 ' 00:05:17.653 00:03:43 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:17.653 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:17.653 --rc genhtml_branch_coverage=1 00:05:17.653 --rc genhtml_function_coverage=1 00:05:17.653 --rc genhtml_legend=1 00:05:17.653 --rc geninfo_all_blocks=1 00:05:17.653 --rc geninfo_unexecuted_blocks=1 00:05:17.653 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:17.653 ' 00:05:17.653 00:03:43 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:17.653 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:17.653 --rc genhtml_branch_coverage=1 00:05:17.653 --rc genhtml_function_coverage=1 00:05:17.653 --rc genhtml_legend=1 00:05:17.653 --rc geninfo_all_blocks=1 00:05:17.653 --rc geninfo_unexecuted_blocks=1 00:05:17.653 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:17.653 ' 00:05:17.653 00:03:43 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:17.653 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:17.653 --rc genhtml_branch_coverage=1 00:05:17.653 --rc genhtml_function_coverage=1 00:05:17.653 --rc genhtml_legend=1 00:05:17.653 --rc geninfo_all_blocks=1 00:05:17.653 --rc geninfo_unexecuted_blocks=1 00:05:17.653 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:17.653 ' 00:05:17.653 00:03:43 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:17.653 00:03:43 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2699101 00:05:17.653 00:03:43 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:17.653 00:03:43 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2699101 00:05:17.653 00:03:43 -- common/autotest_common.sh@829 -- # '[' -z 2699101 ']' 00:05:17.653 00:03:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:17.653 00:03:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:17.653 00:03:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:17.653 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:17.653 00:03:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:17.653 00:03:43 -- common/autotest_common.sh@10 -- # set +x 00:05:17.653 [2024-11-30 00:03:43.191053] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:17.653 [2024-11-30 00:03:43.191126] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2699101 ] 00:05:17.912 EAL: No free 2048 kB hugepages reported on node 1 00:05:17.912 [2024-11-30 00:03:43.258510] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:17.912 [2024-11-30 00:03:43.328419] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:17.912 [2024-11-30 00:03:43.328566] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:18.480 00:03:44 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:18.480 00:03:44 -- common/autotest_common.sh@862 -- # return 0 00:05:18.480 00:03:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:18.480 00:03:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:18.480 00:03:44 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:18.480 00:03:44 -- common/autotest_common.sh@10 -- # set +x 00:05:18.480 { 00:05:18.480 "filename": "/tmp/spdk_mem_dump.txt" 00:05:18.480 } 00:05:18.480 00:03:44 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:18.480 00:03:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:18.740 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:18.740 1 heaps totaling size 814.000000 MiB 00:05:18.740 size: 814.000000 MiB heap id: 0 00:05:18.740 end heaps---------- 00:05:18.740 8 mempools totaling size 
598.116089 MiB 00:05:18.740 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:18.740 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:18.740 size: 84.521057 MiB name: bdev_io_2699101 00:05:18.740 size: 51.011292 MiB name: evtpool_2699101 00:05:18.740 size: 50.003479 MiB name: msgpool_2699101 00:05:18.740 size: 21.763794 MiB name: PDU_Pool 00:05:18.740 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:18.740 size: 0.026123 MiB name: Session_Pool 00:05:18.740 end mempools------- 00:05:18.740 6 memzones totaling size 4.142822 MiB 00:05:18.740 size: 1.000366 MiB name: RG_ring_0_2699101 00:05:18.740 size: 1.000366 MiB name: RG_ring_1_2699101 00:05:18.740 size: 1.000366 MiB name: RG_ring_4_2699101 00:05:18.740 size: 1.000366 MiB name: RG_ring_5_2699101 00:05:18.740 size: 0.125366 MiB name: RG_ring_2_2699101 00:05:18.740 size: 0.015991 MiB name: RG_ring_3_2699101 00:05:18.740 end memzones------- 00:05:18.740 00:03:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:18.740 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:18.740 list of free elements. 
size: 12.519348 MiB 00:05:18.740 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:18.740 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:18.740 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:18.740 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:18.740 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:18.740 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:18.740 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:18.740 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:18.740 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:18.740 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:18.740 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:18.740 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:18.740 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:18.740 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:18.740 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:18.740 list of standard malloc elements. 
size: 199.218079 MiB 00:05:18.740 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:18.740 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:18.740 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:18.740 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:18.740 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:18.740 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:18.740 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:18.740 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:18.740 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:18.740 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:18.740 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:18.740 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:18.740 element at 
address: 0x20000b27da00 with size: 0.000183 MiB 00:05:18.740 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:18.740 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:05:18.740 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:18.740 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:18.740 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:18.740 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:18.740 list of memzone associated elements. 
size: 602.262573 MiB 00:05:18.740 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:18.740 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:18.740 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:18.740 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:18.740 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:18.740 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2699101_0 00:05:18.740 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:18.740 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2699101_0 00:05:18.740 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:18.740 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2699101_0 00:05:18.740 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:18.741 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:18.741 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:18.741 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:18.741 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:18.741 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2699101 00:05:18.741 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:18.741 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2699101 00:05:18.741 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:18.741 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2699101 00:05:18.741 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:18.741 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:18.741 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:18.741 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:18.741 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:18.741 
associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:18.741 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:18.741 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:18.741 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:18.741 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2699101 00:05:18.741 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:18.741 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2699101 00:05:18.741 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:18.741 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2699101 00:05:18.741 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:18.741 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2699101 00:05:18.741 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:18.741 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2699101 00:05:18.741 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:18.741 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:18.741 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:18.741 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:18.741 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:18.741 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:18.741 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:18.741 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2699101 00:05:18.741 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:18.741 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:18.741 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:18.741 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:18.741 element at address: 0x200003adb5c0 with size: 0.016113 
MiB 00:05:18.741 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2699101 00:05:18.741 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:18.741 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:18.741 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:18.741 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2699101 00:05:18.741 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:18.741 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2699101 00:05:18.741 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:18.741 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:18.741 00:03:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:18.741 00:03:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2699101 00:05:18.741 00:03:44 -- common/autotest_common.sh@936 -- # '[' -z 2699101 ']' 00:05:18.741 00:03:44 -- common/autotest_common.sh@940 -- # kill -0 2699101 00:05:18.741 00:03:44 -- common/autotest_common.sh@941 -- # uname 00:05:18.741 00:03:44 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:18.741 00:03:44 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2699101 00:05:18.741 00:03:44 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:18.741 00:03:44 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:18.741 00:03:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2699101' 00:05:18.741 killing process with pid 2699101 00:05:18.741 00:03:44 -- common/autotest_common.sh@955 -- # kill 2699101 00:05:18.741 00:03:44 -- common/autotest_common.sh@960 -- # wait 2699101 00:05:19.000 00:05:19.000 real 0m1.505s 00:05:19.000 user 0m1.537s 00:05:19.000 sys 0m0.459s 00:05:19.000 00:03:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:19.000 00:03:44 -- common/autotest_common.sh@10 -- # set +x 00:05:19.000 
************************************ 00:05:19.000 END TEST dpdk_mem_utility 00:05:19.000 ************************************ 00:05:19.000 00:03:44 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:19.000 00:03:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:19.000 00:03:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:19.000 00:03:44 -- common/autotest_common.sh@10 -- # set +x 00:05:19.000 ************************************ 00:05:19.000 START TEST event 00:05:19.000 ************************************ 00:05:19.000 00:03:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:19.261 * Looking for test storage... 00:05:19.261 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:19.261 00:03:44 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:19.261 00:03:44 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:19.261 00:03:44 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:19.261 00:03:44 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:19.261 00:03:44 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:19.261 00:03:44 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:19.261 00:03:44 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:19.261 00:03:44 -- scripts/common.sh@335 -- # IFS=.-: 00:05:19.261 00:03:44 -- scripts/common.sh@335 -- # read -ra ver1 00:05:19.261 00:03:44 -- scripts/common.sh@336 -- # IFS=.-: 00:05:19.261 00:03:44 -- scripts/common.sh@336 -- # read -ra ver2 00:05:19.261 00:03:44 -- scripts/common.sh@337 -- # local 'op=<' 00:05:19.261 00:03:44 -- scripts/common.sh@339 -- # ver1_l=2 00:05:19.261 00:03:44 -- scripts/common.sh@340 -- # ver2_l=1 00:05:19.261 00:03:44 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:19.261 00:03:44 -- scripts/common.sh@343 -- # case "$op" in 00:05:19.261 
00:03:44 -- scripts/common.sh@344 -- # : 1 00:05:19.261 00:03:44 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:19.261 00:03:44 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:19.261 00:03:44 -- scripts/common.sh@364 -- # decimal 1 00:05:19.261 00:03:44 -- scripts/common.sh@352 -- # local d=1 00:05:19.261 00:03:44 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:19.261 00:03:44 -- scripts/common.sh@354 -- # echo 1 00:05:19.261 00:03:44 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:19.261 00:03:44 -- scripts/common.sh@365 -- # decimal 2 00:05:19.261 00:03:44 -- scripts/common.sh@352 -- # local d=2 00:05:19.261 00:03:44 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:19.261 00:03:44 -- scripts/common.sh@354 -- # echo 2 00:05:19.261 00:03:44 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:19.261 00:03:44 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:19.261 00:03:44 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:19.261 00:03:44 -- scripts/common.sh@367 -- # return 0 00:05:19.261 00:03:44 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:19.261 00:03:44 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:19.261 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.261 --rc genhtml_branch_coverage=1 00:05:19.261 --rc genhtml_function_coverage=1 00:05:19.261 --rc genhtml_legend=1 00:05:19.261 --rc geninfo_all_blocks=1 00:05:19.261 --rc geninfo_unexecuted_blocks=1 00:05:19.261 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:19.261 ' 00:05:19.261 00:03:44 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:19.261 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.261 --rc genhtml_branch_coverage=1 00:05:19.261 --rc genhtml_function_coverage=1 00:05:19.261 --rc genhtml_legend=1 00:05:19.261 --rc geninfo_all_blocks=1 00:05:19.261 --rc 
geninfo_unexecuted_blocks=1 00:05:19.261 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:19.261 ' 00:05:19.261 00:03:44 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:19.261 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.261 --rc genhtml_branch_coverage=1 00:05:19.261 --rc genhtml_function_coverage=1 00:05:19.261 --rc genhtml_legend=1 00:05:19.261 --rc geninfo_all_blocks=1 00:05:19.261 --rc geninfo_unexecuted_blocks=1 00:05:19.261 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:19.261 ' 00:05:19.261 00:03:44 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:19.261 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.261 --rc genhtml_branch_coverage=1 00:05:19.261 --rc genhtml_function_coverage=1 00:05:19.261 --rc genhtml_legend=1 00:05:19.261 --rc geninfo_all_blocks=1 00:05:19.261 --rc geninfo_unexecuted_blocks=1 00:05:19.261 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:19.261 ' 00:05:19.261 00:03:44 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:19.261 00:03:44 -- bdev/nbd_common.sh@6 -- # set -e 00:05:19.261 00:03:44 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:19.261 00:03:44 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:05:19.261 00:03:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:19.261 00:03:44 -- common/autotest_common.sh@10 -- # set +x 00:05:19.261 ************************************ 00:05:19.261 START TEST event_perf 00:05:19.261 ************************************ 00:05:19.261 00:03:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:19.261 Running I/O for 1 
seconds...[2024-11-30 00:03:44.731272] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:19.261 [2024-11-30 00:03:44.731359] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2699434 ] 00:05:19.261 EAL: No free 2048 kB hugepages reported on node 1 00:05:19.261 [2024-11-30 00:03:44.802220] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:19.520 [2024-11-30 00:03:44.876528] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:19.520 [2024-11-30 00:03:44.876630] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:19.520 [2024-11-30 00:03:44.876717] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:19.520 [2024-11-30 00:03:44.876721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:20.456 Running I/O for 1 seconds... 00:05:20.456 lcore 0: 195805 00:05:20.456 lcore 1: 195804 00:05:20.456 lcore 2: 195804 00:05:20.456 lcore 3: 195806 00:05:20.456 done. 
00:05:20.456 00:05:20.456 real 0m1.228s 00:05:20.456 user 0m4.134s 00:05:20.456 sys 0m0.090s 00:05:20.456 00:03:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:20.456 00:03:45 -- common/autotest_common.sh@10 -- # set +x 00:05:20.456 ************************************ 00:05:20.456 END TEST event_perf 00:05:20.456 ************************************ 00:05:20.456 00:03:45 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:20.456 00:03:45 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:20.456 00:03:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:20.456 00:03:45 -- common/autotest_common.sh@10 -- # set +x 00:05:20.456 ************************************ 00:05:20.456 START TEST event_reactor 00:05:20.456 ************************************ 00:05:20.456 00:03:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:20.456 [2024-11-30 00:03:46.006427] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:20.456 [2024-11-30 00:03:46.006520] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2699723 ] 00:05:20.715 EAL: No free 2048 kB hugepages reported on node 1 00:05:20.715 [2024-11-30 00:03:46.076909] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:20.715 [2024-11-30 00:03:46.145360] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:22.093 test_start 00:05:22.093 oneshot 00:05:22.093 tick 100 00:05:22.093 tick 100 00:05:22.093 tick 250 00:05:22.093 tick 100 00:05:22.093 tick 100 00:05:22.093 tick 100 00:05:22.093 tick 250 00:05:22.093 tick 500 00:05:22.093 tick 100 00:05:22.093 tick 100 00:05:22.093 tick 250 00:05:22.093 tick 100 00:05:22.093 tick 100 00:05:22.093 test_end 00:05:22.093 00:05:22.093 real 0m1.220s 00:05:22.093 user 0m1.128s 00:05:22.093 sys 0m0.087s 00:05:22.093 00:03:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:22.093 00:03:47 -- common/autotest_common.sh@10 -- # set +x 00:05:22.093 ************************************ 00:05:22.093 END TEST event_reactor 00:05:22.093 ************************************ 00:05:22.093 00:03:47 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:22.093 00:03:47 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:22.093 00:03:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:22.093 00:03:47 -- common/autotest_common.sh@10 -- # set +x 00:05:22.093 ************************************ 00:05:22.093 START TEST event_reactor_perf 00:05:22.093 ************************************ 00:05:22.093 00:03:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:22.093 [2024-11-30 00:03:47.268760] 
Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:22.093 [2024-11-30 00:03:47.268850] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2700006 ] 00:05:22.093 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.093 [2024-11-30 00:03:47.337754] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:22.093 [2024-11-30 00:03:47.406051] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.029 test_start 00:05:23.029 test_end 00:05:23.029 Performance: 957480 events per second 00:05:23.029 00:05:23.029 real 0m1.221s 00:05:23.029 user 0m1.126s 00:05:23.029 sys 0m0.090s 00:05:23.029 00:03:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:23.029 00:03:48 -- common/autotest_common.sh@10 -- # set +x 00:05:23.029 ************************************ 00:05:23.029 END TEST event_reactor_perf 00:05:23.029 ************************************ 00:05:23.029 00:03:48 -- event/event.sh@49 -- # uname -s 00:05:23.029 00:03:48 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:23.029 00:03:48 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:23.029 00:03:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:23.029 00:03:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:23.029 00:03:48 -- common/autotest_common.sh@10 -- # set +x 00:05:23.029 ************************************ 00:05:23.029 START TEST event_scheduler 00:05:23.029 ************************************ 00:05:23.029 00:03:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:23.289 * Looking for test storage... 
00:05:23.289 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:23.289 00:03:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:23.289 00:03:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:23.289 00:03:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:23.289 00:03:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:23.289 00:03:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:23.289 00:03:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:23.289 00:03:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:23.289 00:03:48 -- scripts/common.sh@335 -- # IFS=.-: 00:05:23.289 00:03:48 -- scripts/common.sh@335 -- # read -ra ver1 00:05:23.289 00:03:48 -- scripts/common.sh@336 -- # IFS=.-: 00:05:23.289 00:03:48 -- scripts/common.sh@336 -- # read -ra ver2 00:05:23.289 00:03:48 -- scripts/common.sh@337 -- # local 'op=<' 00:05:23.289 00:03:48 -- scripts/common.sh@339 -- # ver1_l=2 00:05:23.289 00:03:48 -- scripts/common.sh@340 -- # ver2_l=1 00:05:23.289 00:03:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:23.289 00:03:48 -- scripts/common.sh@343 -- # case "$op" in 00:05:23.289 00:03:48 -- scripts/common.sh@344 -- # : 1 00:05:23.289 00:03:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:23.289 00:03:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:23.289 00:03:48 -- scripts/common.sh@364 -- # decimal 1 00:05:23.289 00:03:48 -- scripts/common.sh@352 -- # local d=1 00:05:23.289 00:03:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:23.289 00:03:48 -- scripts/common.sh@354 -- # echo 1 00:05:23.289 00:03:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:23.289 00:03:48 -- scripts/common.sh@365 -- # decimal 2 00:05:23.289 00:03:48 -- scripts/common.sh@352 -- # local d=2 00:05:23.289 00:03:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:23.289 00:03:48 -- scripts/common.sh@354 -- # echo 2 00:05:23.289 00:03:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:23.289 00:03:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:23.289 00:03:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:23.289 00:03:48 -- scripts/common.sh@367 -- # return 0 00:05:23.289 00:03:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:23.289 00:03:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:23.289 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:23.289 --rc genhtml_branch_coverage=1 00:05:23.289 --rc genhtml_function_coverage=1 00:05:23.289 --rc genhtml_legend=1 00:05:23.289 --rc geninfo_all_blocks=1 00:05:23.289 --rc geninfo_unexecuted_blocks=1 00:05:23.289 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:23.289 ' 00:05:23.289 00:03:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:23.289 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:23.289 --rc genhtml_branch_coverage=1 00:05:23.289 --rc genhtml_function_coverage=1 00:05:23.289 --rc genhtml_legend=1 00:05:23.289 --rc geninfo_all_blocks=1 00:05:23.289 --rc geninfo_unexecuted_blocks=1 00:05:23.289 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:23.289 ' 00:05:23.289 00:03:48 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:23.289 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:23.289 --rc genhtml_branch_coverage=1 00:05:23.289 --rc genhtml_function_coverage=1 00:05:23.289 --rc genhtml_legend=1 00:05:23.289 --rc geninfo_all_blocks=1 00:05:23.289 --rc geninfo_unexecuted_blocks=1 00:05:23.289 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:23.289 ' 00:05:23.289 00:03:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:23.289 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:23.289 --rc genhtml_branch_coverage=1 00:05:23.289 --rc genhtml_function_coverage=1 00:05:23.290 --rc genhtml_legend=1 00:05:23.290 --rc geninfo_all_blocks=1 00:05:23.290 --rc geninfo_unexecuted_blocks=1 00:05:23.290 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:23.290 ' 00:05:23.290 00:03:48 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:23.290 00:03:48 -- scheduler/scheduler.sh@35 -- # scheduler_pid=2700329 00:05:23.290 00:03:48 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:23.290 00:03:48 -- scheduler/scheduler.sh@37 -- # waitforlisten 2700329 00:05:23.290 00:03:48 -- common/autotest_common.sh@829 -- # '[' -z 2700329 ']' 00:05:23.290 00:03:48 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:23.290 00:03:48 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:23.290 00:03:48 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:23.290 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:23.290 00:03:48 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:23.290 00:03:48 -- common/autotest_common.sh@10 -- # set +x 00:05:23.290 00:03:48 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:23.290 [2024-11-30 00:03:48.735796] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:23.290 [2024-11-30 00:03:48.735880] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2700329 ] 00:05:23.290 EAL: No free 2048 kB hugepages reported on node 1 00:05:23.290 [2024-11-30 00:03:48.801535] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:23.549 [2024-11-30 00:03:48.878503] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.549 [2024-11-30 00:03:48.878590] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:23.549 [2024-11-30 00:03:48.878673] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:23.549 [2024-11-30 00:03:48.878675] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:24.117 00:03:49 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:24.117 00:03:49 -- common/autotest_common.sh@862 -- # return 0 00:05:24.117 00:03:49 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:24.117 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.117 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.117 POWER: Env isn't set yet! 00:05:24.117 POWER: Attempting to initialise ACPI cpufreq power management... 
00:05:24.117 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:24.117 POWER: Cannot set governor of lcore 0 to userspace 00:05:24.117 POWER: Attempting to initialise PSTAT power management... 00:05:24.117 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:24.117 POWER: Initialized successfully for lcore 0 power management 00:05:24.117 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:24.118 POWER: Initialized successfully for lcore 1 power management 00:05:24.118 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:24.118 POWER: Initialized successfully for lcore 2 power management 00:05:24.118 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:24.118 POWER: Initialized successfully for lcore 3 power management 00:05:24.118 [2024-11-30 00:03:49.630774] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:24.118 [2024-11-30 00:03:49.630790] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:24.118 [2024-11-30 00:03:49.630800] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:24.118 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.118 00:03:49 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:24.118 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.118 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.375 [2024-11-30 00:03:49.699102] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
00:05:24.375 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.375 00:03:49 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:24.375 00:03:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:24.375 00:03:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:24.375 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.375 ************************************ 00:05:24.375 START TEST scheduler_create_thread 00:05:24.376 ************************************ 00:05:24.376 00:03:49 -- common/autotest_common.sh@1114 -- # scheduler_create_thread 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 2 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 3 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 4 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 
00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 5 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 6 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 7 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 8 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 9 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:24.376 00:03:49 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x 00:05:24.376 10 00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 
00:05:24.376 00:03:49 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x
00:05:24.376 00:03:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:24.376 00:03:49 -- scheduler/scheduler.sh@22 -- # thread_id=11
00:05:24.376 00:03:49 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:05:24.376 00:03:49 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:24.376 00:03:49 -- common/autotest_common.sh@10 -- # set +x
00:05:25.336 00:03:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:25.336 00:03:50 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:05:25.336 00:03:50 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:25.336 00:03:50 -- common/autotest_common.sh@10 -- # set +x
00:05:26.713 00:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:26.713 00:03:52 -- scheduler/scheduler.sh@25 -- # thread_id=12
00:05:26.713 00:03:52 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:05:26.713 00:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:26.713 00:03:52 -- common/autotest_common.sh@10 -- # set +x
00:05:27.648 00:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:27.648 
00:05:27.648 real 0m3.383s
00:05:27.648 user 0m0.022s
00:05:27.648 sys 0m0.009s
00:05:27.648 00:03:53 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:27.648 00:03:53 -- common/autotest_common.sh@10 -- # set +x
00:05:27.648 ************************************
00:05:27.648 END TEST scheduler_create_thread
00:05:27.648 ************************************
00:05:27.648 00:03:53 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:05:27.648 00:03:53 -- scheduler/scheduler.sh@46 -- # killprocess 2700329
00:05:27.648 00:03:53 -- common/autotest_common.sh@936 -- # '[' -z 2700329 ']'
00:05:27.648 00:03:53 -- common/autotest_common.sh@940 -- # kill -0 2700329
00:05:27.648 00:03:53 -- common/autotest_common.sh@941 -- # uname
00:05:27.648 00:03:53 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:27.648 00:03:53 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2700329
00:05:27.648 00:03:53 -- common/autotest_common.sh@942 -- # process_name=reactor_2
00:05:27.648 00:03:53 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']'
00:05:27.649 00:03:53 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2700329'
00:05:27.649 killing process with pid 2700329
00:05:27.649 00:03:53 -- common/autotest_common.sh@955 -- # kill 2700329
00:05:27.649 00:03:53 -- common/autotest_common.sh@960 -- # wait 2700329
00:05:28.217 [2024-11-30 00:03:53.471003] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped.
00:05:28.217 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully
00:05:28.217 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original
00:05:28.217 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully
00:05:28.217 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original
00:05:28.217 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully
00:05:28.217 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original
00:05:28.217 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully
00:05:28.217 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original
00:05:28.217 
00:05:28.217 real 0m5.164s
00:05:28.217 user 0m10.630s
00:05:28.217 sys 0m0.431s
00:05:28.217 00:03:53 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:28.217 00:03:53 -- common/autotest_common.sh@10 -- # set +x
00:05:28.217 ************************************
00:05:28.217 END TEST event_scheduler
00:05:28.217 ************************************
00:05:28.217 00:03:53 -- event/event.sh@51 -- # modprobe -n nbd
00:05:28.217 00:03:53 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test
00:05:28.217 00:03:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:28.217 00:03:53 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:28.217 00:03:53 -- common/autotest_common.sh@10 -- # set +x
00:05:28.217 ************************************
00:05:28.217 START TEST app_repeat
00:05:28.217 ************************************
00:05:28.217 00:03:53 -- common/autotest_common.sh@1114 -- # app_repeat_test
00:05:28.217 00:03:53 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:28.217 00:03:53 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:28.217 00:03:53 -- event/event.sh@13 -- # local nbd_list
00:05:28.217 00:03:53 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:28.217 00:03:53 -- event/event.sh@14 -- # local bdev_list
00:05:28.217 00:03:53 -- event/event.sh@15 -- # local repeat_times=4
00:05:28.217 00:03:53 -- event/event.sh@17 -- # modprobe nbd
00:05:28.217 00:03:53 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4
00:05:28.217 00:03:53 -- event/event.sh@19 -- # repeat_pid=2701194
00:05:28.217 00:03:53 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
00:05:28.217 00:03:53 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2701194'
00:05:28.217 Process app_repeat pid: 2701194
00:05:28.217 00:03:53 -- event/event.sh@23 -- # for i in {0..2}
00:05:28.217 00:03:53 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0'
00:05:28.217 spdk_app_start Round 0
00:05:28.217 00:03:53 -- event/event.sh@25 -- # waitforlisten 2701194 /var/tmp/spdk-nbd.sock
00:05:28.217 00:03:53 -- common/autotest_common.sh@829 -- # '[' -z 2701194 ']'
00:05:28.217 00:03:53 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:05:28.217 00:03:53 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:28.217 00:03:53 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:05:28.217 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:05:28.217 00:03:53 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:28.217 00:03:53 -- common/autotest_common.sh@10 -- # set +x
00:05:28.217 [2024-11-30 00:03:53.769544] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:28.217 [2024-11-30 00:03:53.769616] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2701194 ]
00:05:28.476 EAL: No free 2048 kB hugepages reported on node 1
00:05:28.476 [2024-11-30 00:03:53.835605] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:28.476 [2024-11-30 00:03:53.911425] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:28.476 [2024-11-30 00:03:53.911428] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:29.435 00:03:54 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:29.435 00:03:54 -- common/autotest_common.sh@862 -- # return 0
00:05:29.435 00:03:54 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:29.435 Malloc0
00:05:29.435 00:03:54 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:29.435 Malloc1
00:05:29.714 00:03:55 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@91 -- # local bdev_list
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@92 -- # local nbd_list
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@12 -- # local i
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:05:29.714 /dev/nbd0
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:05:29.714 00:03:55 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:05:29.714 00:03:55 -- common/autotest_common.sh@867 -- # local i
00:05:29.714 00:03:55 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:29.714 00:03:55 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:29.714 00:03:55 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:05:29.714 00:03:55 -- common/autotest_common.sh@871 -- # break
00:05:29.714 00:03:55 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:29.714 00:03:55 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:29.714 00:03:55 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:29.714 1+0 records in
00:05:29.714 1+0 records out
00:05:29.714 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273857 s, 15.0 MB/s
00:05:29.714 00:03:55 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:29.714 00:03:55 -- common/autotest_common.sh@884 -- # size=4096
00:05:29.714 00:03:55 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:29.714 00:03:55 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:29.714 00:03:55 -- common/autotest_common.sh@887 -- # return 0
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:29.714 00:03:55 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:05:29.973 /dev/nbd1
00:05:29.973 00:03:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:05:29.973 00:03:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:05:29.973 00:03:55 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:05:29.973 00:03:55 -- common/autotest_common.sh@867 -- # local i
00:05:29.973 00:03:55 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:29.973 00:03:55 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:29.973 00:03:55 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:05:29.973 00:03:55 -- common/autotest_common.sh@871 -- # break
00:05:29.973 00:03:55 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:29.973 00:03:55 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:29.973 00:03:55 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:29.973 1+0 records in
00:05:29.973 1+0 records out
00:05:29.973 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000175893 s, 23.3 MB/s
00:05:29.973 00:03:55 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:29.973 00:03:55 -- common/autotest_common.sh@884 -- # size=4096
00:05:29.973 00:03:55 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:29.973 00:03:55 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:29.973 00:03:55 -- common/autotest_common.sh@887 -- # return 0
00:05:29.973 00:03:55 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:29.973 00:03:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:29.973 00:03:55 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:29.973 00:03:55 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:29.973 00:03:55 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:05:30.232 {
00:05:30.232 "nbd_device": "/dev/nbd0",
00:05:30.232 "bdev_name": "Malloc0"
00:05:30.232 },
00:05:30.232 {
00:05:30.232 "nbd_device": "/dev/nbd1",
00:05:30.232 "bdev_name": "Malloc1"
00:05:30.232 }
00:05:30.232 ]'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@64 -- # echo '[
00:05:30.232 {
00:05:30.232 "nbd_device": "/dev/nbd0",
00:05:30.232 "bdev_name": "Malloc0"
00:05:30.232 },
00:05:30.232 {
00:05:30.232 "nbd_device": "/dev/nbd1",
00:05:30.232 "bdev_name": "Malloc1"
00:05:30.232 }
00:05:30.232 ]'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:05:30.232 /dev/nbd1'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:05:30.232 /dev/nbd1'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@65 -- # count=2
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@66 -- # echo 2
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@95 -- # count=2
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@71 -- # local operation=write
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:05:30.232 256+0 records in
00:05:30.232 256+0 records out
00:05:30.232 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115894 s, 90.5 MB/s
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:05:30.232 256+0 records in
00:05:30.232 256+0 records out
00:05:30.232 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197931 s, 53.0 MB/s
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:05:30.232 256+0 records in
00:05:30.232 256+0 records out
00:05:30.232 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.021061 s, 49.8 MB/s
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@71 -- # local operation=verify
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@50 -- # local nbd_list
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@51 -- # local i
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:30.232 00:03:55 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@41 -- # break
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@45 -- # return 0
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:30.491 00:03:55 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@41 -- # break
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@45 -- # return 0
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:30.750 00:03:56 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@64 -- # echo '[]'
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@65 -- # echo ''
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@65 -- # true
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@65 -- # count=0
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@66 -- # echo 0
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@104 -- # count=0
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:05:31.019 00:03:56 -- bdev/nbd_common.sh@109 -- # return 0
00:05:31.019 00:03:56 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:05:31.281 00:03:56 -- event/event.sh@35 -- # sleep 3
00:05:31.281 [2024-11-30 00:03:56.760562] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:31.281 [2024-11-30 00:03:56.824593] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:31.281 [2024-11-30 00:03:56.824595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:31.540 [2024-11-30 00:03:56.865470] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:05:31.540 [2024-11-30 00:03:56.865516] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:05:34.083 00:03:59 -- event/event.sh@23 -- # for i in {0..2}
00:05:34.083 00:03:59 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1'
00:05:34.083 spdk_app_start Round 1
00:05:34.083 00:03:59 -- event/event.sh@25 -- # waitforlisten 2701194 /var/tmp/spdk-nbd.sock
00:05:34.083 00:03:59 -- common/autotest_common.sh@829 -- # '[' -z 2701194 ']'
00:05:34.083 00:03:59 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:05:34.083 00:03:59 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:34.083 00:03:59 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:05:34.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:05:34.083 00:03:59 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:34.083 00:03:59 -- common/autotest_common.sh@10 -- # set +x
00:05:34.341 00:03:59 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:34.341 00:03:59 -- common/autotest_common.sh@862 -- # return 0
00:05:34.341 00:03:59 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:34.600 Malloc0
00:05:34.600 00:03:59 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:34.600 Malloc1
00:05:34.600 00:04:00 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@91 -- # local bdev_list
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@92 -- # local nbd_list
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@12 -- # local i
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:34.600 00:04:00 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:05:34.858 /dev/nbd0
00:05:34.858 00:04:00 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:05:34.858 00:04:00 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:05:34.858 00:04:00 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:05:34.858 00:04:00 -- common/autotest_common.sh@867 -- # local i
00:05:34.858 00:04:00 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:34.858 00:04:00 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:34.858 00:04:00 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:05:34.858 00:04:00 -- common/autotest_common.sh@871 -- # break
00:05:34.858 00:04:00 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:34.858 00:04:00 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:34.858 00:04:00 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:34.858 1+0 records in
00:05:34.858 1+0 records out
00:05:34.858 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021655 s, 18.9 MB/s
00:05:34.858 00:04:00 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:34.858 00:04:00 -- common/autotest_common.sh@884 -- # size=4096
00:05:34.858 00:04:00 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:34.858 00:04:00 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:34.858 00:04:00 -- common/autotest_common.sh@887 -- # return 0
00:05:34.858 00:04:00 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:34.858 00:04:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:34.858 00:04:00 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:05:35.116 /dev/nbd1 00:05:35.116 00:04:00 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:35.116 00:04:00 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:35.116 00:04:00 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:35.116 00:04:00 -- common/autotest_common.sh@867 -- # local i 00:05:35.116 00:04:00 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:35.116 00:04:00 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:35.117 00:04:00 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:35.117 00:04:00 -- common/autotest_common.sh@871 -- # break 00:05:35.117 00:04:00 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:35.117 00:04:00 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:35.117 00:04:00 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:35.117 1+0 records in 00:05:35.117 1+0 records out 00:05:35.117 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241523 s, 17.0 MB/s 00:05:35.117 00:04:00 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:35.117 00:04:00 -- common/autotest_common.sh@884 -- # size=4096 00:05:35.117 00:04:00 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:35.117 00:04:00 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:35.117 00:04:00 -- common/autotest_common.sh@887 -- # return 0 00:05:35.117 00:04:00 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:35.117 00:04:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:35.117 00:04:00 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:35.117 00:04:00 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:35.117 00:04:00 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:35.376 { 00:05:35.376 "nbd_device": "/dev/nbd0", 00:05:35.376 "bdev_name": "Malloc0" 00:05:35.376 }, 00:05:35.376 { 00:05:35.376 "nbd_device": "/dev/nbd1", 00:05:35.376 "bdev_name": "Malloc1" 00:05:35.376 } 00:05:35.376 ]' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:35.376 { 00:05:35.376 "nbd_device": "/dev/nbd0", 00:05:35.376 "bdev_name": "Malloc0" 00:05:35.376 }, 00:05:35.376 { 00:05:35.376 "nbd_device": "/dev/nbd1", 00:05:35.376 "bdev_name": "Malloc1" 00:05:35.376 } 00:05:35.376 ]' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:35.376 /dev/nbd1' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:35.376 /dev/nbd1' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@65 -- # count=2 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@95 -- # count=2 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:35.376 256+0 records in 00:05:35.376 256+0 records out 00:05:35.376 
1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104486 s, 100 MB/s 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:35.376 256+0 records in 00:05:35.376 256+0 records out 00:05:35.376 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.019366 s, 54.1 MB/s 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:35.376 256+0 records in 00:05:35.376 256+0 records out 00:05:35.376 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210971 s, 49.7 MB/s 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@85 -- # rm 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@51 -- # local i 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:35.376 00:04:00 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@41 -- # break 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@45 -- # return 0 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:35.635 00:04:01 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:35.894 00:04:01 -- 
bdev/nbd_common.sh@41 -- # break 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@45 -- # return 0 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:35.894 00:04:01 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@65 -- # true 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@65 -- # count=0 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@104 -- # count=0 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:36.153 00:04:01 -- bdev/nbd_common.sh@109 -- # return 0 00:05:36.153 00:04:01 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:36.153 00:04:01 -- event/event.sh@35 -- # sleep 3 00:05:36.413 [2024-11-30 00:04:01.850814] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:36.413 [2024-11-30 00:04:01.915516] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:36.413 [2024-11-30 00:04:01.915519] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.413 [2024-11-30 00:04:01.956525] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 
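The write/verify cycle traced above (`nbd_dd_data_verify` in `nbd_common.sh`) boils down to: fill a temp file with random data, `dd` it onto each device with O_DIRECT, then `cmp` the device contents back against the file. A self-contained sketch of that pattern, using a second plain file in place of `/dev/nbd0` (paths and sizes here are illustrative, not the test's real ones):

```shell
#!/usr/bin/env bash
set -e

tmp_file=$(mktemp)     # stands in for .../test/event/nbdrandtest
device=$(mktemp)       # stands in for /dev/nbd0

# Write phase: generate 1 MiB of random data, then copy it to the "device".
# (Against a real block device the second dd would add oflag=direct.)
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 status=none
dd if="$tmp_file" of="$device" bs=4096 count=256 status=none

# Verify phase: byte-compare the first 1 MiB, as `cmp -b -n 1M` does in the log
# (-s is used here instead of -b since we only need the exit status).
if cmp -s -n 1048576 "$tmp_file" "$device"; then
    echo "verify OK"
fi

rm -f "$tmp_file" "$device"
```

If the compare fails, `cmp` exits non-zero and `set -e` aborts the script, which is exactly how a corrupted nbd round would surface in the trace above.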
00:05:36.413 [2024-11-30 00:04:01.956570] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:39.700 00:04:04 -- event/event.sh@23 -- # for i in {0..2} 00:05:39.700 00:04:04 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:39.700 spdk_app_start Round 2 00:05:39.700 00:04:04 -- event/event.sh@25 -- # waitforlisten 2701194 /var/tmp/spdk-nbd.sock 00:05:39.700 00:04:04 -- common/autotest_common.sh@829 -- # '[' -z 2701194 ']' 00:05:39.700 00:04:04 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:39.700 00:04:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:39.700 00:04:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:39.700 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:39.700 00:04:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:39.700 00:04:04 -- common/autotest_common.sh@10 -- # set +x 00:05:39.700 00:04:04 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:39.700 00:04:04 -- common/autotest_common.sh@862 -- # return 0 00:05:39.700 00:04:04 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:39.700 Malloc0 00:05:39.700 00:04:05 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:39.700 Malloc1 00:05:39.700 00:04:05 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:39.700 00:04:05 
-- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@12 -- # local i 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:39.700 00:04:05 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:39.957 /dev/nbd0 00:05:39.957 00:04:05 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:39.957 00:04:05 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:39.957 00:04:05 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:39.957 00:04:05 -- common/autotest_common.sh@867 -- # local i 00:05:39.957 00:04:05 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:39.957 00:04:05 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:39.957 00:04:05 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:39.957 00:04:05 -- common/autotest_common.sh@871 -- # break 00:05:39.957 00:04:05 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:39.957 00:04:05 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:39.957 00:04:05 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:39.957 1+0 records in 
00:05:39.957 1+0 records out 00:05:39.957 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268497 s, 15.3 MB/s 00:05:39.957 00:04:05 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:39.957 00:04:05 -- common/autotest_common.sh@884 -- # size=4096 00:05:39.957 00:04:05 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:39.957 00:04:05 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:39.957 00:04:05 -- common/autotest_common.sh@887 -- # return 0 00:05:39.957 00:04:05 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:39.957 00:04:05 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:39.957 00:04:05 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:40.216 /dev/nbd1 00:05:40.216 00:04:05 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:40.216 00:04:05 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:40.216 00:04:05 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:40.216 00:04:05 -- common/autotest_common.sh@867 -- # local i 00:05:40.216 00:04:05 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:40.216 00:04:05 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:40.216 00:04:05 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:40.216 00:04:05 -- common/autotest_common.sh@871 -- # break 00:05:40.216 00:04:05 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:40.216 00:04:05 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:40.216 00:04:05 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:40.216 1+0 records in 00:05:40.216 1+0 records out 00:05:40.216 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225303 s, 18.2 MB/s 00:05:40.216 00:04:05 
-- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:40.216 00:04:05 -- common/autotest_common.sh@884 -- # size=4096 00:05:40.216 00:04:05 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:40.216 00:04:05 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:40.216 00:04:05 -- common/autotest_common.sh@887 -- # return 0 00:05:40.216 00:04:05 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:40.216 00:04:05 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:40.216 00:04:05 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:40.216 00:04:05 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:40.216 00:04:05 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:40.476 { 00:05:40.476 "nbd_device": "/dev/nbd0", 00:05:40.476 "bdev_name": "Malloc0" 00:05:40.476 }, 00:05:40.476 { 00:05:40.476 "nbd_device": "/dev/nbd1", 00:05:40.476 "bdev_name": "Malloc1" 00:05:40.476 } 00:05:40.476 ]' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:40.476 { 00:05:40.476 "nbd_device": "/dev/nbd0", 00:05:40.476 "bdev_name": "Malloc0" 00:05:40.476 }, 00:05:40.476 { 00:05:40.476 "nbd_device": "/dev/nbd1", 00:05:40.476 "bdev_name": "Malloc1" 00:05:40.476 } 00:05:40.476 ]' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:40.476 /dev/nbd1' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:40.476 /dev/nbd1' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@65 -- # count=2 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@66 -- # echo 2 
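The count check above (`nbd_get_count`) pipes the jq-extracted device names through `grep -c /dev/nbd`. One detail visible in the trace: `grep -c` prints `0` but exits with status 1 when nothing matches, which is why the no-disk pass earlier in the log shows a bare `+ true` after the grep. A minimal sketch of that guard:

```shell
#!/usr/bin/env bash
set -e

count_nbd() {
    # grep -c prints the number of matching lines, but exits with status 1
    # when that number is 0 -- guard with || true so the zero-count case
    # does not trip `set -e` (the "+ true" line in the log above).
    printf '%s\n' "$1" | grep -c /dev/nbd || true
}

count_nbd $'/dev/nbd0\n/dev/nbd1'   # prints 2
count_nbd ''                        # prints 0 instead of aborting
```

Without the `|| true`, the teardown path (no disks left, count expected to be 0) would kill the whole test script on what is actually the success case.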
00:05:40.476 00:04:05 -- bdev/nbd_common.sh@95 -- # count=2 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:40.476 256+0 records in 00:05:40.476 256+0 records out 00:05:40.476 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106919 s, 98.1 MB/s 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:40.476 256+0 records in 00:05:40.476 256+0 records out 00:05:40.476 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199217 s, 52.6 MB/s 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:40.476 256+0 records in 00:05:40.476 256+0 records out 00:05:40.476 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211766 s, 49.5 MB/s 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@70 -- # 
local nbd_list 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@51 -- # local i 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:40.476 00:04:05 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:40.735 
00:04:06 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@41 -- # break 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@45 -- # return 0 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:40.735 00:04:06 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@41 -- # break 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@45 -- # return 0 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:40.994 00:04:06 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:41.254 00:04:06 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:41.254 00:04:06 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:41.254 00:04:06 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:41.254 00:04:06 -- bdev/nbd_common.sh@65 -- # true 00:05:41.254 00:04:06 -- bdev/nbd_common.sh@65 -- # count=0 00:05:41.254 00:04:06 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:41.254 00:04:06 -- bdev/nbd_common.sh@104 -- # count=0 00:05:41.254 00:04:06 -- 
bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:41.254 00:04:06 -- bdev/nbd_common.sh@109 -- # return 0 00:05:41.254 00:04:06 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:41.254 00:04:06 -- event/event.sh@35 -- # sleep 3 00:05:41.513 [2024-11-30 00:04:06.975192] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:41.513 [2024-11-30 00:04:07.039715] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:41.513 [2024-11-30 00:04:07.039721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.772 [2024-11-30 00:04:07.080592] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:41.772 [2024-11-30 00:04:07.080636] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:44.317 00:04:09 -- event/event.sh@38 -- # waitforlisten 2701194 /var/tmp/spdk-nbd.sock 00:05:44.317 00:04:09 -- common/autotest_common.sh@829 -- # '[' -z 2701194 ']' 00:05:44.317 00:04:09 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:44.317 00:04:09 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:44.317 00:04:09 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:44.317 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
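Each `nbd_stop_disk` step above ends with `waitfornbd_exit`, a bounded polling loop that re-greps `/proc/partitions` until the device name disappears (up to 20 tries). A generic sketch of that retry pattern, with a counter-backed stub standing in for the grep (the stub, names, and limits are illustrative):

```shell
#!/usr/bin/env bash

polls=0
device_present() {
    # Stand-in for: grep -q -w nbd0 /proc/partitions
    polls=$((polls + 1))
    (( polls < 4 ))     # "present" for the first three polls, then gone
}

i=1
while (( i <= 20 )); do
    if ! device_present; then
        break           # device no longer listed -- done
    fi
    sleep 0.1
    i=$((i + 1))
done

if (( i > 20 )); then
    echo "nbd device did not detach" >&2
    exit 1
fi
echo "detached after $polls polls"
```

The bounded retry count is what turns a stuck nbd teardown into a visible test failure rather than a hung job.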
00:05:44.317 00:04:09 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:44.317 00:04:09 -- common/autotest_common.sh@10 -- # set +x 00:05:44.575 00:04:09 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:44.575 00:04:09 -- common/autotest_common.sh@862 -- # return 0 00:05:44.575 00:04:09 -- event/event.sh@39 -- # killprocess 2701194 00:05:44.575 00:04:09 -- common/autotest_common.sh@936 -- # '[' -z 2701194 ']' 00:05:44.575 00:04:09 -- common/autotest_common.sh@940 -- # kill -0 2701194 00:05:44.575 00:04:09 -- common/autotest_common.sh@941 -- # uname 00:05:44.575 00:04:09 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:44.575 00:04:09 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2701194 00:05:44.575 00:04:10 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:44.575 00:04:10 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:44.575 00:04:10 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2701194' 00:05:44.575 killing process with pid 2701194 00:05:44.575 00:04:10 -- common/autotest_common.sh@955 -- # kill 2701194 00:05:44.575 00:04:10 -- common/autotest_common.sh@960 -- # wait 2701194 00:05:44.833 spdk_app_start is called in Round 0. 00:05:44.833 Shutdown signal received, stop current app iteration 00:05:44.833 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:44.833 spdk_app_start is called in Round 1. 00:05:44.833 Shutdown signal received, stop current app iteration 00:05:44.833 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:44.833 spdk_app_start is called in Round 2. 00:05:44.833 Shutdown signal received, stop current app iteration 00:05:44.833 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:44.833 spdk_app_start is called in Round 3. 
00:05:44.833 Shutdown signal received, stop current app iteration 00:05:44.833 00:04:10 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:44.834 00:04:10 -- event/event.sh@42 -- # return 0 00:05:44.834 00:05:44.834 real 0m16.455s 00:05:44.834 user 0m35.093s 00:05:44.834 sys 0m3.072s 00:05:44.834 00:04:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:44.834 00:04:10 -- common/autotest_common.sh@10 -- # set +x 00:05:44.834 ************************************ 00:05:44.834 END TEST app_repeat 00:05:44.834 ************************************ 00:05:44.834 00:04:10 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:44.834 00:04:10 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:44.834 00:04:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:44.834 00:04:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:44.834 00:04:10 -- common/autotest_common.sh@10 -- # set +x 00:05:44.834 ************************************ 00:05:44.834 START TEST cpu_locks 00:05:44.834 ************************************ 00:05:44.834 00:04:10 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:44.834 * Looking for test storage... 
00:05:44.834 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:44.834 00:04:10 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:44.834 00:04:10 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:44.834 00:04:10 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:45.092 00:04:10 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:45.093 00:04:10 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:45.093 00:04:10 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:45.093 00:04:10 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:45.093 00:04:10 -- scripts/common.sh@335 -- # IFS=.-: 00:05:45.093 00:04:10 -- scripts/common.sh@335 -- # read -ra ver1 00:05:45.093 00:04:10 -- scripts/common.sh@336 -- # IFS=.-: 00:05:45.093 00:04:10 -- scripts/common.sh@336 -- # read -ra ver2 00:05:45.093 00:04:10 -- scripts/common.sh@337 -- # local 'op=<' 00:05:45.093 00:04:10 -- scripts/common.sh@339 -- # ver1_l=2 00:05:45.093 00:04:10 -- scripts/common.sh@340 -- # ver2_l=1 00:05:45.093 00:04:10 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:45.093 00:04:10 -- scripts/common.sh@343 -- # case "$op" in 00:05:45.093 00:04:10 -- scripts/common.sh@344 -- # : 1 00:05:45.093 00:04:10 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:45.093 00:04:10 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:45.093 00:04:10 -- scripts/common.sh@364 -- # decimal 1 00:05:45.093 00:04:10 -- scripts/common.sh@352 -- # local d=1 00:05:45.093 00:04:10 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:45.093 00:04:10 -- scripts/common.sh@354 -- # echo 1 00:05:45.093 00:04:10 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:45.093 00:04:10 -- scripts/common.sh@365 -- # decimal 2 00:05:45.093 00:04:10 -- scripts/common.sh@352 -- # local d=2 00:05:45.093 00:04:10 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:45.093 00:04:10 -- scripts/common.sh@354 -- # echo 2 00:05:45.093 00:04:10 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:45.093 00:04:10 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:45.093 00:04:10 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:45.093 00:04:10 -- scripts/common.sh@367 -- # return 0 00:05:45.093 00:04:10 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:45.093 00:04:10 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:45.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.093 --rc genhtml_branch_coverage=1 00:05:45.093 --rc genhtml_function_coverage=1 00:05:45.093 --rc genhtml_legend=1 00:05:45.093 --rc geninfo_all_blocks=1 00:05:45.093 --rc geninfo_unexecuted_blocks=1 00:05:45.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.093 ' 00:05:45.093 00:04:10 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:45.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.093 --rc genhtml_branch_coverage=1 00:05:45.093 --rc genhtml_function_coverage=1 00:05:45.093 --rc genhtml_legend=1 00:05:45.093 --rc geninfo_all_blocks=1 00:05:45.093 --rc geninfo_unexecuted_blocks=1 00:05:45.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.093 ' 00:05:45.093 00:04:10 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:45.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.093 --rc genhtml_branch_coverage=1 00:05:45.093 --rc genhtml_function_coverage=1 00:05:45.093 --rc genhtml_legend=1 00:05:45.093 --rc geninfo_all_blocks=1 00:05:45.093 --rc geninfo_unexecuted_blocks=1 00:05:45.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.093 ' 00:05:45.093 00:04:10 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:45.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.093 --rc genhtml_branch_coverage=1 00:05:45.093 --rc genhtml_function_coverage=1 00:05:45.093 --rc genhtml_legend=1 00:05:45.093 --rc geninfo_all_blocks=1 00:05:45.093 --rc geninfo_unexecuted_blocks=1 00:05:45.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.093 ' 00:05:45.093 00:04:10 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:45.093 00:04:10 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:45.093 00:04:10 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:45.093 00:04:10 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:45.093 00:04:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:45.093 00:04:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:45.093 00:04:10 -- common/autotest_common.sh@10 -- # set +x 00:05:45.093 ************************************ 00:05:45.093 START TEST default_locks 00:05:45.093 ************************************ 00:05:45.093 00:04:10 -- common/autotest_common.sh@1114 -- # default_locks 00:05:45.093 00:04:10 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=2704399 00:05:45.093 00:04:10 -- event/cpu_locks.sh@47 -- # waitforlisten 2704399 00:05:45.093 00:04:10 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:45.093 00:04:10 
-- common/autotest_common.sh@829 -- # '[' -z 2704399 ']' 00:05:45.093 00:04:10 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:45.093 00:04:10 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:45.093 00:04:10 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:45.093 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:45.093 00:04:10 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:45.093 00:04:10 -- common/autotest_common.sh@10 -- # set +x 00:05:45.093 [2024-11-30 00:04:10.470074] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:45.093 [2024-11-30 00:04:10.470141] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2704399 ] 00:05:45.093 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.093 [2024-11-30 00:04:10.536464] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.093 [2024-11-30 00:04:10.606320] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:45.093 [2024-11-30 00:04:10.606467] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.031 00:04:11 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:46.031 00:04:11 -- common/autotest_common.sh@862 -- # return 0 00:05:46.031 00:04:11 -- event/cpu_locks.sh@49 -- # locks_exist 2704399 00:05:46.031 00:04:11 -- event/cpu_locks.sh@22 -- # lslocks -p 2704399 00:05:46.031 00:04:11 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:46.600 lslocks: write error 00:05:46.600 00:04:11 -- event/cpu_locks.sh@50 -- # killprocess 2704399 00:05:46.600 00:04:11 -- common/autotest_common.sh@936 -- # '[' -z 2704399 ']' 00:05:46.600 
00:04:11 -- common/autotest_common.sh@940 -- # kill -0 2704399 00:05:46.600 00:04:11 -- common/autotest_common.sh@941 -- # uname 00:05:46.600 00:04:11 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:46.600 00:04:11 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2704399 00:05:46.600 00:04:11 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:46.600 00:04:11 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:46.600 00:04:11 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2704399' 00:05:46.600 killing process with pid 2704399 00:05:46.600 00:04:11 -- common/autotest_common.sh@955 -- # kill 2704399 00:05:46.600 00:04:11 -- common/autotest_common.sh@960 -- # wait 2704399 00:05:46.858 00:04:12 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 2704399 00:05:46.859 00:04:12 -- common/autotest_common.sh@650 -- # local es=0 00:05:46.859 00:04:12 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 2704399 00:05:46.859 00:04:12 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:46.859 00:04:12 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:46.859 00:04:12 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:46.859 00:04:12 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:46.859 00:04:12 -- common/autotest_common.sh@653 -- # waitforlisten 2704399 00:05:46.859 00:04:12 -- common/autotest_common.sh@829 -- # '[' -z 2704399 ']' 00:05:46.859 00:04:12 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:46.859 00:04:12 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:46.859 00:04:12 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:46.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
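The `NOT waitforlisten 2704399` sequence above comes from a small `autotest_common.sh` wrapper that inverts a command's exit status, so a step that is *supposed* to fail (here: waiting on a PID that was already killed) counts as a pass. A minimal sketch of that inversion pattern:

```shell
#!/usr/bin/env bash
# Minimal sketch of the NOT helper: succeed only if the wrapped command
# fails, and fail if it unexpectedly succeeds.
NOT() {
    if "$@"; then
        return 1    # wrapped command succeeded, but failure was expected
    fi
    return 0
}

NOT false && echo "expected failure observed"
NOT true  || echo "unexpected success rejected"
```

The real helper is more elaborate: the `es=1` / `(( es > 128 ))` lines in the trace show it also recording the exit status and distinguishing signal-terminated exits.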
00:05:46.859 00:04:12 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:46.859 00:04:12 -- common/autotest_common.sh@10 -- # set +x 00:05:46.859 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (2704399) - No such process 00:05:46.859 ERROR: process (pid: 2704399) is no longer running 00:05:46.859 00:04:12 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:46.859 00:04:12 -- common/autotest_common.sh@862 -- # return 1 00:05:46.859 00:04:12 -- common/autotest_common.sh@653 -- # es=1 00:05:46.859 00:04:12 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:46.859 00:04:12 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:46.859 00:04:12 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:46.859 00:04:12 -- event/cpu_locks.sh@54 -- # no_locks 00:05:46.859 00:04:12 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:46.859 00:04:12 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:46.859 00:04:12 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:46.859 00:05:46.859 real 0m1.844s 00:05:46.859 user 0m1.960s 00:05:46.859 sys 0m0.707s 00:05:46.859 00:04:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:46.859 00:04:12 -- common/autotest_common.sh@10 -- # set +x 00:05:46.859 ************************************ 00:05:46.859 END TEST default_locks 00:05:46.859 ************************************ 00:05:46.859 00:04:12 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:46.859 00:04:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:46.859 00:04:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:46.859 00:04:12 -- common/autotest_common.sh@10 -- # set +x 00:05:46.859 ************************************ 00:05:46.859 START TEST default_locks_via_rpc 00:05:46.859 ************************************ 00:05:46.859 00:04:12 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:05:46.859 00:04:12 -- 
event/cpu_locks.sh@62 -- # spdk_tgt_pid=2704705 00:05:46.859 00:04:12 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:46.859 00:04:12 -- event/cpu_locks.sh@63 -- # waitforlisten 2704705 00:05:46.859 00:04:12 -- common/autotest_common.sh@829 -- # '[' -z 2704705 ']' 00:05:46.859 00:04:12 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:46.859 00:04:12 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:46.859 00:04:12 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:46.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:46.859 00:04:12 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:46.859 00:04:12 -- common/autotest_common.sh@10 -- # set +x 00:05:46.859 [2024-11-30 00:04:12.362098] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:46.859 [2024-11-30 00:04:12.362166] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2704705 ] 00:05:46.859 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.122 [2024-11-30 00:04:12.430423] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.122 [2024-11-30 00:04:12.505651] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:47.122 [2024-11-30 00:04:12.505801] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.691 00:04:13 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:47.691 00:04:13 -- common/autotest_common.sh@862 -- # return 0 00:05:47.691 00:04:13 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:47.691 00:04:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.691 00:04:13 -- common/autotest_common.sh@10 -- # set +x 00:05:47.691 00:04:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.691 00:04:13 -- event/cpu_locks.sh@67 -- # no_locks 00:05:47.691 00:04:13 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:47.691 00:04:13 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:47.691 00:04:13 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:47.691 00:04:13 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:47.691 00:04:13 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.691 00:04:13 -- common/autotest_common.sh@10 -- # set +x 00:05:47.691 00:04:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.691 00:04:13 -- event/cpu_locks.sh@71 -- # locks_exist 2704705 00:05:47.691 00:04:13 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:47.691 00:04:13 -- event/cpu_locks.sh@22 -- # lslocks -p 2704705 00:05:48.258 00:04:13 -- event/cpu_locks.sh@73 -- # killprocess 2704705 
00:05:48.258 00:04:13 -- common/autotest_common.sh@936 -- # '[' -z 2704705 ']' 00:05:48.258 00:04:13 -- common/autotest_common.sh@940 -- # kill -0 2704705 00:05:48.258 00:04:13 -- common/autotest_common.sh@941 -- # uname 00:05:48.258 00:04:13 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:48.258 00:04:13 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2704705 00:05:48.258 00:04:13 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:48.258 00:04:13 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:48.258 00:04:13 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2704705' 00:05:48.258 killing process with pid 2704705 00:05:48.258 00:04:13 -- common/autotest_common.sh@955 -- # kill 2704705 00:05:48.258 00:04:13 -- common/autotest_common.sh@960 -- # wait 2704705 00:05:48.517 00:05:48.517 real 0m1.557s 00:05:48.517 user 0m1.624s 00:05:48.517 sys 0m0.534s 00:05:48.517 00:04:13 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:48.517 00:04:13 -- common/autotest_common.sh@10 -- # set +x 00:05:48.517 ************************************ 00:05:48.517 END TEST default_locks_via_rpc 00:05:48.517 ************************************ 00:05:48.517 00:04:13 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:48.517 00:04:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:48.517 00:04:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:48.517 00:04:13 -- common/autotest_common.sh@10 -- # set +x 00:05:48.517 ************************************ 00:05:48.517 START TEST non_locking_app_on_locked_coremask 00:05:48.517 ************************************ 00:05:48.517 00:04:13 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:05:48.517 00:04:13 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=2705021 00:05:48.517 00:04:13 -- event/cpu_locks.sh@81 -- # waitforlisten 2705021 
/var/tmp/spdk.sock 00:05:48.517 00:04:13 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:48.517 00:04:13 -- common/autotest_common.sh@829 -- # '[' -z 2705021 ']' 00:05:48.517 00:04:13 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.517 00:04:13 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:48.517 00:04:13 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.517 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.517 00:04:13 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:48.517 00:04:13 -- common/autotest_common.sh@10 -- # set +x 00:05:48.517 [2024-11-30 00:04:13.964761] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:48.517 [2024-11-30 00:04:13.964831] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2705021 ] 00:05:48.517 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.517 [2024-11-30 00:04:14.032627] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.787 [2024-11-30 00:04:14.107498] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:48.787 [2024-11-30 00:04:14.107674] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.357 00:04:14 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:49.357 00:04:14 -- common/autotest_common.sh@862 -- # return 0 00:05:49.357 00:04:14 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:49.357 00:04:14 -- event/cpu_locks.sh@84 -- # 
spdk_tgt_pid2=2705271 00:05:49.357 00:04:14 -- event/cpu_locks.sh@85 -- # waitforlisten 2705271 /var/tmp/spdk2.sock 00:05:49.357 00:04:14 -- common/autotest_common.sh@829 -- # '[' -z 2705271 ']' 00:05:49.357 00:04:14 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:49.357 00:04:14 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:49.357 00:04:14 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:49.357 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:49.357 00:04:14 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:49.357 00:04:14 -- common/autotest_common.sh@10 -- # set +x 00:05:49.357 [2024-11-30 00:04:14.807163] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:49.357 [2024-11-30 00:04:14.807211] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2705271 ] 00:05:49.357 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.357 [2024-11-30 00:04:14.896602] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:49.357 [2024-11-30 00:04:14.896630] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.617 [2024-11-30 00:04:15.042452] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:49.617 [2024-11-30 00:04:15.042594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.184 00:04:15 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:50.184 00:04:15 -- common/autotest_common.sh@862 -- # return 0 00:05:50.184 00:04:15 -- event/cpu_locks.sh@87 -- # locks_exist 2705021 00:05:50.184 00:04:15 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:50.184 00:04:15 -- event/cpu_locks.sh@22 -- # lslocks -p 2705021 00:05:51.561 lslocks: write error 00:05:51.561 00:04:16 -- event/cpu_locks.sh@89 -- # killprocess 2705021 00:05:51.561 00:04:16 -- common/autotest_common.sh@936 -- # '[' -z 2705021 ']' 00:05:51.561 00:04:16 -- common/autotest_common.sh@940 -- # kill -0 2705021 00:05:51.561 00:04:16 -- common/autotest_common.sh@941 -- # uname 00:05:51.561 00:04:16 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:51.561 00:04:16 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2705021 00:05:51.561 00:04:16 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:51.561 00:04:16 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:51.561 00:04:16 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2705021' 00:05:51.561 killing process with pid 2705021 00:05:51.561 00:04:16 -- common/autotest_common.sh@955 -- # kill 2705021 00:05:51.561 00:04:16 -- common/autotest_common.sh@960 -- # wait 2705021 00:05:52.130 00:04:17 -- event/cpu_locks.sh@90 -- # killprocess 2705271 00:05:52.130 00:04:17 -- common/autotest_common.sh@936 -- # '[' -z 2705271 ']' 00:05:52.130 00:04:17 -- common/autotest_common.sh@940 -- # kill -0 2705271 00:05:52.130 00:04:17 -- common/autotest_common.sh@941 -- # uname 00:05:52.130 00:04:17 -- 
common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:52.130 00:04:17 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2705271 00:05:52.130 00:04:17 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:52.130 00:04:17 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:52.130 00:04:17 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2705271' 00:05:52.130 killing process with pid 2705271 00:05:52.130 00:04:17 -- common/autotest_common.sh@955 -- # kill 2705271 00:05:52.130 00:04:17 -- common/autotest_common.sh@960 -- # wait 2705271 00:05:52.390 00:05:52.390 real 0m4.000s 00:05:52.390 user 0m4.287s 00:05:52.390 sys 0m1.311s 00:05:52.390 00:04:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:52.390 00:04:17 -- common/autotest_common.sh@10 -- # set +x 00:05:52.390 ************************************ 00:05:52.390 END TEST non_locking_app_on_locked_coremask 00:05:52.390 ************************************ 00:05:52.657 00:04:17 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:05:52.657 00:04:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:52.657 00:04:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:52.657 00:04:17 -- common/autotest_common.sh@10 -- # set +x 00:05:52.657 ************************************ 00:05:52.657 START TEST locking_app_on_unlocked_coremask 00:05:52.657 ************************************ 00:05:52.657 00:04:17 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:05:52.657 00:04:17 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=2705844 00:05:52.657 00:04:17 -- event/cpu_locks.sh@99 -- # waitforlisten 2705844 /var/tmp/spdk.sock 00:05:52.657 00:04:17 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:52.657 00:04:17 -- common/autotest_common.sh@829 -- # '[' -z 2705844 ']' 
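[editor's aside] The `locks_exist` helper exercised in the traces above checks whether the target still holds its CPU core lock by grepping `lslocks -p <pid>` for the `spdk_cpu_lock` files. A minimal stand-alone sketch (not SPDK code; the file name is an illustrative stand-in for `/var/tmp/spdk_cpu_lock_000`, and only util-linux `flock`/`lslocks` are assumed):

```shell
# Hold an advisory lock ourselves, the way spdk_tgt's reactor does, then
# inspect it with lslocks as the locks_exist helper does.
lock=$(mktemp /tmp/demo_cpu_lock_XXXXXX)   # stand-in for spdk_cpu_lock_000
exec 9>"$lock"
flock -n 9 && held=yes                     # non-blocking exclusive lock on fd 9
# Whether lslocks resolves the path column depends on util-linux/procfs;
# shown for illustration only:
lslocks -p $$ | grep demo_cpu_lock || true
exec 9>&-                                  # closing the fd releases the lock
rm -f "$lock"
echo "held=$held"
```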
00:05:52.657 00:04:17 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:52.657 00:04:17 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:52.657 00:04:17 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:52.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:52.657 00:04:17 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:52.657 00:04:17 -- common/autotest_common.sh@10 -- # set +x 00:05:52.657 [2024-11-30 00:04:18.014290] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:52.657 [2024-11-30 00:04:18.014381] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2705844 ] 00:05:52.657 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.657 [2024-11-30 00:04:18.077629] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:52.657 [2024-11-30 00:04:18.077661] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.657 [2024-11-30 00:04:18.151858] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:52.657 [2024-11-30 00:04:18.152007] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.345 00:04:18 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:53.345 00:04:18 -- common/autotest_common.sh@862 -- # return 0 00:05:53.345 00:04:18 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=2706051 00:05:53.345 00:04:18 -- event/cpu_locks.sh@103 -- # waitforlisten 2706051 /var/tmp/spdk2.sock 00:05:53.345 00:04:18 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:53.345 00:04:18 -- common/autotest_common.sh@829 -- # '[' -z 2706051 ']' 00:05:53.345 00:04:18 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:53.345 00:04:18 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:53.345 00:04:18 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:53.345 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:53.345 00:04:18 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:53.345 00:04:18 -- common/autotest_common.sh@10 -- # set +x 00:05:53.345 [2024-11-30 00:04:18.855859] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:53.345 [2024-11-30 00:04:18.855923] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2706051 ] 00:05:53.345 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.605 [2024-11-30 00:04:18.948928] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.605 [2024-11-30 00:04:19.088578] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:53.605 [2024-11-30 00:04:19.092734] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.173 00:04:19 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:54.173 00:04:19 -- common/autotest_common.sh@862 -- # return 0 00:05:54.173 00:04:19 -- event/cpu_locks.sh@105 -- # locks_exist 2706051 00:05:54.173 00:04:19 -- event/cpu_locks.sh@22 -- # lslocks -p 2706051 00:05:54.173 00:04:19 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:55.110 lslocks: write error 00:05:55.110 00:04:20 -- event/cpu_locks.sh@107 -- # killprocess 2705844 00:05:55.110 00:04:20 -- common/autotest_common.sh@936 -- # '[' -z 2705844 ']' 00:05:55.110 00:04:20 -- common/autotest_common.sh@940 -- # kill -0 2705844 00:05:55.110 00:04:20 -- common/autotest_common.sh@941 -- # uname 00:05:55.110 00:04:20 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:55.110 00:04:20 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2705844 00:05:55.110 00:04:20 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:55.110 00:04:20 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:55.110 00:04:20 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2705844' 00:05:55.110 killing process with pid 2705844 00:05:55.110 00:04:20 -- common/autotest_common.sh@955 -- # kill 2705844 00:05:55.110 00:04:20 -- common/autotest_common.sh@960 -- # 
wait 2705844 00:05:55.679 00:04:21 -- event/cpu_locks.sh@108 -- # killprocess 2706051 00:05:55.679 00:04:21 -- common/autotest_common.sh@936 -- # '[' -z 2706051 ']' 00:05:55.679 00:04:21 -- common/autotest_common.sh@940 -- # kill -0 2706051 00:05:55.679 00:04:21 -- common/autotest_common.sh@941 -- # uname 00:05:55.679 00:04:21 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:55.679 00:04:21 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2706051 00:05:55.679 00:04:21 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:55.679 00:04:21 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:55.679 00:04:21 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2706051' 00:05:55.679 killing process with pid 2706051 00:05:55.679 00:04:21 -- common/autotest_common.sh@955 -- # kill 2706051 00:05:55.679 00:04:21 -- common/autotest_common.sh@960 -- # wait 2706051 00:05:55.938 00:05:55.938 real 0m3.405s 00:05:55.938 user 0m3.665s 00:05:55.938 sys 0m1.071s 00:05:55.938 00:04:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:55.938 00:04:21 -- common/autotest_common.sh@10 -- # set +x 00:05:55.938 ************************************ 00:05:55.938 END TEST locking_app_on_unlocked_coremask 00:05:55.938 ************************************ 00:05:55.938 00:04:21 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:55.938 00:04:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:55.938 00:04:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:55.938 00:04:21 -- common/autotest_common.sh@10 -- # set +x 00:05:55.938 ************************************ 00:05:55.938 START TEST locking_app_on_locked_coremask 00:05:55.938 ************************************ 00:05:55.938 00:04:21 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:05:55.938 00:04:21 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=2706433 
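[editor's aside] The `killprocess` flow that recurs in these traces probes liveness with `kill -0` (no signal is delivered) and checks `ps -o comm=` so a recycled PID belonging to some other program is not killed (the log's `reactor_0` vs `sudo` comparison). A sketch with an illustrative target process:

```shell
# Start a throwaway process, verify it is alive and correctly named,
# kill it, reap it, and confirm the PID no longer exists.
sleep 30 &
pid=$!
kill -0 "$pid" && alive=yes                # probe: signal 0 delivers nothing
name=$(ps --no-headers -o comm= "$pid")    # guard against PID reuse
kill "$pid"
wait "$pid" 2>/dev/null || true            # reap, mirroring the log's `wait`
kill -0 "$pid" 2>/dev/null || gone=yes     # probe again: PID is gone
echo "alive=$alive name=$name gone=$gone"
```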
00:05:55.938 00:04:21 -- event/cpu_locks.sh@116 -- # waitforlisten 2706433 /var/tmp/spdk.sock 00:05:55.938 00:04:21 -- common/autotest_common.sh@829 -- # '[' -z 2706433 ']' 00:05:55.938 00:04:21 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:55.938 00:04:21 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:55.938 00:04:21 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:55.938 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:55.938 00:04:21 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:55.938 00:04:21 -- common/autotest_common.sh@10 -- # set +x 00:05:55.938 00:04:21 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:55.938 [2024-11-30 00:04:21.458006] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:55.938 [2024-11-30 00:04:21.458095] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2706433 ] 00:05:55.938 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.198 [2024-11-30 00:04:21.527055] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.198 [2024-11-30 00:04:21.603297] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:56.198 [2024-11-30 00:04:21.603443] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.796 00:04:22 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:56.796 00:04:22 -- common/autotest_common.sh@862 -- # return 0 00:05:56.796 00:04:22 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=2706697 00:05:56.796 00:04:22 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 2706697 
/var/tmp/spdk2.sock 00:05:56.796 00:04:22 -- common/autotest_common.sh@650 -- # local es=0 00:05:56.796 00:04:22 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 2706697 /var/tmp/spdk2.sock 00:05:56.796 00:04:22 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:56.796 00:04:22 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:56.796 00:04:22 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.796 00:04:22 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:56.796 00:04:22 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:56.796 00:04:22 -- common/autotest_common.sh@653 -- # waitforlisten 2706697 /var/tmp/spdk2.sock 00:05:56.796 00:04:22 -- common/autotest_common.sh@829 -- # '[' -z 2706697 ']' 00:05:56.796 00:04:22 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:56.796 00:04:22 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:56.796 00:04:22 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:56.796 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:56.796 00:04:22 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:56.796 00:04:22 -- common/autotest_common.sh@10 -- # set +x 00:05:56.796 [2024-11-30 00:04:22.294808] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:56.796 [2024-11-30 00:04:22.294893] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2706697 ] 00:05:56.796 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.055 [2024-11-30 00:04:22.383446] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 2706433 has claimed it. 00:05:57.055 [2024-11-30 00:04:22.383481] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:57.623 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (2706697) - No such process 00:05:57.623 ERROR: process (pid: 2706697) is no longer running 00:05:57.623 00:04:22 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:57.623 00:04:22 -- common/autotest_common.sh@862 -- # return 1 00:05:57.623 00:04:22 -- common/autotest_common.sh@653 -- # es=1 00:05:57.623 00:04:22 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:57.623 00:04:22 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:57.623 00:04:22 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:57.623 00:04:22 -- event/cpu_locks.sh@122 -- # locks_exist 2706433 00:05:57.623 00:04:22 -- event/cpu_locks.sh@22 -- # lslocks -p 2706433 00:05:57.623 00:04:22 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:57.623 lslocks: write error 00:05:57.623 00:04:23 -- event/cpu_locks.sh@124 -- # killprocess 2706433 00:05:57.623 00:04:23 -- common/autotest_common.sh@936 -- # '[' -z 2706433 ']' 00:05:57.623 00:04:23 -- common/autotest_common.sh@940 -- # kill -0 2706433 00:05:57.624 00:04:23 -- common/autotest_common.sh@941 -- # uname 00:05:57.624 00:04:23 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:57.624 00:04:23 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2706433 00:05:57.884 00:04:23 
-- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:57.884 00:04:23 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:57.884 00:04:23 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2706433' 00:05:57.884 killing process with pid 2706433 00:05:57.884 00:04:23 -- common/autotest_common.sh@955 -- # kill 2706433 00:05:57.884 00:04:23 -- common/autotest_common.sh@960 -- # wait 2706433 00:05:58.144 00:05:58.144 real 0m2.087s 00:05:58.144 user 0m2.287s 00:05:58.144 sys 0m0.583s 00:05:58.144 00:04:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:58.144 00:04:23 -- common/autotest_common.sh@10 -- # set +x 00:05:58.144 ************************************ 00:05:58.144 END TEST locking_app_on_locked_coremask 00:05:58.144 ************************************ 00:05:58.144 00:04:23 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:05:58.144 00:04:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:58.144 00:04:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:58.144 00:04:23 -- common/autotest_common.sh@10 -- # set +x 00:05:58.144 ************************************ 00:05:58.144 START TEST locking_overlapped_coremask 00:05:58.144 ************************************ 00:05:58.144 00:04:23 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:05:58.144 00:04:23 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=2706989 00:05:58.144 00:04:23 -- event/cpu_locks.sh@133 -- # waitforlisten 2706989 /var/tmp/spdk.sock 00:05:58.144 00:04:23 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:05:58.144 00:04:23 -- common/autotest_common.sh@829 -- # '[' -z 2706989 ']' 00:05:58.144 00:04:23 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:58.144 00:04:23 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:58.144 00:04:23 -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:58.144 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:58.144 00:04:23 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:58.144 00:04:23 -- common/autotest_common.sh@10 -- # set +x 00:05:58.144 [2024-11-30 00:04:23.595751] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:58.144 [2024-11-30 00:04:23.595829] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2706989 ] 00:05:58.144 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.144 [2024-11-30 00:04:23.663531] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:58.405 [2024-11-30 00:04:23.740107] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:58.405 [2024-11-30 00:04:23.740276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:58.405 [2024-11-30 00:04:23.740370] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:58.405 [2024-11-30 00:04:23.740374] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.974 00:04:24 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:58.974 00:04:24 -- common/autotest_common.sh@862 -- # return 0 00:05:58.974 00:04:24 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=2707009 00:05:58.974 00:04:24 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 2707009 /var/tmp/spdk2.sock 00:05:58.974 00:04:24 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:05:58.974 00:04:24 -- common/autotest_common.sh@650 -- # local es=0 00:05:58.974 00:04:24 -- common/autotest_common.sh@652 -- # 
valid_exec_arg waitforlisten 2707009 /var/tmp/spdk2.sock 00:05:58.974 00:04:24 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:58.974 00:04:24 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:58.974 00:04:24 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:58.974 00:04:24 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:58.974 00:04:24 -- common/autotest_common.sh@653 -- # waitforlisten 2707009 /var/tmp/spdk2.sock 00:05:58.974 00:04:24 -- common/autotest_common.sh@829 -- # '[' -z 2707009 ']' 00:05:58.974 00:04:24 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:58.974 00:04:24 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:58.974 00:04:24 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:58.974 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:58.974 00:04:24 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:58.974 00:04:24 -- common/autotest_common.sh@10 -- # set +x 00:05:58.974 [2024-11-30 00:04:24.462933] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:58.974 [2024-11-30 00:04:24.462997] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2707009 ] 00:05:58.974 EAL: No free 2048 kB hugepages reported on node 1 00:05:59.232 [2024-11-30 00:04:24.555530] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2706989 has claimed it. 00:05:59.232 [2024-11-30 00:04:24.555565] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
00:05:59.808 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (2707009) - No such process 00:05:59.808 ERROR: process (pid: 2707009) is no longer running 00:05:59.808 00:04:25 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:59.808 00:04:25 -- common/autotest_common.sh@862 -- # return 1 00:05:59.808 00:04:25 -- common/autotest_common.sh@653 -- # es=1 00:05:59.808 00:04:25 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:59.808 00:04:25 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:59.808 00:04:25 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:59.808 00:04:25 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:05:59.808 00:04:25 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:59.808 00:04:25 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:59.808 00:04:25 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:59.808 00:04:25 -- event/cpu_locks.sh@141 -- # killprocess 2706989 00:05:59.808 00:04:25 -- common/autotest_common.sh@936 -- # '[' -z 2706989 ']' 00:05:59.808 00:04:25 -- common/autotest_common.sh@940 -- # kill -0 2706989 00:05:59.808 00:04:25 -- common/autotest_common.sh@941 -- # uname 00:05:59.808 00:04:25 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:59.808 00:04:25 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2706989 00:05:59.808 00:04:25 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:59.808 00:04:25 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:59.808 00:04:25 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2706989' 00:05:59.808 killing process with pid 2706989 00:05:59.808 
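[editor's aside] The `check_remaining_locks` step traced above globs the actual lock files and compares them against a brace-generated expected list (`/var/tmp/spdk_cpu_lock_{000..002}`). A sketch with stand-in names, written POSIX-style with string comparison rather than the script's bash arrays:

```shell
# Create three dummy lock files, expand the glob, and compare against
# the expected list, as check_remaining_locks does for spdk_cpu_lock_*.
d=$(mktemp -d)
touch "$d/cpu_lock_000" "$d/cpu_lock_001" "$d/cpu_lock_002"
actual=$(echo "$d"/cpu_lock_*)             # glob expands in sorted order
expected="$d/cpu_lock_000 $d/cpu_lock_001 $d/cpu_lock_002"
[ "$actual" = "$expected" ] && match=yes
echo "match=$match"
rm -rf "$d"
```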
00:04:25 -- common/autotest_common.sh@955 -- # kill 2706989 00:05:59.808 00:04:25 -- common/autotest_common.sh@960 -- # wait 2706989 00:06:00.068 00:06:00.068 real 0m1.920s 00:06:00.068 user 0m5.442s 00:06:00.068 sys 0m0.459s 00:06:00.068 00:04:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:00.068 00:04:25 -- common/autotest_common.sh@10 -- # set +x 00:06:00.068 ************************************ 00:06:00.068 END TEST locking_overlapped_coremask 00:06:00.068 ************************************ 00:06:00.068 00:04:25 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:00.068 00:04:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:00.068 00:04:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.068 00:04:25 -- common/autotest_common.sh@10 -- # set +x 00:06:00.068 ************************************ 00:06:00.068 START TEST locking_overlapped_coremask_via_rpc 00:06:00.068 ************************************ 00:06:00.068 00:04:25 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:06:00.068 00:04:25 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=2707301 00:06:00.068 00:04:25 -- event/cpu_locks.sh@149 -- # waitforlisten 2707301 /var/tmp/spdk.sock 00:06:00.068 00:04:25 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:00.068 00:04:25 -- common/autotest_common.sh@829 -- # '[' -z 2707301 ']' 00:06:00.068 00:04:25 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:00.068 00:04:25 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:00.068 00:04:25 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:00.068 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:00.068 00:04:25 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:00.068 00:04:25 -- common/autotest_common.sh@10 -- # set +x 00:06:00.068 [2024-11-30 00:04:25.567380] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:00.068 [2024-11-30 00:04:25.567454] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2707301 ] 00:06:00.068 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.327 [2024-11-30 00:04:25.634427] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:00.327 [2024-11-30 00:04:25.634461] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:00.327 [2024-11-30 00:04:25.700407] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:00.327 [2024-11-30 00:04:25.700626] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:00.327 [2024-11-30 00:04:25.700686] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:00.327 [2024-11-30 00:04:25.700688] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.896 00:04:26 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:00.896 00:04:26 -- common/autotest_common.sh@862 -- # return 0 00:06:00.896 00:04:26 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=2707450 00:06:00.896 00:04:26 -- event/cpu_locks.sh@153 -- # waitforlisten 2707450 /var/tmp/spdk2.sock 00:06:00.896 00:04:26 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:00.896 00:04:26 -- common/autotest_common.sh@829 -- # '[' -z 2707450 ']' 00:06:00.896 00:04:26 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:00.896 00:04:26 -- common/autotest_common.sh@834 -- # 
local max_retries=100 00:06:00.896 00:04:26 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:00.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:00.896 00:04:26 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:00.896 00:04:26 -- common/autotest_common.sh@10 -- # set +x 00:06:00.896 [2024-11-30 00:04:26.422069] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:00.896 [2024-11-30 00:04:26.422149] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2707450 ] 00:06:01.155 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.155 [2024-11-30 00:04:26.516329] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:01.155 [2024-11-30 00:04:26.516362] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:01.155 [2024-11-30 00:04:26.655352] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:01.155 [2024-11-30 00:04:26.655503] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:01.155 [2024-11-30 00:04:26.662646] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:01.155 [2024-11-30 00:04:26.662648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:01.722 00:04:27 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:01.722 00:04:27 -- common/autotest_common.sh@862 -- # return 0 00:06:01.722 00:04:27 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:01.722 00:04:27 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.722 00:04:27 -- common/autotest_common.sh@10 -- # set +x 00:06:01.722 00:04:27 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 
00:06:01.722 00:04:27 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:01.722 00:04:27 -- common/autotest_common.sh@650 -- # local es=0 00:06:01.722 00:04:27 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:01.722 00:04:27 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:01.722 00:04:27 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:01.722 00:04:27 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:01.722 00:04:27 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:01.722 00:04:27 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:01.722 00:04:27 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:01.722 00:04:27 -- common/autotest_common.sh@10 -- # set +x 00:06:01.983 [2024-11-30 00:04:27.279658] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2707301 has claimed it. 
00:06:01.983 request: 00:06:01.983 { 00:06:01.983 "method": "framework_enable_cpumask_locks", 00:06:01.983 "req_id": 1 00:06:01.983 } 00:06:01.983 Got JSON-RPC error response 00:06:01.983 response: 00:06:01.983 { 00:06:01.983 "code": -32603, 00:06:01.983 "message": "Failed to claim CPU core: 2" 00:06:01.983 } 00:06:01.983 00:04:27 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:01.983 00:04:27 -- common/autotest_common.sh@653 -- # es=1 00:06:01.983 00:04:27 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:01.983 00:04:27 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:01.983 00:04:27 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:01.983 00:04:27 -- event/cpu_locks.sh@158 -- # waitforlisten 2707301 /var/tmp/spdk.sock 00:06:01.983 00:04:27 -- common/autotest_common.sh@829 -- # '[' -z 2707301 ']' 00:06:01.983 00:04:27 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:01.983 00:04:27 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:01.983 00:04:27 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:01.983 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:01.983 00:04:27 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:01.983 00:04:27 -- common/autotest_common.sh@10 -- # set +x 00:06:01.983 00:04:27 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:01.983 00:04:27 -- common/autotest_common.sh@862 -- # return 0 00:06:01.983 00:04:27 -- event/cpu_locks.sh@159 -- # waitforlisten 2707450 /var/tmp/spdk2.sock 00:06:01.983 00:04:27 -- common/autotest_common.sh@829 -- # '[' -z 2707450 ']' 00:06:01.983 00:04:27 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:01.983 00:04:27 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:01.983 00:04:27 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:01.983 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:01.983 00:04:27 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:01.983 00:04:27 -- common/autotest_common.sh@10 -- # set +x 00:06:02.243 00:04:27 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:02.243 00:04:27 -- common/autotest_common.sh@862 -- # return 0 00:06:02.243 00:04:27 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:02.243 00:04:27 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:02.243 00:04:27 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:02.243 00:04:27 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:02.243 00:06:02.243 real 0m2.130s 00:06:02.243 user 0m0.879s 00:06:02.243 sys 0m0.181s 00:06:02.243 00:04:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:02.243 00:04:27 -- common/autotest_common.sh@10 -- # set +x 00:06:02.243 
************************************ 00:06:02.243 END TEST locking_overlapped_coremask_via_rpc 00:06:02.243 ************************************ 00:06:02.243 00:04:27 -- event/cpu_locks.sh@174 -- # cleanup 00:06:02.243 00:04:27 -- event/cpu_locks.sh@15 -- # [[ -z 2707301 ]] 00:06:02.243 00:04:27 -- event/cpu_locks.sh@15 -- # killprocess 2707301 00:06:02.243 00:04:27 -- common/autotest_common.sh@936 -- # '[' -z 2707301 ']' 00:06:02.243 00:04:27 -- common/autotest_common.sh@940 -- # kill -0 2707301 00:06:02.243 00:04:27 -- common/autotest_common.sh@941 -- # uname 00:06:02.243 00:04:27 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:02.243 00:04:27 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2707301 00:06:02.243 00:04:27 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:02.243 00:04:27 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:02.243 00:04:27 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2707301' 00:06:02.243 killing process with pid 2707301 00:06:02.243 00:04:27 -- common/autotest_common.sh@955 -- # kill 2707301 00:06:02.243 00:04:27 -- common/autotest_common.sh@960 -- # wait 2707301 00:06:02.811 00:04:28 -- event/cpu_locks.sh@16 -- # [[ -z 2707450 ]] 00:06:02.811 00:04:28 -- event/cpu_locks.sh@16 -- # killprocess 2707450 00:06:02.811 00:04:28 -- common/autotest_common.sh@936 -- # '[' -z 2707450 ']' 00:06:02.811 00:04:28 -- common/autotest_common.sh@940 -- # kill -0 2707450 00:06:02.811 00:04:28 -- common/autotest_common.sh@941 -- # uname 00:06:02.811 00:04:28 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:02.811 00:04:28 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2707450 00:06:02.811 00:04:28 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:02.811 00:04:28 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:02.811 00:04:28 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 
2707450' 00:06:02.811 killing process with pid 2707450 00:06:02.811 00:04:28 -- common/autotest_common.sh@955 -- # kill 2707450 00:06:02.811 00:04:28 -- common/autotest_common.sh@960 -- # wait 2707450 00:06:03.070 00:04:28 -- event/cpu_locks.sh@18 -- # rm -f 00:06:03.070 00:04:28 -- event/cpu_locks.sh@1 -- # cleanup 00:06:03.070 00:04:28 -- event/cpu_locks.sh@15 -- # [[ -z 2707301 ]] 00:06:03.070 00:04:28 -- event/cpu_locks.sh@15 -- # killprocess 2707301 00:06:03.070 00:04:28 -- common/autotest_common.sh@936 -- # '[' -z 2707301 ']' 00:06:03.070 00:04:28 -- common/autotest_common.sh@940 -- # kill -0 2707301 00:06:03.070 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (2707301) - No such process 00:06:03.070 00:04:28 -- common/autotest_common.sh@963 -- # echo 'Process with pid 2707301 is not found' 00:06:03.070 Process with pid 2707301 is not found 00:06:03.070 00:04:28 -- event/cpu_locks.sh@16 -- # [[ -z 2707450 ]] 00:06:03.070 00:04:28 -- event/cpu_locks.sh@16 -- # killprocess 2707450 00:06:03.070 00:04:28 -- common/autotest_common.sh@936 -- # '[' -z 2707450 ']' 00:06:03.070 00:04:28 -- common/autotest_common.sh@940 -- # kill -0 2707450 00:06:03.070 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (2707450) - No such process 00:06:03.070 00:04:28 -- common/autotest_common.sh@963 -- # echo 'Process with pid 2707450 is not found' 00:06:03.070 Process with pid 2707450 is not found 00:06:03.070 00:04:28 -- event/cpu_locks.sh@18 -- # rm -f 00:06:03.070 00:06:03.070 real 0m18.230s 00:06:03.070 user 0m31.100s 00:06:03.070 sys 0m5.783s 00:06:03.070 00:04:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:03.070 00:04:28 -- common/autotest_common.sh@10 -- # set +x 00:06:03.070 ************************************ 00:06:03.070 END TEST cpu_locks 00:06:03.070 ************************************ 00:06:03.070 00:06:03.070 real 0m43.997s 00:06:03.070 user 1m23.419s 
00:06:03.070 sys 0m9.887s 00:06:03.070 00:04:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:03.070 00:04:28 -- common/autotest_common.sh@10 -- # set +x 00:06:03.070 ************************************ 00:06:03.070 END TEST event 00:06:03.070 ************************************ 00:06:03.070 00:04:28 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:03.070 00:04:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:03.070 00:04:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:03.070 00:04:28 -- common/autotest_common.sh@10 -- # set +x 00:06:03.070 ************************************ 00:06:03.070 START TEST thread 00:06:03.070 ************************************ 00:06:03.070 00:04:28 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:03.331 * Looking for test storage... 00:06:03.331 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:03.331 00:04:28 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:03.331 00:04:28 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:03.331 00:04:28 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:03.331 00:04:28 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:03.331 00:04:28 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:03.331 00:04:28 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:03.331 00:04:28 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:03.331 00:04:28 -- scripts/common.sh@335 -- # IFS=.-: 00:06:03.331 00:04:28 -- scripts/common.sh@335 -- # read -ra ver1 00:06:03.331 00:04:28 -- scripts/common.sh@336 -- # IFS=.-: 00:06:03.331 00:04:28 -- scripts/common.sh@336 -- # read -ra ver2 00:06:03.331 00:04:28 -- scripts/common.sh@337 -- # local 'op=<' 00:06:03.331 00:04:28 -- scripts/common.sh@339 -- # ver1_l=2 00:06:03.331 00:04:28 -- 
scripts/common.sh@340 -- # ver2_l=1 00:06:03.331 00:04:28 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:03.331 00:04:28 -- scripts/common.sh@343 -- # case "$op" in 00:06:03.331 00:04:28 -- scripts/common.sh@344 -- # : 1 00:06:03.331 00:04:28 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:03.331 00:04:28 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:03.331 00:04:28 -- scripts/common.sh@364 -- # decimal 1 00:06:03.331 00:04:28 -- scripts/common.sh@352 -- # local d=1 00:06:03.331 00:04:28 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:03.331 00:04:28 -- scripts/common.sh@354 -- # echo 1 00:06:03.331 00:04:28 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:03.331 00:04:28 -- scripts/common.sh@365 -- # decimal 2 00:06:03.331 00:04:28 -- scripts/common.sh@352 -- # local d=2 00:06:03.331 00:04:28 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:03.331 00:04:28 -- scripts/common.sh@354 -- # echo 2 00:06:03.331 00:04:28 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:03.331 00:04:28 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:03.331 00:04:28 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:03.331 00:04:28 -- scripts/common.sh@367 -- # return 0 00:06:03.331 00:04:28 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:03.331 00:04:28 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:03.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.331 --rc genhtml_branch_coverage=1 00:06:03.331 --rc genhtml_function_coverage=1 00:06:03.331 --rc genhtml_legend=1 00:06:03.331 --rc geninfo_all_blocks=1 00:06:03.331 --rc geninfo_unexecuted_blocks=1 00:06:03.331 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.331 ' 00:06:03.331 00:04:28 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:03.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:06:03.331 --rc genhtml_branch_coverage=1 00:06:03.331 --rc genhtml_function_coverage=1 00:06:03.331 --rc genhtml_legend=1 00:06:03.331 --rc geninfo_all_blocks=1 00:06:03.331 --rc geninfo_unexecuted_blocks=1 00:06:03.331 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.331 ' 00:06:03.331 00:04:28 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:03.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.331 --rc genhtml_branch_coverage=1 00:06:03.331 --rc genhtml_function_coverage=1 00:06:03.331 --rc genhtml_legend=1 00:06:03.331 --rc geninfo_all_blocks=1 00:06:03.331 --rc geninfo_unexecuted_blocks=1 00:06:03.331 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.331 ' 00:06:03.331 00:04:28 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:03.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:03.331 --rc genhtml_branch_coverage=1 00:06:03.331 --rc genhtml_function_coverage=1 00:06:03.331 --rc genhtml_legend=1 00:06:03.331 --rc geninfo_all_blocks=1 00:06:03.331 --rc geninfo_unexecuted_blocks=1 00:06:03.331 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:03.331 ' 00:06:03.331 00:04:28 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:03.331 00:04:28 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:03.331 00:04:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:03.331 00:04:28 -- common/autotest_common.sh@10 -- # set +x 00:06:03.331 ************************************ 00:06:03.331 START TEST thread_poller_perf 00:06:03.331 ************************************ 00:06:03.331 00:04:28 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 
00:06:03.331 [2024-11-30 00:04:28.780879] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:03.331 [2024-11-30 00:04:28.780969] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2707954 ] 00:06:03.331 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.331 [2024-11-30 00:04:28.851229] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.591 [2024-11-30 00:04:28.921499] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.591 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:04.530 [2024-11-29T23:04:30.086Z] ====================================== 00:06:04.530 [2024-11-29T23:04:30.086Z] busy:2506072848 (cyc) 00:06:04.530 [2024-11-29T23:04:30.086Z] total_run_count: 785000 00:06:04.530 [2024-11-29T23:04:30.086Z] tsc_hz: 2500000000 (cyc) 00:06:04.530 [2024-11-29T23:04:30.086Z] ====================================== 00:06:04.530 [2024-11-29T23:04:30.086Z] poller_cost: 3192 (cyc), 1276 (nsec) 00:06:04.530 00:06:04.530 real 0m1.227s 00:06:04.530 user 0m1.138s 00:06:04.530 sys 0m0.084s 00:06:04.530 00:04:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:04.530 00:04:29 -- common/autotest_common.sh@10 -- # set +x 00:06:04.530 ************************************ 00:06:04.530 END TEST thread_poller_perf 00:06:04.530 ************************************ 00:06:04.530 00:04:30 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:04.530 00:04:30 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:04.530 00:04:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:04.530 00:04:30 -- common/autotest_common.sh@10 -- # set +x 00:06:04.530 
************************************ 00:06:04.530 START TEST thread_poller_perf 00:06:04.530 ************************************ 00:06:04.530 00:04:30 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:04.530 [2024-11-30 00:04:30.057572] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:04.530 [2024-11-30 00:04:30.057679] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708244 ] 00:06:04.789 EAL: No free 2048 kB hugepages reported on node 1 00:06:04.789 [2024-11-30 00:04:30.129038] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.789 [2024-11-30 00:04:30.201564] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.789 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:06:05.746 [2024-11-29T23:04:31.302Z] ====================================== 00:06:05.746 [2024-11-29T23:04:31.302Z] busy:2502153700 (cyc) 00:06:05.746 [2024-11-29T23:04:31.302Z] total_run_count: 12725000 00:06:05.746 [2024-11-29T23:04:31.302Z] tsc_hz: 2500000000 (cyc) 00:06:05.746 [2024-11-29T23:04:31.302Z] ====================================== 00:06:05.746 [2024-11-29T23:04:31.302Z] poller_cost: 196 (cyc), 78 (nsec) 00:06:05.746 00:06:05.746 real 0m1.228s 00:06:05.746 user 0m1.143s 00:06:05.746 sys 0m0.081s 00:06:05.746 00:04:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:05.746 00:04:31 -- common/autotest_common.sh@10 -- # set +x 00:06:05.746 ************************************ 00:06:05.746 END TEST thread_poller_perf 00:06:05.746 ************************************ 00:06:06.006 00:04:31 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:06.006 00:04:31 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:06.006 00:04:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:06.006 00:04:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:06.006 00:04:31 -- common/autotest_common.sh@10 -- # set +x 00:06:06.006 ************************************ 00:06:06.006 START TEST thread_spdk_lock 00:06:06.006 ************************************ 00:06:06.006 00:04:31 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:06.006 [2024-11-30 00:04:31.338517] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:06.006 [2024-11-30 00:04:31.338618] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708526 ] 00:06:06.006 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.006 [2024-11-30 00:04:31.409581] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:06.006 [2024-11-30 00:04:31.477386] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:06.006 [2024-11-30 00:04:31.477389] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.575 [2024-11-30 00:04:31.958583] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:06.575 [2024-11-30 00:04:31.958627] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:06.575 [2024-11-30 00:04:31.958637] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x1483c80 00:06:06.575 [2024-11-30 00:04:31.959531] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:06.575 [2024-11-30 00:04:31.959635] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:06.575 [2024-11-30 00:04:31.959654] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: 
Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:06.575 Starting test contend 00:06:06.575 Worker Delay Wait us Hold us Total us 00:06:06.575 0 3 167837 182368 350206 00:06:06.575 1 5 80825 283711 364537 00:06:06.575 PASS test contend 00:06:06.575 Starting test hold_by_poller 00:06:06.575 PASS test hold_by_poller 00:06:06.575 Starting test hold_by_message 00:06:06.575 PASS test hold_by_message 00:06:06.575 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:06.575 100014 assertions passed 00:06:06.575 0 assertions failed 00:06:06.575 00:06:06.575 real 0m0.702s 00:06:06.575 user 0m1.091s 00:06:06.575 sys 0m0.090s 00:06:06.575 00:04:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:06.575 00:04:32 -- common/autotest_common.sh@10 -- # set +x 00:06:06.575 ************************************ 00:06:06.575 END TEST thread_spdk_lock 00:06:06.575 ************************************ 00:06:06.575 00:06:06.575 real 0m3.489s 00:06:06.575 user 0m3.532s 00:06:06.575 sys 0m0.468s 00:06:06.575 00:04:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:06.575 00:04:32 -- common/autotest_common.sh@10 -- # set +x 00:06:06.575 ************************************ 00:06:06.575 END TEST thread 00:06:06.575 ************************************ 00:06:06.575 00:04:32 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:06.575 00:04:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:06.575 00:04:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:06.575 00:04:32 -- common/autotest_common.sh@10 -- # set +x 00:06:06.575 ************************************ 00:06:06.575 START TEST accel 00:06:06.575 ************************************ 00:06:06.575 00:04:32 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:06.835 * Looking for test storage... 
00:06:06.835 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:06.835 00:04:32 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:06.835 00:04:32 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:06.835 00:04:32 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:06.835 00:04:32 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:06.835 00:04:32 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:06.835 00:04:32 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:06.835 00:04:32 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:06.835 00:04:32 -- scripts/common.sh@335 -- # IFS=.-: 00:06:06.835 00:04:32 -- scripts/common.sh@335 -- # read -ra ver1 00:06:06.835 00:04:32 -- scripts/common.sh@336 -- # IFS=.-: 00:06:06.835 00:04:32 -- scripts/common.sh@336 -- # read -ra ver2 00:06:06.835 00:04:32 -- scripts/common.sh@337 -- # local 'op=<' 00:06:06.835 00:04:32 -- scripts/common.sh@339 -- # ver1_l=2 00:06:06.835 00:04:32 -- scripts/common.sh@340 -- # ver2_l=1 00:06:06.835 00:04:32 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:06.835 00:04:32 -- scripts/common.sh@343 -- # case "$op" in 00:06:06.835 00:04:32 -- scripts/common.sh@344 -- # : 1 00:06:06.835 00:04:32 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:06.835 00:04:32 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:06.835 00:04:32 -- scripts/common.sh@364 -- # decimal 1 00:06:06.835 00:04:32 -- scripts/common.sh@352 -- # local d=1 00:06:06.835 00:04:32 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:06.835 00:04:32 -- scripts/common.sh@354 -- # echo 1 00:06:06.835 00:04:32 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:06.835 00:04:32 -- scripts/common.sh@365 -- # decimal 2 00:06:06.835 00:04:32 -- scripts/common.sh@352 -- # local d=2 00:06:06.835 00:04:32 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:06.835 00:04:32 -- scripts/common.sh@354 -- # echo 2 00:06:06.835 00:04:32 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:06.836 00:04:32 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:06.836 00:04:32 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:06.836 00:04:32 -- scripts/common.sh@367 -- # return 0 00:06:06.836 00:04:32 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:06.836 00:04:32 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:06.836 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.836 --rc genhtml_branch_coverage=1 00:06:06.836 --rc genhtml_function_coverage=1 00:06:06.836 --rc genhtml_legend=1 00:06:06.836 --rc geninfo_all_blocks=1 00:06:06.836 --rc geninfo_unexecuted_blocks=1 00:06:06.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:06.836 ' 00:06:06.836 00:04:32 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:06.836 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.836 --rc genhtml_branch_coverage=1 00:06:06.836 --rc genhtml_function_coverage=1 00:06:06.836 --rc genhtml_legend=1 00:06:06.836 --rc geninfo_all_blocks=1 00:06:06.836 --rc geninfo_unexecuted_blocks=1 00:06:06.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:06.836 ' 00:06:06.836 00:04:32 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:06.836 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.836 --rc genhtml_branch_coverage=1 00:06:06.836 --rc genhtml_function_coverage=1 00:06:06.836 --rc genhtml_legend=1 00:06:06.836 --rc geninfo_all_blocks=1 00:06:06.836 --rc geninfo_unexecuted_blocks=1 00:06:06.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:06.836 ' 00:06:06.836 00:04:32 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:06.836 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.836 --rc genhtml_branch_coverage=1 00:06:06.836 --rc genhtml_function_coverage=1 00:06:06.836 --rc genhtml_legend=1 00:06:06.836 --rc geninfo_all_blocks=1 00:06:06.836 --rc geninfo_unexecuted_blocks=1 00:06:06.836 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:06.836 ' 00:06:06.836 00:04:32 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:06.836 00:04:32 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:06.836 00:04:32 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:06.836 00:04:32 -- accel/accel.sh@59 -- # spdk_tgt_pid=2708632 00:06:06.836 00:04:32 -- accel/accel.sh@60 -- # waitforlisten 2708632 00:06:06.836 00:04:32 -- common/autotest_common.sh@829 -- # '[' -z 2708632 ']' 00:06:06.836 00:04:32 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:06.836 00:04:32 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:06.836 00:04:32 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:06.836 00:04:32 -- accel/accel.sh@58 -- # build_accel_config 00:06:06.836 00:04:32 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:06.836 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:06.836 00:04:32 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:06.836 00:04:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:06.836 00:04:32 -- common/autotest_common.sh@10 -- # set +x 00:06:06.836 00:04:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:06.836 00:04:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:06.836 00:04:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:06.836 00:04:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:06.836 00:04:32 -- accel/accel.sh@41 -- # local IFS=, 00:06:06.836 00:04:32 -- accel/accel.sh@42 -- # jq -r . 00:06:06.836 [2024-11-30 00:04:32.310498] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:06.836 [2024-11-30 00:04:32.310567] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708632 ] 00:06:06.836 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.836 [2024-11-30 00:04:32.378166] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.095 [2024-11-30 00:04:32.454616] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:07.095 [2024-11-30 00:04:32.454727] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.684 00:04:33 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:07.684 00:04:33 -- common/autotest_common.sh@862 -- # return 0 00:06:07.684 00:04:33 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:07.684 00:04:33 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:07.684 00:04:33 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:07.684 00:04:33 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:07.684 00:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:07.684 00:04:33 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.684 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.684 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.684 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.684 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.684 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.684 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.684 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.684 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.684 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.684 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.684 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.684 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.684 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # IFS== 
00:06:07.684 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.684 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.685 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.685 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.685 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.685 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.685 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.685 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.685 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.685 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.685 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.685 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.685 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # read -r opc module 00:06:07.685 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.685 00:04:33 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:07.685 00:04:33 -- accel/accel.sh@64 -- # IFS== 00:06:07.685 00:04:33 -- 
accel/accel.sh@64 -- # read -r opc module 00:06:07.685 00:04:33 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:07.685 00:04:33 -- accel/accel.sh@67 -- # killprocess 2708632 00:06:07.685 00:04:33 -- common/autotest_common.sh@936 -- # '[' -z 2708632 ']' 00:06:07.685 00:04:33 -- common/autotest_common.sh@940 -- # kill -0 2708632 00:06:07.685 00:04:33 -- common/autotest_common.sh@941 -- # uname 00:06:07.685 00:04:33 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:07.685 00:04:33 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2708632 00:06:07.945 00:04:33 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:07.945 00:04:33 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:07.945 00:04:33 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2708632' 00:06:07.945 killing process with pid 2708632 00:06:07.945 00:04:33 -- common/autotest_common.sh@955 -- # kill 2708632 00:06:07.945 00:04:33 -- common/autotest_common.sh@960 -- # wait 2708632 00:06:08.203 00:04:33 -- accel/accel.sh@68 -- # trap - ERR 00:06:08.203 00:04:33 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:08.203 00:04:33 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:08.203 00:04:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.203 00:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:08.203 00:04:33 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:06:08.203 00:04:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:08.203 00:04:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:08.203 00:04:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:08.203 00:04:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.203 00:04:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.203 00:04:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:08.203 00:04:33 -- accel/accel.sh@37 -- # [[ -n '' 
]] 00:06:08.203 00:04:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:08.203 00:04:33 -- accel/accel.sh@42 -- # jq -r . 00:06:08.203 00:04:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:08.203 00:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:08.203 00:04:33 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:08.203 00:04:33 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:08.203 00:04:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.203 00:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:08.203 ************************************ 00:06:08.203 START TEST accel_missing_filename 00:06:08.203 ************************************ 00:06:08.203 00:04:33 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:06:08.203 00:04:33 -- common/autotest_common.sh@650 -- # local es=0 00:06:08.203 00:04:33 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:08.203 00:04:33 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:08.203 00:04:33 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:08.203 00:04:33 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:08.203 00:04:33 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:08.203 00:04:33 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:08.203 00:04:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:08.203 00:04:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:08.203 00:04:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:08.203 00:04:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.203 00:04:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.203 00:04:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:08.203 00:04:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:08.203 00:04:33 -- 
accel/accel.sh@41 -- # local IFS=, 00:06:08.203 00:04:33 -- accel/accel.sh@42 -- # jq -r . 00:06:08.203 [2024-11-30 00:04:33.648681] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:08.203 [2024-11-30 00:04:33.648773] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708908 ] 00:06:08.203 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.203 [2024-11-30 00:04:33.719890] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.461 [2024-11-30 00:04:33.786050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.461 [2024-11-30 00:04:33.825979] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:08.461 [2024-11-30 00:04:33.886370] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:08.461 A filename is required. 
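The `es=234 -> (( es > 128 )) -> es=106 -> es=1` sequence that follows this error message is the trace's exit-status normalization for expected-failure tests. A minimal sketch of that pattern (an assumption based on the visible trace, not the literal `autotest_common.sh` source — the helper name `not` is hypothetical here):

```shell
#!/usr/bin/env bash
# Sketch of the expected-failure wrapper implied by the es handling in the
# trace: run a command that SHOULD fail and collapse any failure to one value.
not() {
  local es=0
  "$@" || es=$?
  # An exit status above 128 conventionally means "terminated by signal
  # (es - 128)", so strip the signal offset first (234 -> 106 in the trace).
  if (( es > 128 )); then
    es=$(( es - 128 ))
  fi
  if (( es != 0 )); then
    return 0   # the command failed, which is what "not" expects
  fi
  return 1     # the command unexpectedly succeeded
}

not false && echo "expected failure observed"
```

Collapsing every failure status to a single value lets `run_test ... NOT cmd` simply negate the result instead of enumerating exit codes.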
00:06:08.461 00:04:33 -- common/autotest_common.sh@653 -- # es=234 00:06:08.461 00:04:33 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:08.461 00:04:33 -- common/autotest_common.sh@662 -- # es=106 00:06:08.461 00:04:33 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:08.461 00:04:33 -- common/autotest_common.sh@670 -- # es=1 00:06:08.461 00:04:33 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:08.461 00:06:08.461 real 0m0.331s 00:06:08.461 user 0m0.239s 00:06:08.461 sys 0m0.130s 00:06:08.461 00:04:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:08.461 00:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:08.461 ************************************ 00:06:08.461 END TEST accel_missing_filename 00:06:08.461 ************************************ 00:06:08.461 00:04:33 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:08.461 00:04:33 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:08.461 00:04:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.461 00:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:08.461 ************************************ 00:06:08.461 START TEST accel_compress_verify 00:06:08.461 ************************************ 00:06:08.461 00:04:34 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:08.461 00:04:34 -- common/autotest_common.sh@650 -- # local es=0 00:06:08.461 00:04:34 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:08.461 00:04:34 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:08.461 00:04:34 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:08.461 00:04:34 -- common/autotest_common.sh@642 -- # type 
-t accel_perf 00:06:08.461 00:04:34 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:08.461 00:04:34 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:08.461 00:04:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:08.461 00:04:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:08.461 00:04:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:08.461 00:04:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.461 00:04:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.461 00:04:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:08.461 00:04:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:08.461 00:04:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:08.461 00:04:34 -- accel/accel.sh@42 -- # jq -r . 00:06:08.719 [2024-11-30 00:04:34.021838] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:08.719 [2024-11-30 00:04:34.021907] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709057 ] 00:06:08.719 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.719 [2024-11-30 00:04:34.089641] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.719 [2024-11-30 00:04:34.158126] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.719 [2024-11-30 00:04:34.198075] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:08.719 [2024-11-30 00:04:34.258327] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:08.979 00:06:08.979 Compression does not support the verify option, aborting. 
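The `START TEST` / `END TEST` banners and the `real/user/sys` lines around each case come from a timing wrapper. A rough sketch of how such a `run_test` helper could look (an assumption from the log output; the real `autotest_common.sh` implementation may differ):

```shell
#!/usr/bin/env bash
# Sketch of a run_test-style wrapper: print banners around the test and time
# it with the bash `time` keyword, which emits the real/user/sys lines seen
# in the log on stderr.
run_test() {
  local name=$1 rc=0
  shift
  echo "************  START TEST $name  ************"
  time "$@" || rc=$?
  echo "************  END TEST $name  ************"
  return "$rc"
}

run_test demo_case echo "hello"
```

Returning the wrapped command's status unchanged lets callers chain wrappers, e.g. `run_test accel_missing_filename NOT accel_perf -t 1 -w compress`.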
00:06:08.979 00:04:34 -- common/autotest_common.sh@653 -- # es=161 00:06:08.979 00:04:34 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:08.979 00:04:34 -- common/autotest_common.sh@662 -- # es=33 00:06:08.979 00:04:34 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:08.979 00:04:34 -- common/autotest_common.sh@670 -- # es=1 00:06:08.979 00:04:34 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:08.979 00:06:08.979 real 0m0.323s 00:06:08.979 user 0m0.242s 00:06:08.979 sys 0m0.118s 00:06:08.979 00:04:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:08.979 00:04:34 -- common/autotest_common.sh@10 -- # set +x 00:06:08.979 ************************************ 00:06:08.979 END TEST accel_compress_verify 00:06:08.979 ************************************ 00:06:08.979 00:04:34 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:08.979 00:04:34 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:08.979 00:04:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.979 00:04:34 -- common/autotest_common.sh@10 -- # set +x 00:06:08.979 ************************************ 00:06:08.979 START TEST accel_wrong_workload 00:06:08.979 ************************************ 00:06:08.979 00:04:34 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:06:08.979 00:04:34 -- common/autotest_common.sh@650 -- # local es=0 00:06:08.979 00:04:34 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:08.979 00:04:34 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:08.979 00:04:34 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:08.979 00:04:34 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:08.979 00:04:34 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:08.979 00:04:34 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:08.979 00:04:34 -- 
accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:08.979 00:04:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:08.979 00:04:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:08.979 00:04:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.979 00:04:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.979 00:04:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:08.979 00:04:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:08.979 00:04:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:08.979 00:04:34 -- accel/accel.sh@42 -- # jq -r . 00:06:08.979 Unsupported workload type: foobar 00:06:08.979 [2024-11-30 00:04:34.397174] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:08.979 accel_perf options: 00:06:08.979 [-h help message] 00:06:08.979 [-q queue depth per core] 00:06:08.979 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:08.979 [-T number of threads per core 00:06:08.979 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:06:08.979 [-t time in seconds] 00:06:08.979 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:08.979 [ dif_verify, , dif_generate, dif_generate_copy 00:06:08.979 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:08.979 [-l for compress/decompress workloads, name of uncompressed input file 00:06:08.979 [-S for crc32c workload, use this seed value (default 0) 00:06:08.979 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:08.979 [-f for fill workload, use this BYTE value (default 255) 00:06:08.979 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:08.979 [-y verify result if this switch is on] 00:06:08.979 [-a tasks to allocate per core (default: same value as -q)] 00:06:08.979 Can be used to spread operations across a wider range of memory. 00:06:08.979 00:04:34 -- common/autotest_common.sh@653 -- # es=1 00:06:08.979 00:04:34 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:08.979 00:04:34 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:08.979 00:04:34 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:08.979 00:06:08.979 real 0m0.029s 00:06:08.979 user 0m0.012s 00:06:08.979 sys 0m0.018s 00:06:08.979 00:04:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:08.979 00:04:34 -- common/autotest_common.sh@10 -- # set +x 00:06:08.979 ************************************ 00:06:08.979 END TEST accel_wrong_workload 00:06:08.979 ************************************ 00:06:08.979 Error: writing output failed: Broken pipe 00:06:08.979 00:04:34 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:08.979 00:04:34 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:08.979 00:04:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 
00:06:08.979 00:04:34 -- common/autotest_common.sh@10 -- # set +x 00:06:08.979 ************************************ 00:06:08.979 START TEST accel_negative_buffers 00:06:08.979 ************************************ 00:06:08.979 00:04:34 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:08.979 00:04:34 -- common/autotest_common.sh@650 -- # local es=0 00:06:08.979 00:04:34 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:08.979 00:04:34 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:08.979 00:04:34 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:08.979 00:04:34 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:08.979 00:04:34 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:08.979 00:04:34 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:06:08.979 00:04:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:08.979 00:04:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:08.979 00:04:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:08.979 00:04:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.979 00:04:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.979 00:04:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:08.979 00:04:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:08.979 00:04:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:08.979 00:04:34 -- accel/accel.sh@42 -- # jq -r . 00:06:08.979 -x option must be non-negative. 
00:06:08.979 [2024-11-30 00:04:34.474331] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:08.979 accel_perf options: 00:06:08.979 [-h help message] 00:06:08.979 [-q queue depth per core] 00:06:08.979 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:08.979 [-T number of threads per core 00:06:08.979 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:08.979 [-t time in seconds] 00:06:08.980 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:08.980 [ dif_verify, , dif_generate, dif_generate_copy 00:06:08.980 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:08.980 [-l for compress/decompress workloads, name of uncompressed input file 00:06:08.980 [-S for crc32c workload, use this seed value (default 0) 00:06:08.980 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:08.980 [-f for fill workload, use this BYTE value (default 255) 00:06:08.980 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:08.980 [-y verify result if this switch is on] 00:06:08.980 [-a tasks to allocate per core (default: same value as -q)] 00:06:08.980 Can be used to spread operations across a wider range of memory. 
00:06:08.980 00:04:34 -- common/autotest_common.sh@653 -- # es=1 00:06:08.980 00:04:34 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:08.980 00:04:34 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:08.980 00:04:34 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:08.980 00:06:08.980 real 0m0.030s 00:06:08.980 user 0m0.013s 00:06:08.980 sys 0m0.017s 00:06:08.980 00:04:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:08.980 00:04:34 -- common/autotest_common.sh@10 -- # set +x 00:06:08.980 ************************************ 00:06:08.980 END TEST accel_negative_buffers 00:06:08.980 ************************************ 00:06:08.980 Error: writing output failed: Broken pipe 00:06:08.980 00:04:34 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:08.980 00:04:34 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:08.980 00:04:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.980 00:04:34 -- common/autotest_common.sh@10 -- # set +x 00:06:08.980 ************************************ 00:06:08.980 START TEST accel_crc32c 00:06:08.980 ************************************ 00:06:08.980 00:04:34 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:08.980 00:04:34 -- accel/accel.sh@16 -- # local accel_opc 00:06:08.980 00:04:34 -- accel/accel.sh@17 -- # local accel_module 00:06:08.980 00:04:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:09.239 00:04:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:09.239 00:04:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:09.239 00:04:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:09.239 00:04:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:09.239 00:04:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:09.239 00:04:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:09.239 00:04:34 
-- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:09.239 00:04:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:09.239 00:04:34 -- accel/accel.sh@42 -- # jq -r . 00:06:09.239 [2024-11-30 00:04:34.553301] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:09.239 [2024-11-30 00:04:34.553394] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709238 ] 00:06:09.239 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.239 [2024-11-30 00:04:34.624489] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.239 [2024-11-30 00:04:34.699865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.613 00:04:35 -- accel/accel.sh@18 -- # out=' 00:06:10.613 SPDK Configuration: 00:06:10.613 Core mask: 0x1 00:06:10.613 00:06:10.613 Accel Perf Configuration: 00:06:10.613 Workload Type: crc32c 00:06:10.613 CRC-32C seed: 32 00:06:10.613 Transfer size: 4096 bytes 00:06:10.613 Vector count 1 00:06:10.613 Module: software 00:06:10.613 Queue depth: 32 00:06:10.613 Allocate depth: 32 00:06:10.613 # threads/core: 1 00:06:10.613 Run time: 1 seconds 00:06:10.613 Verify: Yes 00:06:10.613 00:06:10.613 Running for 1 seconds... 
00:06:10.613 00:06:10.613 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:10.613 ------------------------------------------------------------------------------------ 00:06:10.613 0,0 819360/s 3200 MiB/s 0 0 00:06:10.613 ==================================================================================== 00:06:10.613 Total 819360/s 3200 MiB/s 0 0' 00:06:10.613 00:04:35 -- accel/accel.sh@20 -- # IFS=: 00:06:10.613 00:04:35 -- accel/accel.sh@20 -- # read -r var val 00:06:10.613 00:04:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:10.613 00:04:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:10.613 00:04:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.613 00:04:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.613 00:04:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.613 00:04:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.613 00:04:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.613 00:04:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.613 00:04:35 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.613 00:04:35 -- accel/accel.sh@42 -- # jq -r . 00:06:10.613 [2024-11-30 00:04:35.893117] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:10.613 [2024-11-30 00:04:35.893214] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709510 ] 00:06:10.613 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.613 [2024-11-30 00:04:35.964044] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.613 [2024-11-30 00:04:36.031109] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.613 00:04:36 -- accel/accel.sh@21 -- # val= 00:06:10.613 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.613 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.613 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.613 00:04:36 -- accel/accel.sh@21 -- # val= 00:06:10.613 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.613 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.613 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val=0x1 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val= 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val= 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val=crc32c 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- 
accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val=32 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val= 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val=software 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@23 -- # accel_module=software 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val=32 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val=32 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val=1 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 
-- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val=Yes 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val= 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:10.614 00:04:36 -- accel/accel.sh@21 -- # val= 00:06:10.614 00:04:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # IFS=: 00:06:10.614 00:04:36 -- accel/accel.sh@20 -- # read -r var val 00:06:11.990 00:04:37 -- accel/accel.sh@21 -- # val= 00:06:11.990 00:04:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # IFS=: 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # read -r var val 00:06:11.990 00:04:37 -- accel/accel.sh@21 -- # val= 00:06:11.990 00:04:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # IFS=: 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # read -r var val 00:06:11.990 00:04:37 -- accel/accel.sh@21 -- # val= 00:06:11.990 00:04:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # IFS=: 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # read -r var val 00:06:11.990 00:04:37 -- accel/accel.sh@21 -- # val= 00:06:11.990 00:04:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # IFS=: 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # read -r var val 00:06:11.990 00:04:37 -- accel/accel.sh@21 -- # val= 00:06:11.990 00:04:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # IFS=: 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # read -r var val 00:06:11.990 00:04:37 -- accel/accel.sh@21 -- # val= 00:06:11.990 00:04:37 -- accel/accel.sh@22 -- # 
case "$var" in 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # IFS=: 00:06:11.990 00:04:37 -- accel/accel.sh@20 -- # read -r var val 00:06:11.990 00:04:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:11.990 00:04:37 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:11.990 00:04:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:11.990 00:06:11.990 real 0m2.676s 00:06:11.990 user 0m2.422s 00:06:11.990 sys 0m0.261s 00:06:11.990 00:04:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:11.990 00:04:37 -- common/autotest_common.sh@10 -- # set +x 00:06:11.990 ************************************ 00:06:11.990 END TEST accel_crc32c 00:06:11.990 ************************************ 00:06:11.990 00:04:37 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:11.990 00:04:37 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:11.990 00:04:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:11.990 00:04:37 -- common/autotest_common.sh@10 -- # set +x 00:06:11.990 ************************************ 00:06:11.990 START TEST accel_crc32c_C2 00:06:11.990 ************************************ 00:06:11.990 00:04:37 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:11.990 00:04:37 -- accel/accel.sh@16 -- # local accel_opc 00:06:11.990 00:04:37 -- accel/accel.sh@17 -- # local accel_module 00:06:11.990 00:04:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:11.990 00:04:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:11.990 00:04:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.990 00:04:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.990 00:04:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.990 00:04:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.990 00:04:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.990 00:04:37 -- 
accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.990 00:04:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.990 00:04:37 -- accel/accel.sh@42 -- # jq -r . 00:06:11.990 [2024-11-30 00:04:37.278042] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:11.990 [2024-11-30 00:04:37.278131] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709745 ] 00:06:11.990 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.990 [2024-11-30 00:04:37.347576] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.990 [2024-11-30 00:04:37.416215] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.369 00:04:38 -- accel/accel.sh@18 -- # out=' 00:06:13.369 SPDK Configuration: 00:06:13.369 Core mask: 0x1 00:06:13.369 00:06:13.369 Accel Perf Configuration: 00:06:13.369 Workload Type: crc32c 00:06:13.369 CRC-32C seed: 0 00:06:13.369 Transfer size: 4096 bytes 00:06:13.369 Vector count 2 00:06:13.369 Module: software 00:06:13.369 Queue depth: 32 00:06:13.369 Allocate depth: 32 00:06:13.369 # threads/core: 1 00:06:13.369 Run time: 1 seconds 00:06:13.369 Verify: Yes 00:06:13.369 00:06:13.369 Running for 1 seconds... 
00:06:13.369 00:06:13.369 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:13.369 ------------------------------------------------------------------------------------ 00:06:13.369 0,0 618240/s 4830 MiB/s 0 0 00:06:13.369 ==================================================================================== 00:06:13.369 Total 618240/s 2415 MiB/s 0 0' 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:13.369 00:04:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:13.369 00:04:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:13.369 00:04:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:13.369 00:04:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:13.369 00:04:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:13.369 00:04:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:13.369 00:04:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:13.369 00:04:38 -- accel/accel.sh@41 -- # local IFS=, 00:06:13.369 00:04:38 -- accel/accel.sh@42 -- # jq -r . 00:06:13.369 [2024-11-30 00:04:38.607076] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:13.369 [2024-11-30 00:04:38.607168] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709924 ] 00:06:13.369 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.369 [2024-11-30 00:04:38.677185] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.369 [2024-11-30 00:04:38.745627] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val= 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val= 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val=0x1 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val= 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val= 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val=crc32c 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- 
accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val=0 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val= 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val=software 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@23 -- # accel_module=software 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val=32 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val=32 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val=1 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- 
# read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val=Yes 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val= 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:13.369 00:04:38 -- accel/accel.sh@21 -- # val= 00:06:13.369 00:04:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # IFS=: 00:06:13.369 00:04:38 -- accel/accel.sh@20 -- # read -r var val 00:06:14.774 00:04:39 -- accel/accel.sh@21 -- # val= 00:06:14.774 00:04:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.774 00:04:39 -- accel/accel.sh@20 -- # IFS=: 00:06:14.774 00:04:39 -- accel/accel.sh@20 -- # read -r var val 00:06:14.774 00:04:39 -- accel/accel.sh@21 -- # val= 00:06:14.774 00:04:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.774 00:04:39 -- accel/accel.sh@20 -- # IFS=: 00:06:14.774 00:04:39 -- accel/accel.sh@20 -- # read -r var val 00:06:14.774 00:04:39 -- accel/accel.sh@21 -- # val= 00:06:14.774 00:04:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.774 00:04:39 -- accel/accel.sh@20 -- # IFS=: 00:06:14.774 00:04:39 -- accel/accel.sh@20 -- # read -r var val 00:06:14.774 00:04:39 -- accel/accel.sh@21 -- # val= 00:06:14.774 00:04:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.774 00:04:39 -- accel/accel.sh@20 -- # IFS=: 00:06:14.775 00:04:39 -- accel/accel.sh@20 -- # read -r var val 00:06:14.775 00:04:39 -- accel/accel.sh@21 -- # val= 00:06:14.775 00:04:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.775 00:04:39 -- accel/accel.sh@20 -- # IFS=: 00:06:14.775 00:04:39 -- accel/accel.sh@20 -- # read -r var val 00:06:14.775 00:04:39 -- accel/accel.sh@21 -- # val= 00:06:14.775 00:04:39 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:14.775 00:04:39 -- accel/accel.sh@20 -- # IFS=: 00:06:14.775 00:04:39 -- accel/accel.sh@20 -- # read -r var val 00:06:14.775 00:04:39 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:14.775 00:04:39 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:14.775 00:04:39 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:14.775 00:06:14.775 real 0m2.664s 00:06:14.775 user 0m2.424s 00:06:14.775 sys 0m0.251s 00:06:14.775 00:04:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:14.775 00:04:39 -- common/autotest_common.sh@10 -- # set +x 00:06:14.775 ************************************ 00:06:14.775 END TEST accel_crc32c_C2 00:06:14.775 ************************************ 00:06:14.775 00:04:39 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:14.775 00:04:39 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:14.775 00:04:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.775 00:04:39 -- common/autotest_common.sh@10 -- # set +x 00:06:14.775 ************************************ 00:06:14.775 START TEST accel_copy 00:06:14.775 ************************************ 00:06:14.775 00:04:39 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:06:14.775 00:04:39 -- accel/accel.sh@16 -- # local accel_opc 00:06:14.775 00:04:39 -- accel/accel.sh@17 -- # local accel_module 00:06:14.775 00:04:39 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:14.775 00:04:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:14.775 00:04:39 -- accel/accel.sh@12 -- # build_accel_config 00:06:14.775 00:04:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:14.775 00:04:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:14.775 00:04:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:14.775 00:04:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:14.775 00:04:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 
00:06:14.775 00:04:39 -- accel/accel.sh@41 -- # local IFS=, 00:06:14.775 00:04:39 -- accel/accel.sh@42 -- # jq -r . 00:06:14.775 [2024-11-30 00:04:39.991225] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:14.775 [2024-11-30 00:04:39.991306] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710130 ] 00:06:14.775 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.775 [2024-11-30 00:04:40.064124] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.775 [2024-11-30 00:04:40.140173] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.178 00:04:41 -- accel/accel.sh@18 -- # out=' 00:06:16.178 SPDK Configuration: 00:06:16.178 Core mask: 0x1 00:06:16.178 00:06:16.178 Accel Perf Configuration: 00:06:16.178 Workload Type: copy 00:06:16.178 Transfer size: 4096 bytes 00:06:16.178 Vector count 1 00:06:16.178 Module: software 00:06:16.178 Queue depth: 32 00:06:16.178 Allocate depth: 32 00:06:16.178 # threads/core: 1 00:06:16.178 Run time: 1 seconds 00:06:16.178 Verify: Yes 00:06:16.178 00:06:16.178 Running for 1 seconds... 
00:06:16.178 00:06:16.178 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:16.178 ------------------------------------------------------------------------------------ 00:06:16.178 0,0 540960/s 2113 MiB/s 0 0 00:06:16.178 ==================================================================================== 00:06:16.178 Total 540960/s 2113 MiB/s 0 0' 00:06:16.178 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.178 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.178 00:04:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:16.178 00:04:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:16.178 00:04:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:16.178 00:04:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:16.178 00:04:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.178 00:04:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.178 00:04:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:16.178 00:04:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:16.178 00:04:41 -- accel/accel.sh@41 -- # local IFS=, 00:06:16.178 00:04:41 -- accel/accel.sh@42 -- # jq -r . 00:06:16.178 [2024-11-30 00:04:41.330656] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:16.178 [2024-11-30 00:04:41.330752] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710377 ] 00:06:16.178 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.178 [2024-11-30 00:04:41.399005] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.178 [2024-11-30 00:04:41.465975] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.178 00:04:41 -- accel/accel.sh@21 -- # val= 00:06:16.178 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.178 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.178 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.178 00:04:41 -- accel/accel.sh@21 -- # val= 00:06:16.178 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.178 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.178 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.178 00:04:41 -- accel/accel.sh@21 -- # val=0x1 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val= 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val= 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val=copy 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- 
accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val= 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val=software 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@23 -- # accel_module=software 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val=32 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val=32 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val=1 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val=Yes 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 
-- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val= 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:16.179 00:04:41 -- accel/accel.sh@21 -- # val= 00:06:16.179 00:04:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # IFS=: 00:06:16.179 00:04:41 -- accel/accel.sh@20 -- # read -r var val 00:06:17.116 00:04:42 -- accel/accel.sh@21 -- # val= 00:06:17.116 00:04:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # IFS=: 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # read -r var val 00:06:17.116 00:04:42 -- accel/accel.sh@21 -- # val= 00:06:17.116 00:04:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # IFS=: 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # read -r var val 00:06:17.116 00:04:42 -- accel/accel.sh@21 -- # val= 00:06:17.116 00:04:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # IFS=: 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # read -r var val 00:06:17.116 00:04:42 -- accel/accel.sh@21 -- # val= 00:06:17.116 00:04:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # IFS=: 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # read -r var val 00:06:17.116 00:04:42 -- accel/accel.sh@21 -- # val= 00:06:17.116 00:04:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # IFS=: 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # read -r var val 00:06:17.116 00:04:42 -- accel/accel.sh@21 -- # val= 00:06:17.116 00:04:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # IFS=: 00:06:17.116 00:04:42 -- accel/accel.sh@20 -- # read -r var val 00:06:17.116 00:04:42 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:17.116 00:04:42 -- 
accel/accel.sh@28 -- # [[ -n copy ]] 00:06:17.116 00:04:42 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:17.116 00:06:17.116 real 0m2.672s 00:06:17.116 user 0m2.424s 00:06:17.116 sys 0m0.258s 00:06:17.116 00:04:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:17.116 00:04:42 -- common/autotest_common.sh@10 -- # set +x 00:06:17.116 ************************************ 00:06:17.116 END TEST accel_copy 00:06:17.116 ************************************ 00:06:17.375 00:04:42 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:17.375 00:04:42 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:17.375 00:04:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:17.375 00:04:42 -- common/autotest_common.sh@10 -- # set +x 00:06:17.375 ************************************ 00:06:17.375 START TEST accel_fill 00:06:17.375 ************************************ 00:06:17.375 00:04:42 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:17.375 00:04:42 -- accel/accel.sh@16 -- # local accel_opc 00:06:17.375 00:04:42 -- accel/accel.sh@17 -- # local accel_module 00:06:17.375 00:04:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:17.375 00:04:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:17.375 00:04:42 -- accel/accel.sh@12 -- # build_accel_config 00:06:17.375 00:04:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:17.375 00:04:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:17.375 00:04:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:17.375 00:04:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:17.375 00:04:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:17.375 00:04:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:17.375 00:04:42 -- accel/accel.sh@42 -- # jq -r . 
00:06:17.375 [2024-11-30 00:04:42.711534] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:17.375 [2024-11-30 00:04:42.711629] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710660 ] 00:06:17.375 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.375 [2024-11-30 00:04:42.782009] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.375 [2024-11-30 00:04:42.849611] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.753 00:04:44 -- accel/accel.sh@18 -- # out=' 00:06:18.753 SPDK Configuration: 00:06:18.753 Core mask: 0x1 00:06:18.753 00:06:18.753 Accel Perf Configuration: 00:06:18.753 Workload Type: fill 00:06:18.753 Fill pattern: 0x80 00:06:18.753 Transfer size: 4096 bytes 00:06:18.753 Vector count 1 00:06:18.753 Module: software 00:06:18.753 Queue depth: 64 00:06:18.753 Allocate depth: 64 00:06:18.753 # threads/core: 1 00:06:18.753 Run time: 1 seconds 00:06:18.753 Verify: Yes 00:06:18.753 00:06:18.753 Running for 1 seconds... 
00:06:18.753 00:06:18.753 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:18.753 ------------------------------------------------------------------------------------ 00:06:18.753 0,0 961024/s 3754 MiB/s 0 0 00:06:18.753 ==================================================================================== 00:06:18.753 Total 961024/s 3754 MiB/s 0 0' 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.753 00:04:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:18.753 00:04:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:18.753 00:04:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.753 00:04:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:18.753 00:04:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.753 00:04:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.753 00:04:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:18.753 00:04:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:18.753 00:04:44 -- accel/accel.sh@41 -- # local IFS=, 00:06:18.753 00:04:44 -- accel/accel.sh@42 -- # jq -r . 00:06:18.753 [2024-11-30 00:04:44.038549] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:18.753 [2024-11-30 00:04:44.038647] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710926 ] 00:06:18.753 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.753 [2024-11-30 00:04:44.107110] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.753 [2024-11-30 00:04:44.173362] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.753 00:04:44 -- accel/accel.sh@21 -- # val= 00:06:18.753 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.753 00:04:44 -- accel/accel.sh@21 -- # val= 00:06:18.753 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.753 00:04:44 -- accel/accel.sh@21 -- # val=0x1 00:06:18.753 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.753 00:04:44 -- accel/accel.sh@21 -- # val= 00:06:18.753 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.753 00:04:44 -- accel/accel.sh@21 -- # val= 00:06:18.753 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.753 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.753 00:04:44 -- accel/accel.sh@21 -- # val=fill 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- 
accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val=0x80 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val= 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val=software 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@23 -- # accel_module=software 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val=64 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val=64 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val=1 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 
-- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val=Yes 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val= 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:18.754 00:04:44 -- accel/accel.sh@21 -- # val= 00:06:18.754 00:04:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # IFS=: 00:06:18.754 00:04:44 -- accel/accel.sh@20 -- # read -r var val 00:06:20.144 00:04:45 -- accel/accel.sh@21 -- # val= 00:06:20.144 00:04:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # IFS=: 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # read -r var val 00:06:20.144 00:04:45 -- accel/accel.sh@21 -- # val= 00:06:20.144 00:04:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # IFS=: 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # read -r var val 00:06:20.144 00:04:45 -- accel/accel.sh@21 -- # val= 00:06:20.144 00:04:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # IFS=: 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # read -r var val 00:06:20.144 00:04:45 -- accel/accel.sh@21 -- # val= 00:06:20.144 00:04:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # IFS=: 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # read -r var val 00:06:20.144 00:04:45 -- accel/accel.sh@21 -- # val= 00:06:20.144 00:04:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # IFS=: 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # read -r var val 00:06:20.144 00:04:45 -- accel/accel.sh@21 -- # val= 00:06:20.144 00:04:45 -- accel/accel.sh@22 -- # 
case "$var" in 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # IFS=: 00:06:20.144 00:04:45 -- accel/accel.sh@20 -- # read -r var val 00:06:20.144 00:04:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:20.145 00:04:45 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:20.145 00:04:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:20.145 00:06:20.145 real 0m2.659s 00:06:20.145 user 0m2.418s 00:06:20.145 sys 0m0.249s 00:06:20.145 00:04:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:20.145 00:04:45 -- common/autotest_common.sh@10 -- # set +x 00:06:20.145 ************************************ 00:06:20.145 END TEST accel_fill 00:06:20.145 ************************************ 00:06:20.145 00:04:45 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:20.145 00:04:45 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:20.145 00:04:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:20.145 00:04:45 -- common/autotest_common.sh@10 -- # set +x 00:06:20.145 ************************************ 00:06:20.145 START TEST accel_copy_crc32c 00:06:20.145 ************************************ 00:06:20.145 00:04:45 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:06:20.145 00:04:45 -- accel/accel.sh@16 -- # local accel_opc 00:06:20.145 00:04:45 -- accel/accel.sh@17 -- # local accel_module 00:06:20.145 00:04:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:20.145 00:04:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:20.145 00:04:45 -- accel/accel.sh@12 -- # build_accel_config 00:06:20.145 00:04:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:20.145 00:04:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.145 00:04:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.145 00:04:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:20.145 00:04:45 -- 
accel/accel.sh@37 -- # [[ -n '' ]] 00:06:20.145 00:04:45 -- accel/accel.sh@41 -- # local IFS=, 00:06:20.145 00:04:45 -- accel/accel.sh@42 -- # jq -r . 00:06:20.145 [2024-11-30 00:04:45.419884] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:20.145 [2024-11-30 00:04:45.419966] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2711217 ] 00:06:20.145 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.145 [2024-11-30 00:04:45.489971] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.145 [2024-11-30 00:04:45.557773] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.523 00:04:46 -- accel/accel.sh@18 -- # out=' 00:06:21.523 SPDK Configuration: 00:06:21.523 Core mask: 0x1 00:06:21.523 00:06:21.523 Accel Perf Configuration: 00:06:21.523 Workload Type: copy_crc32c 00:06:21.523 CRC-32C seed: 0 00:06:21.523 Vector size: 4096 bytes 00:06:21.523 Transfer size: 4096 bytes 00:06:21.523 Vector count 1 00:06:21.523 Module: software 00:06:21.523 Queue depth: 32 00:06:21.523 Allocate depth: 32 00:06:21.523 # threads/core: 1 00:06:21.523 Run time: 1 seconds 00:06:21.523 Verify: Yes 00:06:21.523 00:06:21.523 Running for 1 seconds... 
00:06:21.523 00:06:21.523 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:21.523 ------------------------------------------------------------------------------------ 00:06:21.523 0,0 437504/s 1709 MiB/s 0 0 00:06:21.523 ==================================================================================== 00:06:21.523 Total 437504/s 1709 MiB/s 0 0' 00:06:21.523 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.523 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.523 00:04:46 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:21.523 00:04:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:21.523 00:04:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:21.523 00:04:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:21.523 00:04:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:21.523 00:04:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:21.523 00:04:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:21.523 00:04:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:21.523 00:04:46 -- accel/accel.sh@41 -- # local IFS=, 00:06:21.523 00:04:46 -- accel/accel.sh@42 -- # jq -r . 00:06:21.523 [2024-11-30 00:04:46.748972] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:21.523 [2024-11-30 00:04:46.749064] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2711486 ] 00:06:21.523 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.523 [2024-11-30 00:04:46.818198] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.523 [2024-11-30 00:04:46.884321] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.523 00:04:46 -- accel/accel.sh@21 -- # val= 00:06:21.523 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.523 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.523 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.523 00:04:46 -- accel/accel.sh@21 -- # val= 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val=0x1 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val= 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val= 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- 
accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val=0 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val= 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val=software 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@23 -- # accel_module=software 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val=32 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val=32 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val=1 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 
-- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val=Yes 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val= 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:21.524 00:04:46 -- accel/accel.sh@21 -- # val= 00:06:21.524 00:04:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # IFS=: 00:06:21.524 00:04:46 -- accel/accel.sh@20 -- # read -r var val 00:06:22.903 00:04:48 -- accel/accel.sh@21 -- # val= 00:06:22.903 00:04:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # IFS=: 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # read -r var val 00:06:22.903 00:04:48 -- accel/accel.sh@21 -- # val= 00:06:22.903 00:04:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # IFS=: 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # read -r var val 00:06:22.903 00:04:48 -- accel/accel.sh@21 -- # val= 00:06:22.903 00:04:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # IFS=: 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # read -r var val 00:06:22.903 00:04:48 -- accel/accel.sh@21 -- # val= 00:06:22.903 00:04:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # IFS=: 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # read -r var val 00:06:22.903 00:04:48 -- accel/accel.sh@21 -- # val= 00:06:22.903 00:04:48 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # IFS=: 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # read -r var val 00:06:22.903 00:04:48 -- accel/accel.sh@21 -- # val= 00:06:22.903 00:04:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # IFS=: 00:06:22.903 00:04:48 -- accel/accel.sh@20 -- # read -r var val 00:06:22.903 00:04:48 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:22.903 00:04:48 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:22.903 00:04:48 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:22.903 00:06:22.903 real 0m2.663s 00:06:22.903 user 0m2.411s 00:06:22.903 sys 0m0.260s 00:06:22.903 00:04:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:22.903 00:04:48 -- common/autotest_common.sh@10 -- # set +x 00:06:22.903 ************************************ 00:06:22.903 END TEST accel_copy_crc32c 00:06:22.903 ************************************ 00:06:22.903 00:04:48 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:22.903 00:04:48 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:22.903 00:04:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:22.903 00:04:48 -- common/autotest_common.sh@10 -- # set +x 00:06:22.903 ************************************ 00:06:22.903 START TEST accel_copy_crc32c_C2 00:06:22.903 ************************************ 00:06:22.903 00:04:48 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:22.903 00:04:48 -- accel/accel.sh@16 -- # local accel_opc 00:06:22.903 00:04:48 -- accel/accel.sh@17 -- # local accel_module 00:06:22.903 00:04:48 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:22.904 00:04:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:22.904 00:04:48 -- accel/accel.sh@12 -- # 
build_accel_config 00:06:22.904 00:04:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.904 00:04:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.904 00:04:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.904 00:04:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.904 00:04:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.904 00:04:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.904 00:04:48 -- accel/accel.sh@42 -- # jq -r . 00:06:22.904 [2024-11-30 00:04:48.132885] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:22.904 [2024-11-30 00:04:48.132976] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2711753 ] 00:06:22.904 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.904 [2024-11-30 00:04:48.203445] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.904 [2024-11-30 00:04:48.270779] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.283 00:04:49 -- accel/accel.sh@18 -- # out=' 00:06:24.283 SPDK Configuration: 00:06:24.283 Core mask: 0x1 00:06:24.283 00:06:24.283 Accel Perf Configuration: 00:06:24.283 Workload Type: copy_crc32c 00:06:24.283 CRC-32C seed: 0 00:06:24.283 Vector size: 4096 bytes 00:06:24.283 Transfer size: 8192 bytes 00:06:24.283 Vector count 2 00:06:24.283 Module: software 00:06:24.283 Queue depth: 32 00:06:24.283 Allocate depth: 32 00:06:24.283 # threads/core: 1 00:06:24.283 Run time: 1 seconds 00:06:24.283 Verify: Yes 00:06:24.283 00:06:24.283 Running for 1 seconds... 
00:06:24.283 00:06:24.283 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:24.283 ------------------------------------------------------------------------------------ 00:06:24.283 0,0 298016/s 2328 MiB/s 0 0 00:06:24.283 ==================================================================================== 00:06:24.283 Total 298016/s 1164 MiB/s 0 0' 00:06:24.283 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.283 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.283 00:04:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:24.283 00:04:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:24.283 00:04:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.283 00:04:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.283 00:04:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.283 00:04:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.283 00:04:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.283 00:04:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.283 00:04:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.284 00:04:49 -- accel/accel.sh@42 -- # jq -r . 00:06:24.284 [2024-11-30 00:04:49.463002] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:24.284 [2024-11-30 00:04:49.463097] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2711925 ] 00:06:24.284 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.284 [2024-11-30 00:04:49.531827] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.284 [2024-11-30 00:04:49.600221] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val= 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val= 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val=0x1 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val= 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val= 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- 
accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val=0 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val= 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val=software 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@23 -- # accel_module=software 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val=32 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val=32 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val=1 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 
-- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val=Yes 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val= 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:24.284 00:04:49 -- accel/accel.sh@21 -- # val= 00:06:24.284 00:04:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # IFS=: 00:06:24.284 00:04:49 -- accel/accel.sh@20 -- # read -r var val 00:06:25.222 00:04:50 -- accel/accel.sh@21 -- # val= 00:06:25.222 00:04:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # IFS=: 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # read -r var val 00:06:25.222 00:04:50 -- accel/accel.sh@21 -- # val= 00:06:25.222 00:04:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # IFS=: 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # read -r var val 00:06:25.222 00:04:50 -- accel/accel.sh@21 -- # val= 00:06:25.222 00:04:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # IFS=: 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # read -r var val 00:06:25.222 00:04:50 -- accel/accel.sh@21 -- # val= 00:06:25.222 00:04:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # IFS=: 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # read -r var val 00:06:25.222 00:04:50 -- accel/accel.sh@21 -- # val= 00:06:25.222 00:04:50 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # IFS=: 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # read -r var val 00:06:25.222 00:04:50 -- accel/accel.sh@21 -- # val= 00:06:25.222 00:04:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # IFS=: 00:06:25.222 00:04:50 -- accel/accel.sh@20 -- # read -r var val 00:06:25.222 00:04:50 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:25.222 00:04:50 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:25.222 00:04:50 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:25.222 00:06:25.222 real 0m2.664s 00:06:25.222 user 0m2.423s 00:06:25.222 sys 0m0.251s 00:06:25.222 00:04:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:25.222 00:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:25.222 ************************************ 00:06:25.222 END TEST accel_copy_crc32c_C2 00:06:25.222 ************************************ 00:06:25.484 00:04:50 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:25.484 00:04:50 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:25.485 00:04:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:25.485 00:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:25.485 ************************************ 00:06:25.485 START TEST accel_dualcast 00:06:25.485 ************************************ 00:06:25.485 00:04:50 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:06:25.485 00:04:50 -- accel/accel.sh@16 -- # local accel_opc 00:06:25.485 00:04:50 -- accel/accel.sh@17 -- # local accel_module 00:06:25.485 00:04:50 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:25.485 00:04:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:25.485 00:04:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:25.485 00:04:50 
-- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:25.485 00:04:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:25.485 00:04:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:25.485 00:04:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:25.485 00:04:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:25.485 00:04:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:25.485 00:04:50 -- accel/accel.sh@42 -- # jq -r . 00:06:25.485 [2024-11-30 00:04:50.845234] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:25.485 [2024-11-30 00:04:50.845316] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2712123 ] 00:06:25.485 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.485 [2024-11-30 00:04:50.915308] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.485 [2024-11-30 00:04:50.984649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.862 00:04:52 -- accel/accel.sh@18 -- # out=' 00:06:26.862 SPDK Configuration: 00:06:26.862 Core mask: 0x1 00:06:26.862 00:06:26.862 Accel Perf Configuration: 00:06:26.862 Workload Type: dualcast 00:06:26.862 Transfer size: 4096 bytes 00:06:26.862 Vector count 1 00:06:26.862 Module: software 00:06:26.862 Queue depth: 32 00:06:26.862 Allocate depth: 32 00:06:26.862 # threads/core: 1 00:06:26.862 Run time: 1 seconds 00:06:26.862 Verify: Yes 00:06:26.862 00:06:26.862 Running for 1 seconds... 
00:06:26.862 00:06:26.862 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:26.862 ------------------------------------------------------------------------------------ 00:06:26.862 0,0 622624/s 2432 MiB/s 0 0 00:06:26.862 ==================================================================================== 00:06:26.862 Total 622624/s 2432 MiB/s 0 0' 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:26.862 00:04:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:26.862 00:04:52 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.862 00:04:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.862 00:04:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.862 00:04:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.862 00:04:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.862 00:04:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.862 00:04:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.862 00:04:52 -- accel/accel.sh@42 -- # jq -r . 00:06:26.862 [2024-11-30 00:04:52.173942] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:26.862 [2024-11-30 00:04:52.174046] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2712344 ] 00:06:26.862 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.862 [2024-11-30 00:04:52.243317] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.862 [2024-11-30 00:04:52.314209] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val= 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val= 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val=0x1 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val= 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val= 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val=dualcast 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- 
accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val= 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val=software 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@23 -- # accel_module=software 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val=32 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val=32 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val=1 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val=Yes 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.862 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.862 00:04:52 -- accel/accel.sh@20 
-- # read -r var val 00:06:26.862 00:04:52 -- accel/accel.sh@21 -- # val= 00:06:26.862 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.863 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.863 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:26.863 00:04:52 -- accel/accel.sh@21 -- # val= 00:06:26.863 00:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.863 00:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:26.863 00:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:28.240 00:04:53 -- accel/accel.sh@21 -- # val= 00:06:28.240 00:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:28.240 00:04:53 -- accel/accel.sh@21 -- # val= 00:06:28.240 00:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:28.240 00:04:53 -- accel/accel.sh@21 -- # val= 00:06:28.240 00:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:28.240 00:04:53 -- accel/accel.sh@21 -- # val= 00:06:28.240 00:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:28.240 00:04:53 -- accel/accel.sh@21 -- # val= 00:06:28.240 00:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:28.240 00:04:53 -- accel/accel.sh@21 -- # val= 00:06:28.240 00:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:28.240 00:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:28.240 00:04:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:28.240 00:04:53 -- 
accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:28.240 00:04:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:28.240 00:06:28.240 real 0m2.665s 00:06:28.240 user 0m2.430s 00:06:28.240 sys 0m0.242s 00:06:28.240 00:04:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:28.240 00:04:53 -- common/autotest_common.sh@10 -- # set +x 00:06:28.240 ************************************ 00:06:28.240 END TEST accel_dualcast 00:06:28.240 ************************************ 00:06:28.240 00:04:53 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:28.240 00:04:53 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:28.240 00:04:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:28.240 00:04:53 -- common/autotest_common.sh@10 -- # set +x 00:06:28.240 ************************************ 00:06:28.240 START TEST accel_compare 00:06:28.240 ************************************ 00:06:28.240 00:04:53 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:06:28.240 00:04:53 -- accel/accel.sh@16 -- # local accel_opc 00:06:28.240 00:04:53 -- accel/accel.sh@17 -- # local accel_module 00:06:28.240 00:04:53 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:28.240 00:04:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:28.240 00:04:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:28.240 00:04:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:28.240 00:04:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.240 00:04:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.240 00:04:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:28.240 00:04:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:28.240 00:04:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:28.240 00:04:53 -- accel/accel.sh@42 -- # jq -r . 
00:06:28.240 [2024-11-30 00:04:53.560298] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:28.240 [2024-11-30 00:04:53.560391] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2712639 ] 00:06:28.240 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.240 [2024-11-30 00:04:53.628644] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.240 [2024-11-30 00:04:53.697380] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.618 00:04:54 -- accel/accel.sh@18 -- # out=' 00:06:29.618 SPDK Configuration: 00:06:29.618 Core mask: 0x1 00:06:29.618 00:06:29.618 Accel Perf Configuration: 00:06:29.618 Workload Type: compare 00:06:29.618 Transfer size: 4096 bytes 00:06:29.618 Vector count 1 00:06:29.618 Module: software 00:06:29.618 Queue depth: 32 00:06:29.618 Allocate depth: 32 00:06:29.618 # threads/core: 1 00:06:29.618 Run time: 1 seconds 00:06:29.618 Verify: Yes 00:06:29.618 00:06:29.618 Running for 1 seconds... 
00:06:29.618 00:06:29.618 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:29.618 ------------------------------------------------------------------------------------ 00:06:29.618 0,0 811776/s 3171 MiB/s 0 0 00:06:29.618 ==================================================================================== 00:06:29.618 Total 811776/s 3171 MiB/s 0 0' 00:06:29.618 00:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:29.618 00:04:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:29.618 00:04:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:29.618 00:04:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.618 00:04:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.618 00:04:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.618 00:04:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.618 00:04:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.618 00:04:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.618 00:04:54 -- accel/accel.sh@42 -- # jq -r . 00:06:29.618 [2024-11-30 00:04:54.886516] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:29.618 [2024-11-30 00:04:54.886632] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2712908 ] 00:06:29.618 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.618 [2024-11-30 00:04:54.954080] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.618 [2024-11-30 00:04:55.020664] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val= 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val= 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val=0x1 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val= 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val= 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val=compare 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- 
accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val= 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val=software 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@23 -- # accel_module=software 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val=32 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val=32 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.618 00:04:55 -- accel/accel.sh@21 -- # val=1 00:06:29.618 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.618 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.619 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.619 00:04:55 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:29.619 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.619 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.619 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.619 00:04:55 -- accel/accel.sh@21 -- # val=Yes 00:06:29.619 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.619 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.619 00:04:55 -- accel/accel.sh@20 
-- # read -r var val 00:06:29.619 00:04:55 -- accel/accel.sh@21 -- # val= 00:06:29.619 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.619 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.619 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:29.619 00:04:55 -- accel/accel.sh@21 -- # val= 00:06:29.619 00:04:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.619 00:04:55 -- accel/accel.sh@20 -- # IFS=: 00:06:29.619 00:04:55 -- accel/accel.sh@20 -- # read -r var val 00:06:31.009 00:04:56 -- accel/accel.sh@21 -- # val= 00:06:31.009 00:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:31.009 00:04:56 -- accel/accel.sh@21 -- # val= 00:06:31.009 00:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:31.009 00:04:56 -- accel/accel.sh@21 -- # val= 00:06:31.009 00:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:31.009 00:04:56 -- accel/accel.sh@21 -- # val= 00:06:31.009 00:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:31.009 00:04:56 -- accel/accel.sh@21 -- # val= 00:06:31.009 00:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:31.009 00:04:56 -- accel/accel.sh@21 -- # val= 00:06:31.009 00:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:31.009 00:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:31.009 00:04:56 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:31.009 00:04:56 -- 
accel/accel.sh@28 -- # [[ -n compare ]] 00:06:31.009 00:04:56 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:31.009 00:06:31.009 real 0m2.655s 00:06:31.009 user 0m2.405s 00:06:31.009 sys 0m0.259s 00:06:31.009 00:04:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.009 00:04:56 -- common/autotest_common.sh@10 -- # set +x 00:06:31.009 ************************************ 00:06:31.009 END TEST accel_compare 00:06:31.009 ************************************ 00:06:31.009 00:04:56 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:31.009 00:04:56 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:31.009 00:04:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.009 00:04:56 -- common/autotest_common.sh@10 -- # set +x 00:06:31.009 ************************************ 00:06:31.009 START TEST accel_xor 00:06:31.009 ************************************ 00:06:31.009 00:04:56 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:06:31.009 00:04:56 -- accel/accel.sh@16 -- # local accel_opc 00:06:31.009 00:04:56 -- accel/accel.sh@17 -- # local accel_module 00:06:31.009 00:04:56 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:31.009 00:04:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:31.009 00:04:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.009 00:04:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.009 00:04:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.009 00:04:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.009 00:04:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.009 00:04:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.009 00:04:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.009 00:04:56 -- accel/accel.sh@42 -- # jq -r . 
00:06:31.009 [2024-11-30 00:04:56.264831] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:31.009 [2024-11-30 00:04:56.264922] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2713189 ] 00:06:31.009 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.009 [2024-11-30 00:04:56.334831] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.009 [2024-11-30 00:04:56.402330] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.388 00:04:57 -- accel/accel.sh@18 -- # out=' 00:06:32.388 SPDK Configuration: 00:06:32.388 Core mask: 0x1 00:06:32.388 00:06:32.388 Accel Perf Configuration: 00:06:32.388 Workload Type: xor 00:06:32.388 Source buffers: 2 00:06:32.388 Transfer size: 4096 bytes 00:06:32.388 Vector count 1 00:06:32.388 Module: software 00:06:32.388 Queue depth: 32 00:06:32.388 Allocate depth: 32 00:06:32.388 # threads/core: 1 00:06:32.388 Run time: 1 seconds 00:06:32.388 Verify: Yes 00:06:32.388 00:06:32.388 Running for 1 seconds... 
00:06:32.388 00:06:32.388 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:32.388 ------------------------------------------------------------------------------------ 00:06:32.388 0,0 709152/s 2770 MiB/s 0 0 00:06:32.388 ==================================================================================== 00:06:32.388 Total 709152/s 2770 MiB/s 0 0' 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:32.388 00:04:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:32.388 00:04:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.388 00:04:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.388 00:04:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.388 00:04:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.388 00:04:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.388 00:04:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.388 00:04:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.388 00:04:57 -- accel/accel.sh@42 -- # jq -r . 00:06:32.388 [2024-11-30 00:04:57.591263] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:32.388 [2024-11-30 00:04:57.591358] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2713458 ] 00:06:32.388 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.388 [2024-11-30 00:04:57.660678] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.388 [2024-11-30 00:04:57.726662] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val= 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val= 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val=0x1 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val= 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val= 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val=xor 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- 
accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val=2 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val= 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val=software 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@23 -- # accel_module=software 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val=32 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val=32 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val=1 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- 
# read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val=Yes 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val= 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:32.388 00:04:57 -- accel/accel.sh@21 -- # val= 00:06:32.388 00:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:32.388 00:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:33.766 00:04:58 -- accel/accel.sh@21 -- # val= 00:06:33.766 00:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.766 00:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:33.766 00:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:33.767 00:04:58 -- accel/accel.sh@21 -- # val= 00:06:33.767 00:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:33.767 00:04:58 -- accel/accel.sh@21 -- # val= 00:06:33.767 00:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:33.767 00:04:58 -- accel/accel.sh@21 -- # val= 00:06:33.767 00:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:33.767 00:04:58 -- accel/accel.sh@21 -- # val= 00:06:33.767 00:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:33.767 00:04:58 -- accel/accel.sh@21 -- # val= 00:06:33.767 00:04:58 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:33.767 00:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:33.767 00:04:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:33.767 00:04:58 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:33.767 00:04:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:33.767 00:06:33.767 real 0m2.660s 00:06:33.767 user 0m2.398s 00:06:33.767 sys 0m0.269s 00:06:33.767 00:04:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.767 00:04:58 -- common/autotest_common.sh@10 -- # set +x 00:06:33.767 ************************************ 00:06:33.767 END TEST accel_xor 00:06:33.767 ************************************ 00:06:33.767 00:04:58 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:33.767 00:04:58 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:33.767 00:04:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.767 00:04:58 -- common/autotest_common.sh@10 -- # set +x 00:06:33.767 ************************************ 00:06:33.767 START TEST accel_xor 00:06:33.767 ************************************ 00:06:33.767 00:04:58 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:06:33.767 00:04:58 -- accel/accel.sh@16 -- # local accel_opc 00:06:33.767 00:04:58 -- accel/accel.sh@17 -- # local accel_module 00:06:33.767 00:04:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:33.767 00:04:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:33.767 00:04:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:33.767 00:04:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:33.767 00:04:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.767 00:04:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.767 00:04:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:33.767 00:04:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 
00:06:33.767 00:04:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:33.767 00:04:58 -- accel/accel.sh@42 -- # jq -r . 00:06:33.767 [2024-11-30 00:04:58.974574] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:33.767 [2024-11-30 00:04:58.974674] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2713745 ] 00:06:33.767 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.767 [2024-11-30 00:04:59.045707] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.767 [2024-11-30 00:04:59.111209] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.144 00:05:00 -- accel/accel.sh@18 -- # out=' 00:06:35.144 SPDK Configuration: 00:06:35.144 Core mask: 0x1 00:06:35.144 00:06:35.144 Accel Perf Configuration: 00:06:35.144 Workload Type: xor 00:06:35.144 Source buffers: 3 00:06:35.144 Transfer size: 4096 bytes 00:06:35.144 Vector count 1 00:06:35.144 Module: software 00:06:35.144 Queue depth: 32 00:06:35.144 Allocate depth: 32 00:06:35.144 # threads/core: 1 00:06:35.144 Run time: 1 seconds 00:06:35.144 Verify: Yes 00:06:35.144 00:06:35.144 Running for 1 seconds... 
00:06:35.144 00:06:35.144 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:35.144 ------------------------------------------------------------------------------------ 00:06:35.144 0,0 660992/s 2582 MiB/s 0 0 00:06:35.145 ==================================================================================== 00:06:35.145 Total 660992/s 2582 MiB/s 0 0' 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:35.145 00:05:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:35.145 00:05:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.145 00:05:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.145 00:05:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.145 00:05:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.145 00:05:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.145 00:05:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.145 00:05:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.145 00:05:00 -- accel/accel.sh@42 -- # jq -r . 00:06:35.145 [2024-11-30 00:05:00.302411] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:35.145 [2024-11-30 00:05:00.302503] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2713913 ] 00:06:35.145 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.145 [2024-11-30 00:05:00.371158] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.145 [2024-11-30 00:05:00.439403] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val= 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val= 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val=0x1 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val= 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val= 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val=xor 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- 
accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val=3 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val= 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val=software 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@23 -- # accel_module=software 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val=32 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val=32 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val=1 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- 
# read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val=Yes 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val= 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:35.145 00:05:00 -- accel/accel.sh@21 -- # val= 00:06:35.145 00:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:35.145 00:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:36.080 00:05:01 -- accel/accel.sh@21 -- # val= 00:06:36.080 00:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:36.080 00:05:01 -- accel/accel.sh@21 -- # val= 00:06:36.080 00:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:36.080 00:05:01 -- accel/accel.sh@21 -- # val= 00:06:36.080 00:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:36.080 00:05:01 -- accel/accel.sh@21 -- # val= 00:06:36.080 00:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:36.080 00:05:01 -- accel/accel.sh@21 -- # val= 00:06:36.080 00:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:36.080 00:05:01 -- accel/accel.sh@21 -- # val= 00:06:36.080 00:05:01 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:36.080 00:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:36.080 00:05:01 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:36.080 00:05:01 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:36.080 00:05:01 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:36.080 00:06:36.080 real 0m2.662s 00:06:36.080 user 0m2.408s 00:06:36.080 sys 0m0.264s 00:06:36.080 00:05:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.080 00:05:01 -- common/autotest_common.sh@10 -- # set +x 00:06:36.080 ************************************ 00:06:36.080 END TEST accel_xor 00:06:36.080 ************************************ 00:06:36.339 00:05:01 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:36.339 00:05:01 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:36.339 00:05:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.339 00:05:01 -- common/autotest_common.sh@10 -- # set +x 00:06:36.339 ************************************ 00:06:36.339 START TEST accel_dif_verify 00:06:36.339 ************************************ 00:06:36.339 00:05:01 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:06:36.339 00:05:01 -- accel/accel.sh@16 -- # local accel_opc 00:06:36.339 00:05:01 -- accel/accel.sh@17 -- # local accel_module 00:06:36.339 00:05:01 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:36.339 00:05:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:36.339 00:05:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.339 00:05:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.339 00:05:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.339 00:05:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.339 00:05:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.339 00:05:01 -- accel/accel.sh@37 -- # [[ 
-n '' ]] 00:06:36.339 00:05:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.339 00:05:01 -- accel/accel.sh@42 -- # jq -r . 00:06:36.339 [2024-11-30 00:05:01.685830] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:36.339 [2024-11-30 00:05:01.685918] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714117 ] 00:06:36.339 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.339 [2024-11-30 00:05:01.754441] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.339 [2024-11-30 00:05:01.822655] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.739 00:05:02 -- accel/accel.sh@18 -- # out=' 00:06:37.739 SPDK Configuration: 00:06:37.739 Core mask: 0x1 00:06:37.739 00:06:37.739 Accel Perf Configuration: 00:06:37.739 Workload Type: dif_verify 00:06:37.739 Vector size: 4096 bytes 00:06:37.739 Transfer size: 4096 bytes 00:06:37.739 Block size: 512 bytes 00:06:37.739 Metadata size: 8 bytes 00:06:37.739 Vector count 1 00:06:37.739 Module: software 00:06:37.739 Queue depth: 32 00:06:37.739 Allocate depth: 32 00:06:37.739 # threads/core: 1 00:06:37.739 Run time: 1 seconds 00:06:37.739 Verify: No 00:06:37.739 00:06:37.739 Running for 1 seconds... 
00:06:37.739 00:06:37.739 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:37.739 ------------------------------------------------------------------------------------ 00:06:37.739 0,0 247808/s 983 MiB/s 0 0 00:06:37.739 ==================================================================================== 00:06:37.739 Total 247808/s 968 MiB/s 0 0' 00:06:37.739 00:05:02 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:02 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:37.739 00:05:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:37.739 00:05:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.739 00:05:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.739 00:05:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.739 00:05:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.739 00:05:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.739 00:05:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.739 00:05:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.739 00:05:02 -- accel/accel.sh@42 -- # jq -r . 00:06:37.739 [2024-11-30 00:05:03.014386] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:37.739 [2024-11-30 00:05:03.014474] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714326 ] 00:06:37.739 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.739 [2024-11-30 00:05:03.083770] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.739 [2024-11-30 00:05:03.150432] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val= 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val= 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val=0x1 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val= 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val= 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val=dif_verify 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- 
accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val= 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val=software 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.739 00:05:03 -- accel/accel.sh@23 -- # accel_module=software 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.739 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.739 00:05:03 -- accel/accel.sh@21 -- # val=32 00:06:37.739 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.740 00:05:03 -- accel/accel.sh@21 -- # val=32 00:06:37.740 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.740 00:05:03 -- 
accel/accel.sh@20 -- # read -r var val 00:06:37.740 00:05:03 -- accel/accel.sh@21 -- # val=1 00:06:37.740 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.740 00:05:03 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:37.740 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.740 00:05:03 -- accel/accel.sh@21 -- # val=No 00:06:37.740 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.740 00:05:03 -- accel/accel.sh@21 -- # val= 00:06:37.740 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:37.740 00:05:03 -- accel/accel.sh@21 -- # val= 00:06:37.740 00:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:37.740 00:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:38.813 00:05:04 -- accel/accel.sh@21 -- # val= 00:06:38.813 00:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:38.813 00:05:04 -- accel/accel.sh@21 -- # val= 00:06:38.813 00:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:38.813 00:05:04 -- accel/accel.sh@21 -- # val= 00:06:38.813 00:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:38.813 00:05:04 -- accel/accel.sh@21 -- # val= 00:06:38.813 00:05:04 
-- accel/accel.sh@22 -- # case "$var" in 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:38.813 00:05:04 -- accel/accel.sh@21 -- # val= 00:06:38.813 00:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:38.813 00:05:04 -- accel/accel.sh@21 -- # val= 00:06:38.813 00:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:38.813 00:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:38.813 00:05:04 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:38.813 00:05:04 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:38.813 00:05:04 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:38.813 00:06:38.813 real 0m2.660s 00:06:38.813 user 0m2.418s 00:06:38.813 sys 0m0.253s 00:06:38.813 00:05:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.813 00:05:04 -- common/autotest_common.sh@10 -- # set +x 00:06:38.813 ************************************ 00:06:38.813 END TEST accel_dif_verify 00:06:38.813 ************************************ 00:06:38.813 00:05:04 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:38.813 00:05:04 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:38.813 00:05:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.813 00:05:04 -- common/autotest_common.sh@10 -- # set +x 00:06:39.071 ************************************ 00:06:39.071 START TEST accel_dif_generate 00:06:39.071 ************************************ 00:06:39.071 00:05:04 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:06:39.071 00:05:04 -- accel/accel.sh@16 -- # local accel_opc 00:06:39.071 00:05:04 -- accel/accel.sh@17 -- # local accel_module 00:06:39.071 00:05:04 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 
00:06:39.071 00:05:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:39.071 00:05:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.071 00:05:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.071 00:05:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.071 00:05:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.071 00:05:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.072 00:05:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.072 00:05:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.072 00:05:04 -- accel/accel.sh@42 -- # jq -r . 00:06:39.072 [2024-11-30 00:05:04.394639] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:39.072 [2024-11-30 00:05:04.394731] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714629 ] 00:06:39.072 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.072 [2024-11-30 00:05:04.463200] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.072 [2024-11-30 00:05:04.529393] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.454 00:05:05 -- accel/accel.sh@18 -- # out=' 00:06:40.454 SPDK Configuration: 00:06:40.454 Core mask: 0x1 00:06:40.454 00:06:40.454 Accel Perf Configuration: 00:06:40.454 Workload Type: dif_generate 00:06:40.454 Vector size: 4096 bytes 00:06:40.454 Transfer size: 4096 bytes 00:06:40.454 Block size: 512 bytes 00:06:40.454 Metadata size: 8 bytes 00:06:40.454 Vector count 1 00:06:40.454 Module: software 00:06:40.454 Queue depth: 32 00:06:40.454 Allocate depth: 32 00:06:40.454 # threads/core: 1 00:06:40.454 Run time: 1 seconds 00:06:40.454 Verify: No 00:06:40.454 00:06:40.454 Running for 1 seconds... 
00:06:40.454 00:06:40.454 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:40.454 ------------------------------------------------------------------------------------ 00:06:40.454 0,0 289344/s 1147 MiB/s 0 0 00:06:40.454 ==================================================================================== 00:06:40.454 Total 289344/s 1130 MiB/s 0 0' 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:40.454 00:05:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:40.454 00:05:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.454 00:05:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.454 00:05:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.454 00:05:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.454 00:05:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.454 00:05:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.454 00:05:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.454 00:05:05 -- accel/accel.sh@42 -- # jq -r . 00:06:40.454 [2024-11-30 00:05:05.719728] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:40.454 [2024-11-30 00:05:05.719820] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714897 ] 00:06:40.454 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.454 [2024-11-30 00:05:05.788808] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.454 [2024-11-30 00:05:05.854981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val= 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val= 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val=0x1 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val= 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val= 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val=dif_generate 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 
-- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val= 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val=software 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@23 -- # accel_module=software 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val=32 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val=32 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 
-- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val=1 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val=No 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val= 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:40.454 00:05:05 -- accel/accel.sh@21 -- # val= 00:06:40.454 00:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:40.454 00:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:41.833 00:05:07 -- accel/accel.sh@21 -- # val= 00:06:41.833 00:05:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # IFS=: 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # read -r var val 00:06:41.833 00:05:07 -- accel/accel.sh@21 -- # val= 00:06:41.833 00:05:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # IFS=: 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # read -r var val 00:06:41.833 00:05:07 -- accel/accel.sh@21 -- # val= 00:06:41.833 00:05:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # IFS=: 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # read -r var val 00:06:41.833 00:05:07 -- accel/accel.sh@21 -- # val= 00:06:41.833 
00:05:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # IFS=: 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # read -r var val 00:06:41.833 00:05:07 -- accel/accel.sh@21 -- # val= 00:06:41.833 00:05:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # IFS=: 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # read -r var val 00:06:41.833 00:05:07 -- accel/accel.sh@21 -- # val= 00:06:41.833 00:05:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # IFS=: 00:06:41.833 00:05:07 -- accel/accel.sh@20 -- # read -r var val 00:06:41.833 00:05:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:41.833 00:05:07 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:41.833 00:05:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:41.833 00:06:41.833 real 0m2.652s 00:06:41.833 user 0m2.405s 00:06:41.833 sys 0m0.247s 00:06:41.833 00:05:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:41.833 00:05:07 -- common/autotest_common.sh@10 -- # set +x 00:06:41.833 ************************************ 00:06:41.833 END TEST accel_dif_generate 00:06:41.833 ************************************ 00:06:41.833 00:05:07 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:41.833 00:05:07 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:41.833 00:05:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.833 00:05:07 -- common/autotest_common.sh@10 -- # set +x 00:06:41.833 ************************************ 00:06:41.833 START TEST accel_dif_generate_copy 00:06:41.833 ************************************ 00:06:41.833 00:05:07 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:06:41.833 00:05:07 -- accel/accel.sh@16 -- # local accel_opc 00:06:41.833 00:05:07 -- accel/accel.sh@17 -- # local accel_module 00:06:41.833 00:05:07 -- accel/accel.sh@18 -- # 
accel_perf -t 1 -w dif_generate_copy 00:06:41.833 00:05:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:41.834 00:05:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.834 00:05:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.834 00:05:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.834 00:05:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.834 00:05:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.834 00:05:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.834 00:05:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.834 00:05:07 -- accel/accel.sh@42 -- # jq -r . 00:06:41.834 [2024-11-30 00:05:07.086156] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:41.834 [2024-11-30 00:05:07.086245] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2715186 ] 00:06:41.834 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.834 [2024-11-30 00:05:07.154254] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.834 [2024-11-30 00:05:07.222677] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.213 00:05:08 -- accel/accel.sh@18 -- # out=' 00:06:43.213 SPDK Configuration: 00:06:43.213 Core mask: 0x1 00:06:43.213 00:06:43.213 Accel Perf Configuration: 00:06:43.213 Workload Type: dif_generate_copy 00:06:43.213 Vector size: 4096 bytes 00:06:43.213 Transfer size: 4096 bytes 00:06:43.213 Vector count 1 00:06:43.213 Module: software 00:06:43.213 Queue depth: 32 00:06:43.213 Allocate depth: 32 00:06:43.213 # threads/core: 1 00:06:43.213 Run time: 1 seconds 00:06:43.213 Verify: No 00:06:43.213 00:06:43.213 Running for 1 seconds... 
00:06:43.213 00:06:43.213 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:43.213 ------------------------------------------------------------------------------------ 00:06:43.213 0,0 227872/s 904 MiB/s 0 0 00:06:43.213 ==================================================================================== 00:06:43.213 Total 227872/s 890 MiB/s 0 0' 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.213 00:05:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.213 00:05:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:43.213 00:05:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.213 00:05:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.213 00:05:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.213 00:05:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.213 00:05:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.213 00:05:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.213 00:05:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.213 00:05:08 -- accel/accel.sh@42 -- # jq -r . 00:06:43.213 [2024-11-30 00:05:08.399402] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:43.213 [2024-11-30 00:05:08.399456] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2715455 ] 00:06:43.213 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.213 [2024-11-30 00:05:08.462502] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.213 [2024-11-30 00:05:08.528687] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.213 00:05:08 -- accel/accel.sh@21 -- # val= 00:06:43.213 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.213 00:05:08 -- accel/accel.sh@21 -- # val= 00:06:43.213 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.213 00:05:08 -- accel/accel.sh@21 -- # val=0x1 00:06:43.213 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.213 00:05:08 -- accel/accel.sh@21 -- # val= 00:06:43.213 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.213 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.213 00:05:08 -- accel/accel.sh@21 -- # val= 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 
00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val= 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val=software 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@23 -- # accel_module=software 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val=32 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val=32 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val=1 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 
-- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val=No 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val= 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:43.214 00:05:08 -- accel/accel.sh@21 -- # val= 00:06:43.214 00:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:43.214 00:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:44.156 00:05:09 -- accel/accel.sh@21 -- # val= 00:06:44.156 00:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:44.156 00:05:09 -- accel/accel.sh@21 -- # val= 00:06:44.156 00:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:44.156 00:05:09 -- accel/accel.sh@21 -- # val= 00:06:44.156 00:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:44.156 00:05:09 -- accel/accel.sh@21 -- # val= 00:06:44.156 00:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:44.156 00:05:09 -- accel/accel.sh@21 -- # val= 00:06:44.156 00:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:44.156 00:05:09 -- accel/accel.sh@21 -- # val= 00:06:44.156 00:05:09 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:44.156 00:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:44.156 00:05:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:44.156 00:05:09 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:44.156 00:05:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:44.156 00:06:44.156 real 0m2.633s 00:06:44.156 user 0m2.387s 00:06:44.156 sys 0m0.243s 00:06:44.156 00:05:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:44.156 00:05:09 -- common/autotest_common.sh@10 -- # set +x 00:06:44.156 ************************************ 00:06:44.156 END TEST accel_dif_generate_copy 00:06:44.156 ************************************ 00:06:44.419 00:05:09 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:44.419 00:05:09 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:44.419 00:05:09 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:44.419 00:05:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:44.419 00:05:09 -- common/autotest_common.sh@10 -- # set +x 00:06:44.419 ************************************ 00:06:44.419 START TEST accel_comp 00:06:44.419 ************************************ 00:06:44.419 00:05:09 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:44.419 00:05:09 -- accel/accel.sh@16 -- # local accel_opc 00:06:44.419 00:05:09 -- accel/accel.sh@17 -- # local accel_module 00:06:44.419 00:05:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:44.419 00:05:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:44.419 00:05:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.419 00:05:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.419 00:05:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.419 00:05:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.419 00:05:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.419 00:05:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.419 00:05:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.419 00:05:09 -- accel/accel.sh@42 -- # jq -r . 00:06:44.419 [2024-11-30 00:05:09.760634] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:44.419 [2024-11-30 00:05:09.760723] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2715738 ] 00:06:44.419 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.419 [2024-11-30 00:05:09.831335] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.419 [2024-11-30 00:05:09.898879] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.797 00:05:11 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:45.797 00:06:45.797 SPDK Configuration: 00:06:45.797 Core mask: 0x1 00:06:45.797 00:06:45.797 Accel Perf Configuration: 00:06:45.797 Workload Type: compress 00:06:45.797 Transfer size: 4096 bytes 00:06:45.797 Vector count 1 00:06:45.797 Module: software 00:06:45.797 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.797 Queue depth: 32 00:06:45.797 Allocate depth: 32 00:06:45.797 # threads/core: 1 00:06:45.797 Run time: 1 seconds 00:06:45.797 Verify: No 00:06:45.797 00:06:45.797 Running for 1 seconds... 
00:06:45.797 00:06:45.797 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:45.797 ------------------------------------------------------------------------------------ 00:06:45.797 0,0 66144/s 275 MiB/s 0 0 00:06:45.797 ==================================================================================== 00:06:45.797 Total 66144/s 258 MiB/s 0 0' 00:06:45.797 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.797 00:05:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.797 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.797 00:05:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.797 00:05:11 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.797 00:05:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.797 00:05:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.797 00:05:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.797 00:05:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.797 00:05:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.797 00:05:11 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.797 00:05:11 -- accel/accel.sh@42 -- # jq -r . 00:06:45.797 [2024-11-30 00:05:11.080531] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
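As a quick sanity check on the compress summary above (my own arithmetic, not part of the log): the Total bandwidth accel_perf reports works out to transfers/s multiplied by the configured transfer size.

```shell
# Total line above: "66144/s 258 MiB/s"; config: "Transfer size: 4096 bytes".
# 66144 transfers/s * 4096 bytes, converted to MiB/s (integer division):
echo $(( 66144 * 4096 / 1048576 ))   # 258, matching the reported Total
```

Note that only the Total line lines up with this formula; the per-core row prints a slightly different figure (275 MiB/s).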
00:06:45.797 [2024-11-30 00:05:11.080585] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2715935 ] 00:06:45.797 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.797 [2024-11-30 00:05:11.145673] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.797 [2024-11-30 00:05:11.213394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.797 00:05:11 -- accel/accel.sh@21 -- # val= 00:06:45.797 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.797 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.797 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.797 00:05:11 -- accel/accel.sh@21 -- # val= 00:06:45.797 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.797 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.797 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.797 00:05:11 -- accel/accel.sh@21 -- # val= 00:06:45.797 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val=0x1 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val= 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val= 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 
-- # val=compress 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val= 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val=software 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@23 -- # accel_module=software 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val=32 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val=32 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val=1 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # 
IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val=No 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val= 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:45.798 00:05:11 -- accel/accel.sh@21 -- # val= 00:06:45.798 00:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:45.798 00:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:47.175 00:05:12 -- accel/accel.sh@21 -- # val= 00:06:47.175 00:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:47.175 00:05:12 -- accel/accel.sh@21 -- # val= 00:06:47.175 00:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:47.175 00:05:12 -- accel/accel.sh@21 -- # val= 00:06:47.175 00:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:47.175 00:05:12 -- accel/accel.sh@21 -- # val= 00:06:47.175 00:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:47.175 00:05:12 -- accel/accel.sh@21 
-- # val= 00:06:47.175 00:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:47.175 00:05:12 -- accel/accel.sh@21 -- # val= 00:06:47.175 00:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:47.175 00:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:47.175 00:05:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:47.175 00:05:12 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:06:47.175 00:05:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.175 00:06:47.175 real 0m2.648s 00:06:47.175 user 0m2.404s 00:06:47.175 sys 0m0.242s 00:06:47.175 00:05:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:47.175 00:05:12 -- common/autotest_common.sh@10 -- # set +x 00:06:47.175 ************************************ 00:06:47.175 END TEST accel_comp 00:06:47.175 ************************************ 00:06:47.175 00:05:12 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:47.175 00:05:12 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:47.175 00:05:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:47.175 00:05:12 -- common/autotest_common.sh@10 -- # set +x 00:06:47.175 ************************************ 00:06:47.175 START TEST accel_decomp 00:06:47.175 ************************************ 00:06:47.175 00:05:12 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:47.175 00:05:12 -- accel/accel.sh@16 -- # local accel_opc 00:06:47.175 00:05:12 -- accel/accel.sh@17 -- # local accel_module 00:06:47.175 00:05:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:47.175 
00:05:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:47.175 00:05:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.175 00:05:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.175 00:05:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.175 00:05:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.175 00:05:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.175 00:05:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.175 00:05:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.175 00:05:12 -- accel/accel.sh@42 -- # jq -r . 00:06:47.175 [2024-11-30 00:05:12.445702] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:47.175 [2024-11-30 00:05:12.445803] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2716125 ] 00:06:47.175 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.175 [2024-11-30 00:05:12.513814] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.175 [2024-11-30 00:05:12.582678] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.555 00:05:13 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:48.555 00:06:48.555 SPDK Configuration: 00:06:48.555 Core mask: 0x1 00:06:48.555 00:06:48.555 Accel Perf Configuration: 00:06:48.555 Workload Type: decompress 00:06:48.555 Transfer size: 4096 bytes 00:06:48.555 Vector count 1 00:06:48.555 Module: software 00:06:48.555 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:48.555 Queue depth: 32 00:06:48.555 Allocate depth: 32 00:06:48.555 # threads/core: 1 00:06:48.555 Run time: 1 seconds 00:06:48.555 Verify: Yes 00:06:48.555 00:06:48.555 Running for 1 seconds... 00:06:48.555 00:06:48.555 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:48.555 ------------------------------------------------------------------------------------ 00:06:48.555 0,0 90976/s 167 MiB/s 0 0 00:06:48.555 ==================================================================================== 00:06:48.555 Total 90976/s 355 MiB/s 0 0' 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.555 00:05:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.555 00:05:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:48.555 00:05:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.555 00:05:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.555 00:05:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.555 00:05:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.555 00:05:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.555 00:05:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.555 00:05:13 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.555 00:05:13 -- accel/accel.sh@42 -- # jq -r . 
00:06:48.555 [2024-11-30 00:05:13.762806] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:48.555 [2024-11-30 00:05:13.762860] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2716313 ] 00:06:48.555 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.555 [2024-11-30 00:05:13.828322] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.555 [2024-11-30 00:05:13.899172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.555 00:05:13 -- accel/accel.sh@21 -- # val= 00:06:48.555 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.555 00:05:13 -- accel/accel.sh@21 -- # val= 00:06:48.555 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.555 00:05:13 -- accel/accel.sh@21 -- # val= 00:06:48.555 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.555 00:05:13 -- accel/accel.sh@21 -- # val=0x1 00:06:48.555 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.555 00:05:13 -- accel/accel.sh@21 -- # val= 00:06:48.555 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.555 00:05:13 -- accel/accel.sh@21 -- # val= 00:06:48.555 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.555 00:05:13 -- 
accel/accel.sh@20 -- # IFS=: 00:06:48.555 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.555 00:05:13 -- accel/accel.sh@21 -- # val=decompress 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val= 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val=software 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@23 -- # accel_module=software 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val=32 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val=32 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- 
accel/accel.sh@21 -- # val=1 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val=Yes 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val= 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:48.556 00:05:13 -- accel/accel.sh@21 -- # val= 00:06:48.556 00:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:48.556 00:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:49.935 00:05:15 -- accel/accel.sh@21 -- # val= 00:06:49.935 00:05:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.935 00:05:15 -- accel/accel.sh@20 -- # IFS=: 00:06:49.935 00:05:15 -- accel/accel.sh@20 -- # read -r var val 00:06:49.935 00:05:15 -- accel/accel.sh@21 -- # val= 00:06:49.935 00:05:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.935 00:05:15 -- accel/accel.sh@20 -- # IFS=: 00:06:49.935 00:05:15 -- accel/accel.sh@20 -- # read -r var val 00:06:49.935 00:05:15 -- accel/accel.sh@21 -- # val= 00:06:49.935 00:05:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.935 00:05:15 -- accel/accel.sh@20 -- # IFS=: 00:06:49.935 00:05:15 -- accel/accel.sh@20 -- # read -r var val 00:06:49.935 00:05:15 -- accel/accel.sh@21 -- # val= 00:06:49.935 00:05:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.935 00:05:15 
-- accel/accel.sh@20 -- # IFS=: 00:06:49.936 00:05:15 -- accel/accel.sh@20 -- # read -r var val 00:06:49.936 00:05:15 -- accel/accel.sh@21 -- # val= 00:06:49.936 00:05:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.936 00:05:15 -- accel/accel.sh@20 -- # IFS=: 00:06:49.936 00:05:15 -- accel/accel.sh@20 -- # read -r var val 00:06:49.936 00:05:15 -- accel/accel.sh@21 -- # val= 00:06:49.936 00:05:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.936 00:05:15 -- accel/accel.sh@20 -- # IFS=: 00:06:49.936 00:05:15 -- accel/accel.sh@20 -- # read -r var val 00:06:49.936 00:05:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:49.936 00:05:15 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:49.936 00:05:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.936 00:06:49.936 real 0m2.646s 00:06:49.936 user 0m2.405s 00:06:49.936 sys 0m0.240s 00:06:49.936 00:05:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:49.936 00:05:15 -- common/autotest_common.sh@10 -- # set +x 00:06:49.936 ************************************ 00:06:49.936 END TEST accel_decomp 00:06:49.936 ************************************ 00:06:49.936 00:05:15 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:49.936 00:05:15 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:49.936 00:05:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:49.936 00:05:15 -- common/autotest_common.sh@10 -- # set +x 00:06:49.936 ************************************ 00:06:49.936 START TEST accel_decmop_full 00:06:49.936 ************************************ 00:06:49.936 00:05:15 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:49.936 00:05:15 -- accel/accel.sh@16 -- # local accel_opc 00:06:49.936 00:05:15 -- accel/accel.sh@17 -- # local accel_module 
00:06:49.936 00:05:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:49.936 00:05:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:49.936 00:05:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.936 00:05:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.936 00:05:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.936 00:05:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.936 00:05:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.936 00:05:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.936 00:05:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.936 00:05:15 -- accel/accel.sh@42 -- # jq -r . 00:06:49.936 [2024-11-30 00:05:15.133297] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:49.936 [2024-11-30 00:05:15.133393] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2716599 ] 00:06:49.936 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.936 [2024-11-30 00:05:15.203593] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.936 [2024-11-30 00:05:15.271565] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.323 00:05:16 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:51.323 00:06:51.323 SPDK Configuration: 00:06:51.323 Core mask: 0x1 00:06:51.323 00:06:51.323 Accel Perf Configuration: 00:06:51.323 Workload Type: decompress 00:06:51.323 Transfer size: 111250 bytes 00:06:51.323 Vector count 1 00:06:51.323 Module: software 00:06:51.323 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:51.323 Queue depth: 32 00:06:51.323 Allocate depth: 32 00:06:51.323 # threads/core: 1 00:06:51.323 Run time: 1 seconds 00:06:51.323 Verify: Yes 00:06:51.323 00:06:51.323 Running for 1 seconds... 00:06:51.323 00:06:51.323 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:51.323 ------------------------------------------------------------------------------------ 00:06:51.323 0,0 5856/s 241 MiB/s 0 0 00:06:51.323 ==================================================================================== 00:06:51.323 Total 5856/s 621 MiB/s 0 0' 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:51.323 00:05:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.323 00:05:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.323 00:05:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.323 00:05:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.323 00:05:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.323 00:05:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.323 00:05:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.323 00:05:16 -- accel/accel.sh@42 -- # jq -r . 
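The fixed-width summary rows (e.g. `0,0 5856/s 241 MiB/s 0 0` in the table above) are easy to post-process. A minimal awk sketch, assuming the exact column layout shown in this log (this helper is not part of SPDK):

```shell
# Hypothetical one-liner: pull core, thread, and transfers/s out of an
# accel_perf per-core summary row of the form
#   "core,thread transfers/s N MiB/s failed miscompares"
echo "0,0 5856/s 241 MiB/s 0 0" |
  awk '{ split($1, ct, ","); gsub("/s", "", $2);
         printf "core=%s thread=%s transfers_per_s=%s\n", ct[1], ct[2], $2 }'
```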
00:06:51.323 [2024-11-30 00:05:16.459885] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:51.323 [2024-11-30 00:05:16.459938] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2716877 ] 00:06:51.323 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.323 [2024-11-30 00:05:16.523055] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.323 [2024-11-30 00:05:16.594042] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val= 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val= 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val= 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val=0x1 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val= 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val= 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- 
accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val=decompress 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val= 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val=software 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@23 -- # accel_module=software 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val=32 00:06:51.323 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.323 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.323 00:05:16 -- accel/accel.sh@21 -- # val=32 00:06:51.324 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.324 00:05:16 -- 
accel/accel.sh@21 -- # val=1 00:06:51.324 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.324 00:05:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:51.324 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.324 00:05:16 -- accel/accel.sh@21 -- # val=Yes 00:06:51.324 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.324 00:05:16 -- accel/accel.sh@21 -- # val= 00:06:51.324 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:51.324 00:05:16 -- accel/accel.sh@21 -- # val= 00:06:51.324 00:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:51.324 00:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:52.261 00:05:17 -- accel/accel.sh@21 -- # val= 00:06:52.261 00:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:52.261 00:05:17 -- accel/accel.sh@21 -- # val= 00:06:52.261 00:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:52.261 00:05:17 -- accel/accel.sh@21 -- # val= 00:06:52.261 00:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:52.261 00:05:17 -- accel/accel.sh@21 -- # val= 00:06:52.261 00:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.261 00:05:17 
-- accel/accel.sh@20 -- # IFS=: 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:52.261 00:05:17 -- accel/accel.sh@21 -- # val= 00:06:52.261 00:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:52.261 00:05:17 -- accel/accel.sh@21 -- # val= 00:06:52.261 00:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:52.261 00:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:52.261 00:05:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:52.261 00:05:17 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:52.261 00:05:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:52.261 00:06:52.261 real 0m2.665s 00:06:52.261 user 0m2.421s 00:06:52.261 sys 0m0.240s 00:06:52.261 00:05:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:52.261 00:05:17 -- common/autotest_common.sh@10 -- # set +x 00:06:52.261 ************************************ 00:06:52.261 END TEST accel_decmop_full 00:06:52.261 ************************************ 00:06:52.261 00:05:17 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:52.261 00:05:17 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:52.261 00:05:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:52.261 00:05:17 -- common/autotest_common.sh@10 -- # set +x 00:06:52.519 ************************************ 00:06:52.520 START TEST accel_decomp_mcore 00:06:52.520 ************************************ 00:06:52.520 00:05:17 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:52.520 00:05:17 -- accel/accel.sh@16 -- # local accel_opc 00:06:52.520 00:05:17 -- accel/accel.sh@17 -- # local 
accel_module 00:06:52.520 00:05:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:52.520 00:05:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:52.520 00:05:17 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.520 00:05:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.520 00:05:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.520 00:05:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.520 00:05:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.520 00:05:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.520 00:05:17 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.520 00:05:17 -- accel/accel.sh@42 -- # jq -r . 00:06:52.520 [2024-11-30 00:05:17.839765] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:52.520 [2024-11-30 00:05:17.839851] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2717160 ] 00:06:52.520 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.520 [2024-11-30 00:05:17.910990] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:52.520 [2024-11-30 00:05:17.982055] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:52.520 [2024-11-30 00:05:17.982150] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:52.520 [2024-11-30 00:05:17.982231] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:52.520 [2024-11-30 00:05:17.982235] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.900 00:05:19 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:53.900 00:06:53.900 SPDK Configuration: 00:06:53.900 Core mask: 0xf 00:06:53.900 00:06:53.900 Accel Perf Configuration: 00:06:53.900 Workload Type: decompress 00:06:53.900 Transfer size: 4096 bytes 00:06:53.900 Vector count 1 00:06:53.900 Module: software 00:06:53.900 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:53.900 Queue depth: 32 00:06:53.900 Allocate depth: 32 00:06:53.900 # threads/core: 1 00:06:53.900 Run time: 1 seconds 00:06:53.900 Verify: Yes 00:06:53.900 00:06:53.900 Running for 1 seconds... 00:06:53.900 00:06:53.900 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:53.900 ------------------------------------------------------------------------------------ 00:06:53.900 0,0 74720/s 137 MiB/s 0 0 00:06:53.900 3,0 75168/s 138 MiB/s 0 0 00:06:53.900 2,0 75264/s 138 MiB/s 0 0 00:06:53.900 1,0 75456/s 139 MiB/s 0 0 00:06:53.900 ==================================================================================== 00:06:53.900 Total 300608/s 1174 MiB/s 0 0' 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:53.900 00:05:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:53.900 00:05:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.900 00:05:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.900 00:05:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.900 00:05:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.900 00:05:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.900 00:05:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.900 00:05:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.900 
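The Total row of the accel_decomp_mcore table above can be sanity-checked from the per-core rows: the four transfer rates sum to the reported total, and bandwidth follows from the 4096-byte transfer size. A minimal sketch — the rates are copied from the log; the MiB rounding convention is an assumption:

```python
# Per-core transfers/s for cores 0-3, copied from the results table above.
per_core = [74720, 75168, 75264, 75456]

total = sum(per_core)                     # aggregate transfers/s
bandwidth_mib = total * 4096 / (1 << 20)  # 4096-byte transfers -> MiB/s

print(total)                 # 300608, matching the Total row
print(round(bandwidth_mib))  # 1174
```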
00:05:19 -- accel/accel.sh@42 -- # jq -r . 00:06:53.900 [2024-11-30 00:05:19.179991] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:53.900 [2024-11-30 00:05:19.180087] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2717429 ] 00:06:53.900 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.900 [2024-11-30 00:05:19.247552] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:53.900 [2024-11-30 00:05:19.316963] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:53.900 [2024-11-30 00:05:19.317057] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:53.900 [2024-11-30 00:05:19.317144] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:53.900 [2024-11-30 00:05:19.317147] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val= 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val= 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val= 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val=0xf 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- 
accel/accel.sh@21 -- # val= 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val= 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val=decompress 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val= 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val=software 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@23 -- # accel_module=software 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val=32 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- 
accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.900 00:05:19 -- accel/accel.sh@21 -- # val=32 00:06:53.900 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.900 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.901 00:05:19 -- accel/accel.sh@21 -- # val=1 00:06:53.901 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.901 00:05:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:53.901 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.901 00:05:19 -- accel/accel.sh@21 -- # val=Yes 00:06:53.901 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.901 00:05:19 -- accel/accel.sh@21 -- # val= 00:06:53.901 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:53.901 00:05:19 -- accel/accel.sh@21 -- # val= 00:06:53.901 00:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # IFS=: 00:06:53.901 00:05:19 -- accel/accel.sh@20 -- # read -r var val 00:06:55.282 00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.282 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.282 00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.282 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.282 
00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.282 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.282 00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.282 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.282 00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.282 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.282 00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.282 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.282 00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.282 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.282 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.283 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.283 00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.283 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.283 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.283 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.283 00:05:20 -- accel/accel.sh@21 -- # val= 00:06:55.283 00:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.283 00:05:20 -- accel/accel.sh@20 -- # IFS=: 00:06:55.283 00:05:20 -- accel/accel.sh@20 -- # read -r var val 00:06:55.283 00:05:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:55.283 00:05:20 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:55.283 00:05:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.283 00:06:55.283 real 0m2.688s 00:06:55.283 user 0m9.076s 00:06:55.283 sys 0m0.274s 00:06:55.283 00:05:20 -- common/autotest_common.sh@1115 -- # 
xtrace_disable 00:06:55.283 00:05:20 -- common/autotest_common.sh@10 -- # set +x 00:06:55.283 ************************************ 00:06:55.283 END TEST accel_decomp_mcore 00:06:55.283 ************************************ 00:06:55.283 00:05:20 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:55.283 00:05:20 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:55.283 00:05:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.283 00:05:20 -- common/autotest_common.sh@10 -- # set +x 00:06:55.283 ************************************ 00:06:55.283 START TEST accel_decomp_full_mcore 00:06:55.283 ************************************ 00:06:55.283 00:05:20 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:55.283 00:05:20 -- accel/accel.sh@16 -- # local accel_opc 00:06:55.283 00:05:20 -- accel/accel.sh@17 -- # local accel_module 00:06:55.283 00:05:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:55.283 00:05:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:55.283 00:05:20 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.283 00:05:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.283 00:05:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.283 00:05:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.283 00:05:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.283 00:05:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.283 00:05:20 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.283 00:05:20 -- accel/accel.sh@42 -- # jq -r . 
00:06:55.283 [2024-11-30 00:05:20.574575] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:55.283 [2024-11-30 00:05:20.574676] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2717720 ] 00:06:55.283 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.283 [2024-11-30 00:05:20.643405] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:55.283 [2024-11-30 00:05:20.713621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:55.283 [2024-11-30 00:05:20.713678] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:55.283 [2024-11-30 00:05:20.713764] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:55.283 [2024-11-30 00:05:20.713766] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.660 00:05:21 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:56.660 00:06:56.660 SPDK Configuration: 00:06:56.660 Core mask: 0xf 00:06:56.660 00:06:56.660 Accel Perf Configuration: 00:06:56.660 Workload Type: decompress 00:06:56.660 Transfer size: 111250 bytes 00:06:56.660 Vector count 1 00:06:56.660 Module: software 00:06:56.660 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:56.660 Queue depth: 32 00:06:56.660 Allocate depth: 32 00:06:56.661 # threads/core: 1 00:06:56.661 Run time: 1 seconds 00:06:56.661 Verify: Yes 00:06:56.661 00:06:56.661 Running for 1 seconds... 
00:06:56.661 00:06:56.661 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:56.661 ------------------------------------------------------------------------------------ 00:06:56.661 0,0 5792/s 239 MiB/s 0 0 00:06:56.661 3,0 5824/s 240 MiB/s 0 0 00:06:56.661 2,0 5824/s 240 MiB/s 0 0 00:06:56.661 1,0 5824/s 240 MiB/s 0 0 00:06:56.661 ==================================================================================== 00:06:56.661 Total 23264/s 2468 MiB/s 0 0' 00:06:56.661 00:05:21 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:21 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:56.661 00:05:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:56.661 00:05:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.661 00:05:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.661 00:05:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.661 00:05:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.661 00:05:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.661 00:05:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.661 00:05:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.661 00:05:21 -- accel/accel.sh@42 -- # jq -r . 00:06:56.661 [2024-11-30 00:05:21.923637] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
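The accel_decomp_full_mcore totals above check out the same way, the difference being the transfer size: with `-o 0` each operation moves a full 111250-byte buffer rather than a 4096-byte one (sizes copied from the SPDK Configuration block; this is a hedged verification sketch, not harness code):

```python
# Per-core transfers/s for cores 0-3, copied from the table above.
per_core = [5792, 5824, 5824, 5824]

total = sum(per_core)                       # aggregate transfers/s
bandwidth_mib = total * 111250 / (1 << 20)  # 111250-byte transfers -> MiB/s

print(total)                 # 23264, matching the Total row
print(round(bandwidth_mib))  # 2468
```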
00:06:56.661 [2024-11-30 00:05:21.923714] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2717902 ] 00:06:56.661 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.661 [2024-11-30 00:05:21.993871] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:56.661 [2024-11-30 00:05:22.064607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:56.661 [2024-11-30 00:05:22.064693] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:56.661 [2024-11-30 00:05:22.064713] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:56.661 [2024-11-30 00:05:22.064715] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val= 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val= 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val= 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val=0xf 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val= 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 
-- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val= 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val=decompress 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val= 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val=software 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@23 -- # accel_module=software 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val=32 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val=32 00:06:56.661 00:05:22 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val=1 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val=Yes 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val= 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:56.661 00:05:22 -- accel/accel.sh@21 -- # val= 00:06:56.661 00:05:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # IFS=: 00:06:56.661 00:05:22 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 
00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@21 -- # val= 00:06:58.045 00:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # IFS=: 00:06:58.045 00:05:23 -- accel/accel.sh@20 -- # read -r var val 00:06:58.045 00:05:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:58.045 00:05:23 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:58.045 00:05:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.045 00:06:58.045 real 0m2.707s 00:06:58.045 user 0m9.131s 00:06:58.045 sys 0m0.287s 00:06:58.045 00:05:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:58.045 00:05:23 -- common/autotest_common.sh@10 -- # set +x 00:06:58.045 ************************************ 00:06:58.045 END TEST 
accel_decomp_full_mcore 00:06:58.045 ************************************ 00:06:58.045 00:05:23 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:58.045 00:05:23 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:58.045 00:05:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.045 00:05:23 -- common/autotest_common.sh@10 -- # set +x 00:06:58.045 ************************************ 00:06:58.045 START TEST accel_decomp_mthread 00:06:58.045 ************************************ 00:06:58.045 00:05:23 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:58.045 00:05:23 -- accel/accel.sh@16 -- # local accel_opc 00:06:58.045 00:05:23 -- accel/accel.sh@17 -- # local accel_module 00:06:58.045 00:05:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:58.045 00:05:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:58.045 00:05:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.045 00:05:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.045 00:05:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.045 00:05:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.045 00:05:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.045 00:05:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.045 00:05:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.045 00:05:23 -- accel/accel.sh@42 -- # jq -r . 00:06:58.045 [2024-11-30 00:05:23.331240] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:58.045 [2024-11-30 00:05:23.331323] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718126 ] 00:06:58.045 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.045 [2024-11-30 00:05:23.400086] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.045 [2024-11-30 00:05:23.468655] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.425 00:05:24 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:59.425 00:06:59.425 SPDK Configuration: 00:06:59.425 Core mask: 0x1 00:06:59.425 00:06:59.425 Accel Perf Configuration: 00:06:59.425 Workload Type: decompress 00:06:59.425 Transfer size: 4096 bytes 00:06:59.425 Vector count 1 00:06:59.425 Module: software 00:06:59.425 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:59.425 Queue depth: 32 00:06:59.425 Allocate depth: 32 00:06:59.425 # threads/core: 2 00:06:59.425 Run time: 1 seconds 00:06:59.425 Verify: Yes 00:06:59.425 00:06:59.425 Running for 1 seconds... 
00:06:59.425 00:06:59.425 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:59.425 ------------------------------------------------------------------------------------ 00:06:59.425 0,1 46208/s 85 MiB/s 0 0 00:06:59.425 0,0 46048/s 84 MiB/s 0 0 00:06:59.425 ==================================================================================== 00:06:59.425 Total 92256/s 360 MiB/s 0 0' 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.425 00:05:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.425 00:05:24 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.425 00:05:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.425 00:05:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.425 00:05:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.425 00:05:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.425 00:05:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.425 00:05:24 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.425 00:05:24 -- accel/accel.sh@42 -- # jq -r . 00:06:59.425 [2024-11-30 00:05:24.663710] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
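The accel_decomp_mthread table above differs from the mcore runs in that both rows belong to core 0 — `-T 2` runs two threads on a single core, shown as Core,Thread pairs 0,0 and 0,1. The aggregation is per-thread rather than per-core (rates copied from the log; a hedged check, rounding convention assumed):

```python
# transfers/s keyed by (core, thread), copied from the rows above.
per_thread = {(0, 1): 46208, (0, 0): 46048}

total = sum(per_thread.values())          # both threads run on core 0
bandwidth_mib = total * 4096 / (1 << 20)  # 4096-byte transfers -> MiB/s

print(total)                 # 92256, matching the Total row
print(round(bandwidth_mib))  # 360
```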
00:06:59.425 [2024-11-30 00:05:24.663789] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718308 ] 00:06:59.425 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.425 [2024-11-30 00:05:24.733608] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.425 [2024-11-30 00:05:24.800788] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val= 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val= 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val= 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val=0x1 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val= 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val= 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 
-- # val=decompress 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val= 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val=software 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@23 -- # accel_module=software 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val=32 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val=32 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val=2 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # 
IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val=Yes 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val= 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:06:59.425 00:05:24 -- accel/accel.sh@21 -- # val= 00:06:59.425 00:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # IFS=: 00:06:59.425 00:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:00.801 00:05:25 -- accel/accel.sh@21 -- # val= 00:07:00.801 00:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:00.801 00:05:25 -- accel/accel.sh@21 -- # val= 00:07:00.801 00:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:00.801 00:05:25 -- accel/accel.sh@21 -- # val= 00:07:00.801 00:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:00.801 00:05:25 -- accel/accel.sh@21 -- # val= 00:07:00.801 00:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:00.801 00:05:25 -- accel/accel.sh@21 
-- # val= 00:07:00.801 00:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:00.801 00:05:25 -- accel/accel.sh@21 -- # val= 00:07:00.801 00:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:00.801 00:05:25 -- accel/accel.sh@21 -- # val= 00:07:00.801 00:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:00.801 00:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:00.801 00:05:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:00.801 00:05:25 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:00.801 00:05:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.801 00:07:00.801 real 0m2.670s 00:07:00.801 user 0m2.421s 00:07:00.801 sys 0m0.257s 00:07:00.801 00:05:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:00.801 00:05:25 -- common/autotest_common.sh@10 -- # set +x 00:07:00.801 ************************************ 00:07:00.801 END TEST accel_decomp_mthread 00:07:00.801 ************************************ 00:07:00.801 00:05:26 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:00.801 00:05:26 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:00.801 00:05:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:00.801 00:05:26 -- common/autotest_common.sh@10 -- # set +x 00:07:00.801 ************************************ 00:07:00.801 START TEST accel_deomp_full_mthread 00:07:00.801 ************************************ 00:07:00.801 00:05:26 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 
00:07:00.801 00:05:26 -- accel/accel.sh@16 -- # local accel_opc 00:07:00.801 00:05:26 -- accel/accel.sh@17 -- # local accel_module 00:07:00.801 00:05:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:00.801 00:05:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:00.801 00:05:26 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.801 00:05:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.801 00:05:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.801 00:05:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.801 00:05:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.801 00:05:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.801 00:05:26 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.801 00:05:26 -- accel/accel.sh@42 -- # jq -r . 00:07:00.801 [2024-11-30 00:05:26.050410] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:00.801 [2024-11-30 00:05:26.050498] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718589 ] 00:07:00.801 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.801 [2024-11-30 00:05:26.122281] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.801 [2024-11-30 00:05:26.189920] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.182 00:05:27 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:02.182 00:07:02.182 SPDK Configuration: 00:07:02.182 Core mask: 0x1 00:07:02.182 00:07:02.182 Accel Perf Configuration: 00:07:02.182 Workload Type: decompress 00:07:02.182 Transfer size: 111250 bytes 00:07:02.182 Vector count 1 00:07:02.182 Module: software 00:07:02.182 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:02.182 Queue depth: 32 00:07:02.182 Allocate depth: 32 00:07:02.182 # threads/core: 2 00:07:02.182 Run time: 1 seconds 00:07:02.182 Verify: Yes 00:07:02.182 00:07:02.182 Running for 1 seconds... 00:07:02.182 00:07:02.182 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:02.182 ------------------------------------------------------------------------------------ 00:07:02.182 0,1 2976/s 122 MiB/s 0 0 00:07:02.182 0,0 2944/s 121 MiB/s 0 0 00:07:02.182 ==================================================================================== 00:07:02.183 Total 5920/s 628 MiB/s 0 0' 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:02.183 00:05:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:02.183 00:05:27 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.183 00:05:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.183 00:05:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.183 00:05:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.183 00:05:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.183 00:05:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.183 00:05:27 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.183 00:05:27 -- accel/accel.sh@42 -- # jq -r . 
00:07:02.183 [2024-11-30 00:05:27.401739] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:02.183 [2024-11-30 00:05:27.401839] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718857 ] 00:07:02.183 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.183 [2024-11-30 00:05:27.469698] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.183 [2024-11-30 00:05:27.535561] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val= 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val= 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val= 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val=0x1 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val= 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val= 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- 
accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val=decompress 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val= 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val=software 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@23 -- # accel_module=software 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val=32 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val=32 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- 
accel/accel.sh@21 -- # val=2 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val=Yes 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val= 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:02.183 00:05:27 -- accel/accel.sh@21 -- # val= 00:07:02.183 00:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:02.183 00:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:03.561 00:05:28 -- accel/accel.sh@21 -- # val= 00:07:03.561 00:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:03.561 00:05:28 -- accel/accel.sh@21 -- # val= 00:07:03.561 00:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:03.561 00:05:28 -- accel/accel.sh@21 -- # val= 00:07:03.561 00:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:03.561 00:05:28 -- accel/accel.sh@21 -- # val= 00:07:03.561 00:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.561 00:05:28 
-- accel/accel.sh@20 -- # IFS=: 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:03.561 00:05:28 -- accel/accel.sh@21 -- # val= 00:07:03.561 00:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:03.561 00:05:28 -- accel/accel.sh@21 -- # val= 00:07:03.561 00:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:03.561 00:05:28 -- accel/accel.sh@21 -- # val= 00:07:03.561 00:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:03.561 00:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:03.561 00:05:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:03.561 00:05:28 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:03.561 00:05:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:03.561 00:07:03.561 real 0m2.707s 00:07:03.561 user 0m2.455s 00:07:03.561 sys 0m0.260s 00:07:03.561 00:05:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:03.561 00:05:28 -- common/autotest_common.sh@10 -- # set +x 00:07:03.561 ************************************ 00:07:03.561 END TEST accel_deomp_full_mthread 00:07:03.561 ************************************ 00:07:03.561 00:05:28 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:03.561 00:05:28 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:03.561 00:05:28 -- accel/accel.sh@129 -- # build_accel_config 00:07:03.562 00:05:28 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:03.562 00:05:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.562 00:05:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:03.562 00:05:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.562 00:05:28 
-- common/autotest_common.sh@10 -- # set +x 00:07:03.562 00:05:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.562 00:05:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.562 00:05:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.562 00:05:28 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.562 00:05:28 -- accel/accel.sh@42 -- # jq -r . 00:07:03.562 ************************************ 00:07:03.562 START TEST accel_dif_functional_tests 00:07:03.562 ************************************ 00:07:03.562 00:05:28 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:03.562 [2024-11-30 00:05:28.809286] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:03.562 [2024-11-30 00:05:28.809384] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2719147 ] 00:07:03.562 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.562 [2024-11-30 00:05:28.878465] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:03.562 [2024-11-30 00:05:28.949546] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:03.562 [2024-11-30 00:05:28.949643] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:03.562 [2024-11-30 00:05:28.949646] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.562 00:07:03.562 00:07:03.562 CUnit - A unit testing framework for C - Version 2.1-3 00:07:03.562 http://cunit.sourceforge.net/ 00:07:03.562 00:07:03.562 00:07:03.562 Suite: accel_dif 00:07:03.562 Test: verify: DIF generated, GUARD check ...passed 00:07:03.562 Test: verify: DIF generated, APPTAG check ...passed 00:07:03.562 Test: verify: DIF generated, REFTAG check ...passed 00:07:03.562 Test: verify: DIF not generated, GUARD check ...[2024-11-30 00:05:29.018664] 
dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:03.562 [2024-11-30 00:05:29.018717] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:03.562 passed 00:07:03.562 Test: verify: DIF not generated, APPTAG check ...[2024-11-30 00:05:29.018759] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:03.562 [2024-11-30 00:05:29.018780] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:03.562 passed 00:07:03.562 Test: verify: DIF not generated, REFTAG check ...[2024-11-30 00:05:29.018805] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:03.562 [2024-11-30 00:05:29.018826] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:03.562 passed 00:07:03.562 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:03.562 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-30 00:05:29.018887] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:03.562 passed 00:07:03.562 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:03.562 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:03.562 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:03.562 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-30 00:05:29.019017] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:03.562 passed 00:07:03.562 Test: generate copy: DIF generated, GUARD check ...passed 00:07:03.562 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:03.562 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:03.562 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:03.562 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 
00:07:03.562 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:03.562 Test: generate copy: iovecs-len validate ...[2024-11-30 00:05:29.019204] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:07:03.562 passed 00:07:03.562 Test: generate copy: buffer alignment validate ...passed 00:07:03.562 00:07:03.562 Run Summary: Type Total Ran Passed Failed Inactive 00:07:03.562 suites 1 1 n/a 0 0 00:07:03.562 tests 20 20 20 0 0 00:07:03.562 asserts 204 204 204 0 n/a 00:07:03.562 00:07:03.562 Elapsed time = 0.002 seconds 00:07:03.821 00:07:03.821 real 0m0.394s 00:07:03.821 user 0m0.584s 00:07:03.821 sys 0m0.164s 00:07:03.821 00:05:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:03.821 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:03.821 ************************************ 00:07:03.821 END TEST accel_dif_functional_tests 00:07:03.821 ************************************ 00:07:03.821 00:07:03.821 real 0m57.117s 00:07:03.821 user 1m4.783s 00:07:03.821 sys 0m7.039s 00:07:03.821 00:05:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:03.821 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:03.821 ************************************ 00:07:03.821 END TEST accel 00:07:03.821 ************************************ 00:07:03.821 00:05:29 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:03.821 00:05:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:03.821 00:05:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:03.821 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:03.821 ************************************ 00:07:03.821 START TEST accel_rpc 00:07:03.821 ************************************ 00:07:03.821 00:05:29 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 
00:07:03.821 * Looking for test storage... 00:07:03.821 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:03.821 00:05:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:03.821 00:05:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:04.081 00:05:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:04.081 00:05:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:04.081 00:05:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:04.081 00:05:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:04.081 00:05:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:04.081 00:05:29 -- scripts/common.sh@335 -- # IFS=.-: 00:07:04.081 00:05:29 -- scripts/common.sh@335 -- # read -ra ver1 00:07:04.081 00:05:29 -- scripts/common.sh@336 -- # IFS=.-: 00:07:04.081 00:05:29 -- scripts/common.sh@336 -- # read -ra ver2 00:07:04.081 00:05:29 -- scripts/common.sh@337 -- # local 'op=<' 00:07:04.081 00:05:29 -- scripts/common.sh@339 -- # ver1_l=2 00:07:04.081 00:05:29 -- scripts/common.sh@340 -- # ver2_l=1 00:07:04.081 00:05:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:04.081 00:05:29 -- scripts/common.sh@343 -- # case "$op" in 00:07:04.081 00:05:29 -- scripts/common.sh@344 -- # : 1 00:07:04.081 00:05:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:04.081 00:05:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:04.081 00:05:29 -- scripts/common.sh@364 -- # decimal 1 00:07:04.081 00:05:29 -- scripts/common.sh@352 -- # local d=1 00:07:04.081 00:05:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:04.081 00:05:29 -- scripts/common.sh@354 -- # echo 1 00:07:04.081 00:05:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:04.081 00:05:29 -- scripts/common.sh@365 -- # decimal 2 00:07:04.081 00:05:29 -- scripts/common.sh@352 -- # local d=2 00:07:04.081 00:05:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:04.081 00:05:29 -- scripts/common.sh@354 -- # echo 2 00:07:04.081 00:05:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:04.081 00:05:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:04.081 00:05:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:04.081 00:05:29 -- scripts/common.sh@367 -- # return 0 00:07:04.081 00:05:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:04.081 00:05:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:04.081 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.081 --rc genhtml_branch_coverage=1 00:07:04.081 --rc genhtml_function_coverage=1 00:07:04.081 --rc genhtml_legend=1 00:07:04.081 --rc geninfo_all_blocks=1 00:07:04.081 --rc geninfo_unexecuted_blocks=1 00:07:04.081 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.081 ' 00:07:04.081 00:05:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:04.081 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.081 --rc genhtml_branch_coverage=1 00:07:04.081 --rc genhtml_function_coverage=1 00:07:04.081 --rc genhtml_legend=1 00:07:04.081 --rc geninfo_all_blocks=1 00:07:04.081 --rc geninfo_unexecuted_blocks=1 00:07:04.081 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.081 ' 00:07:04.081 00:05:29 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:04.081 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.081 --rc genhtml_branch_coverage=1 00:07:04.081 --rc genhtml_function_coverage=1 00:07:04.081 --rc genhtml_legend=1 00:07:04.081 --rc geninfo_all_blocks=1 00:07:04.081 --rc geninfo_unexecuted_blocks=1 00:07:04.081 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.081 ' 00:07:04.081 00:05:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:04.081 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.081 --rc genhtml_branch_coverage=1 00:07:04.081 --rc genhtml_function_coverage=1 00:07:04.081 --rc genhtml_legend=1 00:07:04.081 --rc geninfo_all_blocks=1 00:07:04.081 --rc geninfo_unexecuted_blocks=1 00:07:04.081 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.081 ' 00:07:04.081 00:05:29 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:04.081 00:05:29 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2719297 00:07:04.081 00:05:29 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:04.081 00:05:29 -- accel/accel_rpc.sh@15 -- # waitforlisten 2719297 00:07:04.081 00:05:29 -- common/autotest_common.sh@829 -- # '[' -z 2719297 ']' 00:07:04.081 00:05:29 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:04.081 00:05:29 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:04.081 00:05:29 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:04.081 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:04.081 00:05:29 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:04.081 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:04.081 [2024-11-30 00:05:29.479930] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:04.081 [2024-11-30 00:05:29.480005] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2719297 ] 00:07:04.082 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.082 [2024-11-30 00:05:29.546896] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.082 [2024-11-30 00:05:29.620895] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:04.082 [2024-11-30 00:05:29.621041] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.340 00:05:29 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:04.340 00:05:29 -- common/autotest_common.sh@862 -- # return 0 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:04.340 00:05:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:04.340 00:05:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.340 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:04.340 ************************************ 00:07:04.340 START TEST accel_assign_opcode 00:07:04.340 ************************************ 00:07:04.340 00:05:29 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc 
-o copy -m incorrect 00:07:04.340 00:05:29 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:04.340 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:04.340 [2024-11-30 00:05:29.661484] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:04.340 00:05:29 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:04.340 00:05:29 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:04.340 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:04.340 [2024-11-30 00:05:29.669497] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:04.340 00:05:29 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:04.340 00:05:29 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:04.340 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:04.340 00:05:29 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:04.340 00:05:29 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:04.340 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:04.340 00:05:29 -- accel/accel_rpc.sh@42 -- # grep software 00:07:04.340 00:05:29 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:04.340 software 00:07:04.340 00:07:04.340 real 0m0.231s 00:07:04.340 user 0m0.038s 00:07:04.340 sys 0m0.011s 00:07:04.340 00:05:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:04.340 00:05:29 -- common/autotest_common.sh@10 -- # set +x 00:07:04.340 ************************************ 00:07:04.340 END TEST accel_assign_opcode 00:07:04.340 ************************************ 00:07:04.598 00:05:29 -- 
accel/accel_rpc.sh@55 -- # killprocess 2719297 00:07:04.598 00:05:29 -- common/autotest_common.sh@936 -- # '[' -z 2719297 ']' 00:07:04.598 00:05:29 -- common/autotest_common.sh@940 -- # kill -0 2719297 00:07:04.598 00:05:29 -- common/autotest_common.sh@941 -- # uname 00:07:04.598 00:05:29 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:04.598 00:05:29 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2719297 00:07:04.598 00:05:29 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:04.598 00:05:29 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:04.598 00:05:29 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2719297' 00:07:04.598 killing process with pid 2719297 00:07:04.598 00:05:29 -- common/autotest_common.sh@955 -- # kill 2719297 00:07:04.598 00:05:29 -- common/autotest_common.sh@960 -- # wait 2719297 00:07:04.858 00:07:04.858 real 0m1.029s 00:07:04.858 user 0m0.923s 00:07:04.858 sys 0m0.459s 00:07:04.858 00:05:30 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:04.858 00:05:30 -- common/autotest_common.sh@10 -- # set +x 00:07:04.858 ************************************ 00:07:04.858 END TEST accel_rpc 00:07:04.858 ************************************ 00:07:04.858 00:05:30 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:04.858 00:05:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:04.858 00:05:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.858 00:05:30 -- common/autotest_common.sh@10 -- # set +x 00:07:04.858 ************************************ 00:07:04.858 START TEST app_cmdline 00:07:04.858 ************************************ 00:07:04.858 00:05:30 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:05.117 * Looking for test storage... 
00:07:05.117 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:05.117 00:05:30 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:05.117 00:05:30 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:05.117 00:05:30 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:05.117 00:05:30 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:05.117 00:05:30 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:05.117 00:05:30 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:05.117 00:05:30 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:05.117 00:05:30 -- scripts/common.sh@335 -- # IFS=.-: 00:07:05.117 00:05:30 -- scripts/common.sh@335 -- # read -ra ver1 00:07:05.117 00:05:30 -- scripts/common.sh@336 -- # IFS=.-: 00:07:05.117 00:05:30 -- scripts/common.sh@336 -- # read -ra ver2 00:07:05.117 00:05:30 -- scripts/common.sh@337 -- # local 'op=<' 00:07:05.117 00:05:30 -- scripts/common.sh@339 -- # ver1_l=2 00:07:05.117 00:05:30 -- scripts/common.sh@340 -- # ver2_l=1 00:07:05.117 00:05:30 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:05.117 00:05:30 -- scripts/common.sh@343 -- # case "$op" in 00:07:05.117 00:05:30 -- scripts/common.sh@344 -- # : 1 00:07:05.117 00:05:30 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:05.117 00:05:30 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:05.117 00:05:30 -- scripts/common.sh@364 -- # decimal 1 00:07:05.117 00:05:30 -- scripts/common.sh@352 -- # local d=1 00:07:05.117 00:05:30 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:05.117 00:05:30 -- scripts/common.sh@354 -- # echo 1 00:07:05.117 00:05:30 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:05.117 00:05:30 -- scripts/common.sh@365 -- # decimal 2 00:07:05.117 00:05:30 -- scripts/common.sh@352 -- # local d=2 00:07:05.117 00:05:30 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:05.117 00:05:30 -- scripts/common.sh@354 -- # echo 2 00:07:05.117 00:05:30 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:05.117 00:05:30 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:05.117 00:05:30 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:05.117 00:05:30 -- scripts/common.sh@367 -- # return 0 00:07:05.117 00:05:30 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:05.117 00:05:30 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:05.117 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.117 --rc genhtml_branch_coverage=1 00:07:05.117 --rc genhtml_function_coverage=1 00:07:05.117 --rc genhtml_legend=1 00:07:05.117 --rc geninfo_all_blocks=1 00:07:05.117 --rc geninfo_unexecuted_blocks=1 00:07:05.117 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.117 ' 00:07:05.117 00:05:30 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:05.117 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.117 --rc genhtml_branch_coverage=1 00:07:05.117 --rc genhtml_function_coverage=1 00:07:05.117 --rc genhtml_legend=1 00:07:05.117 --rc geninfo_all_blocks=1 00:07:05.117 --rc geninfo_unexecuted_blocks=1 00:07:05.117 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.117 ' 00:07:05.117 00:05:30 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:05.117 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.117 --rc genhtml_branch_coverage=1 00:07:05.117 --rc genhtml_function_coverage=1 00:07:05.117 --rc genhtml_legend=1 00:07:05.117 --rc geninfo_all_blocks=1 00:07:05.117 --rc geninfo_unexecuted_blocks=1 00:07:05.117 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.117 ' 00:07:05.117 00:05:30 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:05.117 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.117 --rc genhtml_branch_coverage=1 00:07:05.117 --rc genhtml_function_coverage=1 00:07:05.117 --rc genhtml_legend=1 00:07:05.117 --rc geninfo_all_blocks=1 00:07:05.117 --rc geninfo_unexecuted_blocks=1 00:07:05.117 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.117 ' 00:07:05.117 00:05:30 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:05.117 00:05:30 -- app/cmdline.sh@17 -- # spdk_tgt_pid=2719557 00:07:05.117 00:05:30 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:05.117 00:05:30 -- app/cmdline.sh@18 -- # waitforlisten 2719557 00:07:05.117 00:05:30 -- common/autotest_common.sh@829 -- # '[' -z 2719557 ']' 00:07:05.117 00:05:30 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.117 00:05:30 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:05.117 00:05:30 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.117 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:05.117 00:05:30 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:05.117 00:05:30 -- common/autotest_common.sh@10 -- # set +x 00:07:05.117 [2024-11-30 00:05:30.553977] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:05.117 [2024-11-30 00:05:30.554052] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2719557 ] 00:07:05.117 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.117 [2024-11-30 00:05:30.613100] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.376 [2024-11-30 00:05:30.687718] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:05.376 [2024-11-30 00:05:30.687865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.944 00:05:31 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:05.944 00:05:31 -- common/autotest_common.sh@862 -- # return 0 00:07:05.944 00:05:31 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:06.202 { 00:07:06.202 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:07:06.202 "fields": { 00:07:06.202 "major": 24, 00:07:06.202 "minor": 1, 00:07:06.202 "patch": 1, 00:07:06.202 "suffix": "-pre", 00:07:06.202 "commit": "c13c99a5e" 00:07:06.202 } 00:07:06.202 } 00:07:06.202 00:05:31 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:06.202 00:05:31 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:06.202 00:05:31 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:06.202 00:05:31 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:06.202 00:05:31 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:06.202 00:05:31 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:06.202 
00:05:31 -- app/cmdline.sh@26 -- # sort 00:07:06.202 00:05:31 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:06.202 00:05:31 -- common/autotest_common.sh@10 -- # set +x 00:07:06.202 00:05:31 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:06.202 00:05:31 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:06.202 00:05:31 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:06.202 00:05:31 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:06.202 00:05:31 -- common/autotest_common.sh@650 -- # local es=0 00:07:06.202 00:05:31 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:06.202 00:05:31 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:06.202 00:05:31 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:06.202 00:05:31 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:06.202 00:05:31 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:06.202 00:05:31 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:06.202 00:05:31 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:06.203 00:05:31 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:06.203 00:05:31 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:06.203 00:05:31 -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:06.462 request: 00:07:06.462 { 00:07:06.462 "method": 
"env_dpdk_get_mem_stats", 00:07:06.462 "req_id": 1 00:07:06.462 } 00:07:06.462 Got JSON-RPC error response 00:07:06.462 response: 00:07:06.462 { 00:07:06.462 "code": -32601, 00:07:06.462 "message": "Method not found" 00:07:06.462 } 00:07:06.462 00:05:31 -- common/autotest_common.sh@653 -- # es=1 00:07:06.462 00:05:31 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:06.462 00:05:31 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:06.462 00:05:31 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:06.462 00:05:31 -- app/cmdline.sh@1 -- # killprocess 2719557 00:07:06.462 00:05:31 -- common/autotest_common.sh@936 -- # '[' -z 2719557 ']' 00:07:06.462 00:05:31 -- common/autotest_common.sh@940 -- # kill -0 2719557 00:07:06.462 00:05:31 -- common/autotest_common.sh@941 -- # uname 00:07:06.462 00:05:31 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:06.462 00:05:31 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2719557 00:07:06.462 00:05:31 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:06.462 00:05:31 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:06.462 00:05:31 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2719557' 00:07:06.462 killing process with pid 2719557 00:07:06.462 00:05:31 -- common/autotest_common.sh@955 -- # kill 2719557 00:07:06.462 00:05:31 -- common/autotest_common.sh@960 -- # wait 2719557 00:07:06.722 00:07:06.722 real 0m1.787s 00:07:06.722 user 0m2.105s 00:07:06.722 sys 0m0.473s 00:07:06.722 00:05:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:06.722 00:05:32 -- common/autotest_common.sh@10 -- # set +x 00:07:06.722 ************************************ 00:07:06.722 END TEST app_cmdline 00:07:06.722 ************************************ 00:07:06.722 00:05:32 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:06.722 00:05:32 -- 
common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:06.722 00:05:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:06.722 00:05:32 -- common/autotest_common.sh@10 -- # set +x 00:07:06.722 ************************************ 00:07:06.722 START TEST version 00:07:06.722 ************************************ 00:07:06.722 00:05:32 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:06.722 * Looking for test storage... 00:07:06.982 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:06.982 00:05:32 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:06.982 00:05:32 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:06.982 00:05:32 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:06.982 00:05:32 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:06.982 00:05:32 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:06.982 00:05:32 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:06.982 00:05:32 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:06.982 00:05:32 -- scripts/common.sh@335 -- # IFS=.-: 00:07:06.982 00:05:32 -- scripts/common.sh@335 -- # read -ra ver1 00:07:06.982 00:05:32 -- scripts/common.sh@336 -- # IFS=.-: 00:07:06.982 00:05:32 -- scripts/common.sh@336 -- # read -ra ver2 00:07:06.982 00:05:32 -- scripts/common.sh@337 -- # local 'op=<' 00:07:06.982 00:05:32 -- scripts/common.sh@339 -- # ver1_l=2 00:07:06.982 00:05:32 -- scripts/common.sh@340 -- # ver2_l=1 00:07:06.982 00:05:32 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:06.982 00:05:32 -- scripts/common.sh@343 -- # case "$op" in 00:07:06.982 00:05:32 -- scripts/common.sh@344 -- # : 1 00:07:06.982 00:05:32 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:06.982 00:05:32 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:06.982 00:05:32 -- scripts/common.sh@364 -- # decimal 1 00:07:06.982 00:05:32 -- scripts/common.sh@352 -- # local d=1 00:07:06.982 00:05:32 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:06.982 00:05:32 -- scripts/common.sh@354 -- # echo 1 00:07:06.982 00:05:32 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:06.982 00:05:32 -- scripts/common.sh@365 -- # decimal 2 00:07:06.982 00:05:32 -- scripts/common.sh@352 -- # local d=2 00:07:06.982 00:05:32 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:06.982 00:05:32 -- scripts/common.sh@354 -- # echo 2 00:07:06.982 00:05:32 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:06.982 00:05:32 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:06.982 00:05:32 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:06.982 00:05:32 -- scripts/common.sh@367 -- # return 0 00:07:06.982 00:05:32 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:06.982 00:05:32 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:06.982 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.982 --rc genhtml_branch_coverage=1 00:07:06.982 --rc genhtml_function_coverage=1 00:07:06.982 --rc genhtml_legend=1 00:07:06.982 --rc geninfo_all_blocks=1 00:07:06.982 --rc geninfo_unexecuted_blocks=1 00:07:06.982 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.982 ' 00:07:06.982 00:05:32 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:06.982 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.982 --rc genhtml_branch_coverage=1 00:07:06.982 --rc genhtml_function_coverage=1 00:07:06.982 --rc genhtml_legend=1 00:07:06.982 --rc geninfo_all_blocks=1 00:07:06.982 --rc geninfo_unexecuted_blocks=1 00:07:06.982 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.982 ' 00:07:06.982 00:05:32 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:06.982 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.982 --rc genhtml_branch_coverage=1 00:07:06.982 --rc genhtml_function_coverage=1 00:07:06.982 --rc genhtml_legend=1 00:07:06.982 --rc geninfo_all_blocks=1 00:07:06.982 --rc geninfo_unexecuted_blocks=1 00:07:06.982 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.982 ' 00:07:06.982 00:05:32 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:06.982 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.982 --rc genhtml_branch_coverage=1 00:07:06.982 --rc genhtml_function_coverage=1 00:07:06.982 --rc genhtml_legend=1 00:07:06.982 --rc geninfo_all_blocks=1 00:07:06.982 --rc geninfo_unexecuted_blocks=1 00:07:06.982 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.982 ' 00:07:06.982 00:05:32 -- app/version.sh@17 -- # get_header_version major 00:07:06.982 00:05:32 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:06.982 00:05:32 -- app/version.sh@14 -- # cut -f2 00:07:06.982 00:05:32 -- app/version.sh@14 -- # tr -d '"' 00:07:06.982 00:05:32 -- app/version.sh@17 -- # major=24 00:07:06.982 00:05:32 -- app/version.sh@18 -- # get_header_version minor 00:07:06.982 00:05:32 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:06.982 00:05:32 -- app/version.sh@14 -- # cut -f2 00:07:06.982 00:05:32 -- app/version.sh@14 -- # tr -d '"' 00:07:06.982 00:05:32 -- app/version.sh@18 -- # minor=1 00:07:06.982 00:05:32 -- app/version.sh@19 -- # get_header_version patch 00:07:06.982 00:05:32 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:06.982 00:05:32 -- app/version.sh@14 -- # cut -f2 00:07:06.982 00:05:32 -- app/version.sh@14 -- # tr -d '"' 00:07:06.982 00:05:32 -- app/version.sh@19 -- # patch=1 00:07:06.982 00:05:32 -- app/version.sh@20 -- # get_header_version suffix 00:07:06.982 00:05:32 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:06.982 00:05:32 -- app/version.sh@14 -- # cut -f2 00:07:06.982 00:05:32 -- app/version.sh@14 -- # tr -d '"' 00:07:06.982 00:05:32 -- app/version.sh@20 -- # suffix=-pre 00:07:06.982 00:05:32 -- app/version.sh@22 -- # version=24.1 00:07:06.982 00:05:32 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:06.982 00:05:32 -- app/version.sh@25 -- # version=24.1.1 00:07:06.982 00:05:32 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:06.982 00:05:32 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:06.982 00:05:32 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:06.982 00:05:32 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:06.982 00:05:32 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:06.982 00:07:06.982 real 0m0.259s 00:07:06.982 user 0m0.143s 00:07:06.982 sys 0m0.166s 00:07:06.982 00:05:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:06.982 00:05:32 -- common/autotest_common.sh@10 -- # set +x 00:07:06.982 ************************************ 00:07:06.982 END TEST version 00:07:06.982 ************************************ 00:07:06.982 00:05:32 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:06.982 
00:05:32 -- spdk/autotest.sh@191 -- # uname -s 00:07:06.982 00:05:32 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:06.982 00:05:32 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:06.982 00:05:32 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:06.982 00:05:32 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:06.982 00:05:32 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:06.982 00:05:32 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:06.982 00:05:32 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:06.982 00:05:32 -- common/autotest_common.sh@10 -- # set +x 00:07:06.982 00:05:32 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:06.982 00:05:32 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:06.982 00:05:32 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:06.982 00:05:32 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:06.982 00:05:32 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:06.983 00:05:32 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:06.983 00:05:32 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:06.983 00:05:32 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:06.983 00:05:32 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:06.983 00:05:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:06.983 00:05:32 -- common/autotest_common.sh@1093 -- 
# xtrace_disable 00:07:06.983 00:05:32 -- common/autotest_common.sh@10 -- # set +x 00:07:06.983 ************************************ 00:07:06.983 START TEST llvm_fuzz 00:07:06.983 ************************************ 00:07:06.983 00:05:32 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:07.252 * Looking for test storage... 00:07:07.253 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:07.253 00:05:32 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:07.253 00:05:32 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:07.253 00:05:32 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:07.253 00:05:32 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:07.253 00:05:32 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:07.253 00:05:32 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:07.253 00:05:32 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:07.253 00:05:32 -- scripts/common.sh@335 -- # IFS=.-: 00:07:07.253 00:05:32 -- scripts/common.sh@335 -- # read -ra ver1 00:07:07.253 00:05:32 -- scripts/common.sh@336 -- # IFS=.-: 00:07:07.253 00:05:32 -- scripts/common.sh@336 -- # read -ra ver2 00:07:07.253 00:05:32 -- scripts/common.sh@337 -- # local 'op=<' 00:07:07.253 00:05:32 -- scripts/common.sh@339 -- # ver1_l=2 00:07:07.253 00:05:32 -- scripts/common.sh@340 -- # ver2_l=1 00:07:07.253 00:05:32 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:07.253 00:05:32 -- scripts/common.sh@343 -- # case "$op" in 00:07:07.253 00:05:32 -- scripts/common.sh@344 -- # : 1 00:07:07.253 00:05:32 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:07.253 00:05:32 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:07.253 00:05:32 -- scripts/common.sh@364 -- # decimal 1 00:07:07.253 00:05:32 -- scripts/common.sh@352 -- # local d=1 00:07:07.253 00:05:32 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:07.253 00:05:32 -- scripts/common.sh@354 -- # echo 1 00:07:07.253 00:05:32 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:07.253 00:05:32 -- scripts/common.sh@365 -- # decimal 2 00:07:07.253 00:05:32 -- scripts/common.sh@352 -- # local d=2 00:07:07.254 00:05:32 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:07.254 00:05:32 -- scripts/common.sh@354 -- # echo 2 00:07:07.254 00:05:32 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:07.254 00:05:32 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:07.254 00:05:32 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:07.254 00:05:32 -- scripts/common.sh@367 -- # return 0 00:07:07.254 00:05:32 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:07.254 00:05:32 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:07.254 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.254 --rc genhtml_branch_coverage=1 00:07:07.254 --rc genhtml_function_coverage=1 00:07:07.254 --rc genhtml_legend=1 00:07:07.254 --rc geninfo_all_blocks=1 00:07:07.254 --rc geninfo_unexecuted_blocks=1 00:07:07.254 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.254 ' 00:07:07.254 00:05:32 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:07.254 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.254 --rc genhtml_branch_coverage=1 00:07:07.254 --rc genhtml_function_coverage=1 00:07:07.254 --rc genhtml_legend=1 00:07:07.254 --rc geninfo_all_blocks=1 00:07:07.254 --rc geninfo_unexecuted_blocks=1 00:07:07.254 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.254 ' 00:07:07.254 00:05:32 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:07.255 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.255 --rc genhtml_branch_coverage=1 00:07:07.255 --rc genhtml_function_coverage=1 00:07:07.255 --rc genhtml_legend=1 00:07:07.255 --rc geninfo_all_blocks=1 00:07:07.255 --rc geninfo_unexecuted_blocks=1 00:07:07.255 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.255 ' 00:07:07.255 00:05:32 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:07.255 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.255 --rc genhtml_branch_coverage=1 00:07:07.255 --rc genhtml_function_coverage=1 00:07:07.255 --rc genhtml_legend=1 00:07:07.255 --rc geninfo_all_blocks=1 00:07:07.255 --rc geninfo_unexecuted_blocks=1 00:07:07.255 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.255 ' 00:07:07.255 00:05:32 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:07.255 00:05:32 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:07.255 00:05:32 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:07.255 00:05:32 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:07.255 00:05:32 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:07.255 00:05:32 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:07.255 00:05:32 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:07.255 00:05:32 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:07.255 00:05:32 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:07.255 00:05:32 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:07.255 00:05:32 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:07.255 00:05:32 -- 
fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:07.255 00:05:32 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:07.255 00:05:32 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:07.255 00:05:32 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:07.255 00:05:32 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:07.255 00:05:32 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:07.256 00:05:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:07.256 00:05:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:07.256 00:05:32 -- common/autotest_common.sh@10 -- # set +x 00:07:07.256 ************************************ 00:07:07.256 START TEST nvmf_fuzz 00:07:07.256 ************************************ 00:07:07.256 00:05:32 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:07.256 * Looking for test storage... 00:07:07.522 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:07.522 00:05:32 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:07.522 00:05:32 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:07.522 00:05:32 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:07.522 00:05:32 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:07.522 00:05:32 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:07.522 00:05:32 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:07.522 00:05:32 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:07.522 00:05:32 -- scripts/common.sh@335 -- # IFS=.-: 00:07:07.522 00:05:32 -- scripts/common.sh@335 -- # read -ra ver1 00:07:07.522 00:05:32 -- scripts/common.sh@336 -- # IFS=.-: 00:07:07.522 00:05:32 -- scripts/common.sh@336 -- # read -ra ver2 00:07:07.522 00:05:32 -- scripts/common.sh@337 -- # local 'op=<' 00:07:07.522 00:05:32 -- scripts/common.sh@339 -- # 
ver1_l=2 00:07:07.522 00:05:32 -- scripts/common.sh@340 -- # ver2_l=1 00:07:07.522 00:05:32 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:07.522 00:05:32 -- scripts/common.sh@343 -- # case "$op" in 00:07:07.522 00:05:32 -- scripts/common.sh@344 -- # : 1 00:07:07.522 00:05:32 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:07.522 00:05:32 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:07.522 00:05:32 -- scripts/common.sh@364 -- # decimal 1 00:07:07.522 00:05:32 -- scripts/common.sh@352 -- # local d=1 00:07:07.522 00:05:32 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:07.522 00:05:32 -- scripts/common.sh@354 -- # echo 1 00:07:07.522 00:05:32 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:07.522 00:05:32 -- scripts/common.sh@365 -- # decimal 2 00:07:07.522 00:05:32 -- scripts/common.sh@352 -- # local d=2 00:07:07.522 00:05:32 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:07.522 00:05:32 -- scripts/common.sh@354 -- # echo 2 00:07:07.522 00:05:32 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:07.522 00:05:32 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:07.522 00:05:32 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:07.522 00:05:32 -- scripts/common.sh@367 -- # return 0 00:07:07.522 00:05:32 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:07.522 00:05:32 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:07.522 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.522 --rc genhtml_branch_coverage=1 00:07:07.522 --rc genhtml_function_coverage=1 00:07:07.522 --rc genhtml_legend=1 00:07:07.522 --rc geninfo_all_blocks=1 00:07:07.522 --rc geninfo_unexecuted_blocks=1 00:07:07.522 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.522 ' 00:07:07.522 00:05:32 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:07.522 --rc 
lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.522 --rc genhtml_branch_coverage=1 00:07:07.522 --rc genhtml_function_coverage=1 00:07:07.522 --rc genhtml_legend=1 00:07:07.522 --rc geninfo_all_blocks=1 00:07:07.522 --rc geninfo_unexecuted_blocks=1 00:07:07.522 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.522 ' 00:07:07.522 00:05:32 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:07.522 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.522 --rc genhtml_branch_coverage=1 00:07:07.522 --rc genhtml_function_coverage=1 00:07:07.522 --rc genhtml_legend=1 00:07:07.522 --rc geninfo_all_blocks=1 00:07:07.522 --rc geninfo_unexecuted_blocks=1 00:07:07.522 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.522 ' 00:07:07.522 00:05:32 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:07.522 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.522 --rc genhtml_branch_coverage=1 00:07:07.522 --rc genhtml_function_coverage=1 00:07:07.522 --rc genhtml_legend=1 00:07:07.522 --rc geninfo_all_blocks=1 00:07:07.522 --rc geninfo_unexecuted_blocks=1 00:07:07.522 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.522 ' 00:07:07.522 00:05:32 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:07.522 00:05:32 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:07.522 00:05:32 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:07.522 00:05:32 -- common/autotest_common.sh@34 -- # set -e 00:07:07.522 00:05:32 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:07.523 00:05:32 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:07.523 00:05:32 -- common/autotest_common.sh@38 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:07.523 00:05:32 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:07.523 00:05:32 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:07.523 00:05:32 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:07.523 00:05:32 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:07.523 00:05:32 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:07.523 00:05:32 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:07.523 00:05:32 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:07.523 00:05:32 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:07.523 00:05:32 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:07.523 00:05:32 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:07.523 00:05:32 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:07.523 00:05:32 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:07.523 00:05:32 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:07.523 00:05:32 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:07.523 00:05:32 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:07.523 00:05:32 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:07.523 00:05:32 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:07.523 00:05:32 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:07.523 00:05:32 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:07.523 00:05:32 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:07.523 00:05:32 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:07.523 00:05:32 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:07.523 00:05:32 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:07.523 00:05:32 -- common/build_config.sh@23 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:07.523 00:05:32 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:07.523 00:05:32 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:07.523 00:05:32 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:07.523 00:05:32 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:07.523 00:05:32 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:07.523 00:05:32 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:07.523 00:05:32 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:07.523 00:05:32 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:07.523 00:05:32 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:07.523 00:05:32 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:07.523 00:05:32 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:07.523 00:05:32 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:07.523 00:05:32 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:07.523 00:05:32 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:07.523 00:05:32 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:07.523 00:05:32 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:07.523 00:05:32 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:07.523 00:05:32 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:07.523 00:05:32 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:07.523 00:05:32 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:07.523 00:05:32 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:07.523 00:05:32 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:07.523 00:05:32 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:07.523 00:05:32 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:07.523 00:05:32 -- common/build_config.sh@48 
-- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:07.523 00:05:32 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:07.523 00:05:32 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:07.523 00:05:32 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:07.523 00:05:32 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:07.523 00:05:32 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:07.523 00:05:32 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:07.523 00:05:32 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:07.523 00:05:32 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:07.523 00:05:32 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:07.523 00:05:32 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:07.523 00:05:32 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:07.523 00:05:32 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:07.523 00:05:32 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:07:07.523 00:05:32 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:07.523 00:05:32 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:07.523 00:05:32 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:07.523 00:05:32 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:07.523 00:05:32 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:07.523 00:05:32 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:07.523 00:05:32 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:07.523 00:05:32 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:07.523 00:05:32 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:07.523 00:05:32 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:07.523 00:05:32 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:07.523 00:05:32 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:07.523 00:05:32 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:07.523 00:05:32 -- 
common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:07.523 00:05:32 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:07.523 00:05:32 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:07.523 00:05:32 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:07.523 00:05:32 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:07.523 00:05:32 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:07.523 00:05:32 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:07.523 00:05:32 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:07.523 00:05:32 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:07.523 00:05:32 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:07.523 00:05:32 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:07.523 00:05:32 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:07.523 00:05:32 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:07.523 00:05:32 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:07.523 00:05:32 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:07.523 00:05:32 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:07.523 00:05:32 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:07.523 00:05:32 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:07.523 00:05:32 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:07.523 00:05:32 -- 
common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:07.523 00:05:32 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:07.523 #define SPDK_CONFIG_H 00:07:07.523 #define SPDK_CONFIG_APPS 1 00:07:07.523 #define SPDK_CONFIG_ARCH native 00:07:07.523 #undef SPDK_CONFIG_ASAN 00:07:07.523 #undef SPDK_CONFIG_AVAHI 00:07:07.523 #undef SPDK_CONFIG_CET 00:07:07.523 #define SPDK_CONFIG_COVERAGE 1 00:07:07.523 #define SPDK_CONFIG_CROSS_PREFIX 00:07:07.523 #undef SPDK_CONFIG_CRYPTO 00:07:07.523 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:07.523 #undef SPDK_CONFIG_CUSTOMOCF 00:07:07.523 #undef SPDK_CONFIG_DAOS 00:07:07.523 #define SPDK_CONFIG_DAOS_DIR 00:07:07.523 #define SPDK_CONFIG_DEBUG 1 00:07:07.523 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:07.523 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:07.523 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:07.523 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:07.523 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:07.523 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:07.523 #define SPDK_CONFIG_EXAMPLES 1 00:07:07.523 #undef SPDK_CONFIG_FC 00:07:07.523 #define SPDK_CONFIG_FC_PATH 00:07:07.523 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:07.523 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:07.523 #undef SPDK_CONFIG_FUSE 00:07:07.523 #define SPDK_CONFIG_FUZZER 1 00:07:07.523 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:07.523 #undef SPDK_CONFIG_GOLANG 00:07:07.523 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:07.523 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:07.523 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:07.523 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:07.523 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:07.523 #define SPDK_CONFIG_IDXD 1 00:07:07.523 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:07.523 #undef 
SPDK_CONFIG_IPSEC_MB 00:07:07.523 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:07.523 #define SPDK_CONFIG_ISAL 1 00:07:07.523 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:07.523 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:07.523 #define SPDK_CONFIG_LIBDIR 00:07:07.523 #undef SPDK_CONFIG_LTO 00:07:07.523 #define SPDK_CONFIG_MAX_LCORES 00:07:07.523 #define SPDK_CONFIG_NVME_CUSE 1 00:07:07.523 #undef SPDK_CONFIG_OCF 00:07:07.523 #define SPDK_CONFIG_OCF_PATH 00:07:07.523 #define SPDK_CONFIG_OPENSSL_PATH 00:07:07.523 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:07.523 #undef SPDK_CONFIG_PGO_USE 00:07:07.524 #define SPDK_CONFIG_PREFIX /usr/local 00:07:07.524 #undef SPDK_CONFIG_RAID5F 00:07:07.524 #undef SPDK_CONFIG_RBD 00:07:07.524 #define SPDK_CONFIG_RDMA 1 00:07:07.524 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:07.524 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:07.524 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:07.524 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:07.524 #undef SPDK_CONFIG_SHARED 00:07:07.524 #undef SPDK_CONFIG_SMA 00:07:07.524 #define SPDK_CONFIG_TESTS 1 00:07:07.524 #undef SPDK_CONFIG_TSAN 00:07:07.524 #define SPDK_CONFIG_UBLK 1 00:07:07.524 #define SPDK_CONFIG_UBSAN 1 00:07:07.524 #undef SPDK_CONFIG_UNIT_TESTS 00:07:07.524 #undef SPDK_CONFIG_URING 00:07:07.524 #define SPDK_CONFIG_URING_PATH 00:07:07.524 #undef SPDK_CONFIG_URING_ZNS 00:07:07.524 #undef SPDK_CONFIG_USDT 00:07:07.524 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:07.524 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:07.524 #define SPDK_CONFIG_VFIO_USER 1 00:07:07.524 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:07.524 #define SPDK_CONFIG_VHOST 1 00:07:07.524 #define SPDK_CONFIG_VIRTIO 1 00:07:07.524 #undef SPDK_CONFIG_VTUNE 00:07:07.524 #define SPDK_CONFIG_VTUNE_DIR 00:07:07.524 #define SPDK_CONFIG_WERROR 1 00:07:07.524 #define SPDK_CONFIG_WPDK_DIR 00:07:07.524 #undef SPDK_CONFIG_XNVME 00:07:07.524 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:07.524 00:05:32 -- 
common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:07.524 00:05:32 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:07.524 00:05:32 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:07.524 00:05:32 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:07.524 00:05:32 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:07.524 00:05:32 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:07.524 00:05:32 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:07.524 00:05:32 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:07.524 
00:05:32 -- paths/export.sh@5 -- # export PATH 00:07:07.524 00:05:32 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:07.524 00:05:32 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:07.524 00:05:32 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:07.524 00:05:32 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:07.524 00:05:32 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:07.524 00:05:32 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:07.524 00:05:32 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:07.524 00:05:32 -- pm/common@16 -- # TEST_TAG=N/A 00:07:07.524 00:05:32 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:07.524 00:05:32 -- common/autotest_common.sh@52 -- # : 1 00:07:07.524 00:05:32 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:07.524 00:05:32 -- common/autotest_common.sh@56 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:07.524 00:05:32 -- common/autotest_common.sh@58 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:07.524 00:05:32 -- common/autotest_common.sh@60 -- # : 1 
00:07:07.524 00:05:32 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:07.524 00:05:32 -- common/autotest_common.sh@62 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:07.524 00:05:32 -- common/autotest_common.sh@64 -- # : 00:07:07.524 00:05:32 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:07.524 00:05:32 -- common/autotest_common.sh@66 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:07.524 00:05:32 -- common/autotest_common.sh@68 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:07.524 00:05:32 -- common/autotest_common.sh@70 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:07.524 00:05:32 -- common/autotest_common.sh@72 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:07.524 00:05:32 -- common/autotest_common.sh@74 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:07.524 00:05:32 -- common/autotest_common.sh@76 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:07.524 00:05:32 -- common/autotest_common.sh@78 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:07.524 00:05:32 -- common/autotest_common.sh@80 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:07.524 00:05:32 -- common/autotest_common.sh@82 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:07.524 00:05:32 -- common/autotest_common.sh@84 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:07.524 00:05:32 -- common/autotest_common.sh@86 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@87 -- # export 
SPDK_TEST_NVMF 00:07:07.524 00:05:32 -- common/autotest_common.sh@88 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:07.524 00:05:32 -- common/autotest_common.sh@90 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:07.524 00:05:32 -- common/autotest_common.sh@92 -- # : 1 00:07:07.524 00:05:32 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:07.524 00:05:32 -- common/autotest_common.sh@94 -- # : 1 00:07:07.524 00:05:32 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:07.524 00:05:32 -- common/autotest_common.sh@96 -- # : rdma 00:07:07.524 00:05:32 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:07.524 00:05:32 -- common/autotest_common.sh@98 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:07.524 00:05:32 -- common/autotest_common.sh@100 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:07.524 00:05:32 -- common/autotest_common.sh@102 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:07.524 00:05:32 -- common/autotest_common.sh@104 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:07.524 00:05:32 -- common/autotest_common.sh@106 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:07.524 00:05:32 -- common/autotest_common.sh@108 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:07.524 00:05:32 -- common/autotest_common.sh@110 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:07.524 00:05:32 -- common/autotest_common.sh@112 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:07.524 00:05:32 -- 
common/autotest_common.sh@114 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:07.524 00:05:32 -- common/autotest_common.sh@116 -- # : 1 00:07:07.524 00:05:32 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:07.524 00:05:32 -- common/autotest_common.sh@118 -- # : 00:07:07.524 00:05:32 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:07.524 00:05:32 -- common/autotest_common.sh@120 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:07.524 00:05:32 -- common/autotest_common.sh@122 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:07.524 00:05:32 -- common/autotest_common.sh@124 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:07.524 00:05:32 -- common/autotest_common.sh@126 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:07.524 00:05:32 -- common/autotest_common.sh@128 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:07.524 00:05:32 -- common/autotest_common.sh@130 -- # : 0 00:07:07.524 00:05:32 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:07.524 00:05:32 -- common/autotest_common.sh@132 -- # : 00:07:07.524 00:05:32 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:07.524 00:05:32 -- common/autotest_common.sh@134 -- # : true 00:07:07.524 00:05:32 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:07.524 00:05:32 -- common/autotest_common.sh@136 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:07.525 00:05:32 -- common/autotest_common.sh@138 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:07.525 00:05:32 -- common/autotest_common.sh@140 -- # : 0 00:07:07.525 00:05:32 -- 
common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:07.525 00:05:32 -- common/autotest_common.sh@142 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:07.525 00:05:32 -- common/autotest_common.sh@144 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:07.525 00:05:32 -- common/autotest_common.sh@146 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:07.525 00:05:32 -- common/autotest_common.sh@148 -- # : 00:07:07.525 00:05:32 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:07.525 00:05:32 -- common/autotest_common.sh@150 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:07.525 00:05:32 -- common/autotest_common.sh@152 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:07.525 00:05:32 -- common/autotest_common.sh@154 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:07.525 00:05:32 -- common/autotest_common.sh@156 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:07.525 00:05:32 -- common/autotest_common.sh@158 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:07.525 00:05:32 -- common/autotest_common.sh@160 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:07.525 00:05:32 -- common/autotest_common.sh@163 -- # : 00:07:07.525 00:05:32 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:07.525 00:05:32 -- common/autotest_common.sh@165 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:07.525 00:05:32 -- common/autotest_common.sh@167 -- # : 0 00:07:07.525 00:05:32 -- common/autotest_common.sh@168 -- # export 
SPDK_JSONRPC_GO_CLIENT 00:07:07.525 00:05:32 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:07.525 00:05:32 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:07.525 00:05:32 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:07.525 00:05:32 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:07.525 00:05:32 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:07.525 00:05:32 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:07.525 00:05:32 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:07.525 00:05:32 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:07.525 00:05:32 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:07.525 00:05:32 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:07.525 00:05:32 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:07.525 00:05:32 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:07.525 00:05:32 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:07.525 00:05:32 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:07.525 00:05:32 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:07.525 00:05:32 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:07.525 00:05:32 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:07.525 00:05:32 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:07.525 00:05:32 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:07.525 00:05:32 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:07.525 00:05:32 -- common/autotest_common.sh@196 -- # cat 00:07:07.525 00:05:32 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:07.525 00:05:32 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:07.525 00:05:32 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:07.525 00:05:32 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:07.525 00:05:32 -- 
common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:07.525 00:05:32 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:07.525 00:05:32 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:07.525 00:05:32 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:07.525 00:05:32 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:07.525 00:05:32 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:07.525 00:05:32 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:07.525 00:05:32 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:07.525 00:05:32 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:07.525 00:05:32 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:07.525 00:05:32 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:07.525 00:05:32 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:07.525 00:05:32 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:07.525 00:05:32 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:07.525 00:05:32 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:07.525 00:05:32 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:07.525 00:05:32 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:07.525 00:05:32 -- 
common/autotest_common.sh@249 -- # _LCOV= 00:07:07.525 00:05:32 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:07.525 00:05:32 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:07.525 00:05:32 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:07.525 00:05:32 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:07.525 00:05:32 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:07.525 00:05:32 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:07.525 00:05:32 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:07.525 00:05:32 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:07.525 00:05:32 -- common/autotest_common.sh@259 -- # valgrind= 00:07:07.525 00:05:32 -- common/autotest_common.sh@265 -- # uname -s 00:07:07.525 00:05:32 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:07.525 00:05:32 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:07.525 00:05:32 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:07.525 00:05:32 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:07.525 00:05:32 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:07.525 00:05:32 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:07.525 00:05:32 -- common/autotest_common.sh@275 -- # MAKE=make 00:07:07.525 00:05:32 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:07.525 00:05:32 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:07.525 00:05:32 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:07.525 00:05:32 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:07.525 00:05:32 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:07.526 00:05:32 -- common/autotest_common.sh@300 -- 
# TEST_MODE= 00:07:07.526 00:05:32 -- common/autotest_common.sh@319 -- # [[ -z 2720202 ]] 00:07:07.526 00:05:32 -- common/autotest_common.sh@319 -- # kill -0 2720202 00:07:07.526 00:05:32 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:07.526 00:05:32 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:07.526 00:05:32 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:07.526 00:05:32 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:07.526 00:05:32 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:07.526 00:05:33 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:07.526 00:05:33 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:07.526 00:05:33 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:07.526 00:05:33 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.FQ3yYz 00:07:07.526 00:05:33 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:07.526 00:05:33 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:07.526 00:05:33 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:07.526 00:05:33 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.FQ3yYz/tests/nvmf /tmp/spdk.FQ3yYz 00:07:07.526 00:05:33 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:07.526 00:05:33 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:07.526 00:05:33 -- common/autotest_common.sh@328 -- # df -T 00:07:07.526 00:05:33 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:07.526 00:05:33 -- 
common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:07.526 00:05:33 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:07.526 00:05:33 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:07.526 00:05:33 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:07.526 00:05:33 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # avails["$mount"]=53338042368 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:07:07.526 00:05:33 -- common/autotest_common.sh@364 -- # uses["$mount"]=8392564736 00:07:07.526 00:05:33 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:07:07.526 00:05:33 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:07:07.526 00:05:33 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:07.526 
00:05:33 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:07:07.526 00:05:33 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:07:07.526 00:05:33 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863339520 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:07:07.526 00:05:33 -- common/autotest_common.sh@364 -- # uses["$mount"]=1966080 00:07:07.526 00:05:33 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:07.526 00:05:33 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:07.526 00:05:33 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:07.526 00:05:33 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:07.526 00:05:33 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:07.526 00:05:33 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:07.526 * Looking for test storage... 
00:07:07.526 00:05:33 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:07.526 00:05:33 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:07.526 00:05:33 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:07.526 00:05:33 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:07.526 00:05:33 -- common/autotest_common.sh@373 -- # mount=/ 00:07:07.526 00:05:33 -- common/autotest_common.sh@375 -- # target_space=53338042368 00:07:07.526 00:05:33 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:07.526 00:05:33 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:07.526 00:05:33 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:07.526 00:05:33 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:07.526 00:05:33 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:07.526 00:05:33 -- common/autotest_common.sh@382 -- # new_size=10607157248 00:07:07.526 00:05:33 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:07.526 00:05:33 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:07.526 00:05:33 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:07.526 00:05:33 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:07.526 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:07.526 00:05:33 -- common/autotest_common.sh@390 -- # return 0 00:07:07.526 00:05:33 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:07.526 00:05:33 -- common/autotest_common.sh@1678 -- # 
shopt -s extdebug 00:07:07.526 00:05:33 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:07.526 00:05:33 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:07.526 00:05:33 -- common/autotest_common.sh@1682 -- # true 00:07:07.526 00:05:33 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:07.526 00:05:33 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:07.526 00:05:33 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:07.526 00:05:33 -- common/autotest_common.sh@27 -- # exec 00:07:07.526 00:05:33 -- common/autotest_common.sh@29 -- # exec 00:07:07.526 00:05:33 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:07.526 00:05:33 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:07.526 00:05:33 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:07.526 00:05:33 -- common/autotest_common.sh@18 -- # set -x 00:07:07.526 00:05:33 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:07.526 00:05:33 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:07.526 00:05:33 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:07.787 00:05:33 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:07.787 00:05:33 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:07.787 00:05:33 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:07.787 00:05:33 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:07.787 00:05:33 -- scripts/common.sh@335 -- # IFS=.-: 00:07:07.787 00:05:33 -- scripts/common.sh@335 -- # read -ra ver1 00:07:07.787 00:05:33 -- scripts/common.sh@336 -- # IFS=.-: 00:07:07.787 00:05:33 -- scripts/common.sh@336 -- # read -ra ver2 00:07:07.787 00:05:33 -- scripts/common.sh@337 -- # local 'op=<' 00:07:07.787 00:05:33 -- scripts/common.sh@339 -- # ver1_l=2 00:07:07.787 00:05:33 -- scripts/common.sh@340 -- # ver2_l=1 00:07:07.787 00:05:33 -- scripts/common.sh@342 -- # local 
lt=0 gt=0 eq=0 v 00:07:07.787 00:05:33 -- scripts/common.sh@343 -- # case "$op" in 00:07:07.787 00:05:33 -- scripts/common.sh@344 -- # : 1 00:07:07.787 00:05:33 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:07.787 00:05:33 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:07.787 00:05:33 -- scripts/common.sh@364 -- # decimal 1 00:07:07.787 00:05:33 -- scripts/common.sh@352 -- # local d=1 00:07:07.787 00:05:33 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:07.787 00:05:33 -- scripts/common.sh@354 -- # echo 1 00:07:07.787 00:05:33 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:07.787 00:05:33 -- scripts/common.sh@365 -- # decimal 2 00:07:07.787 00:05:33 -- scripts/common.sh@352 -- # local d=2 00:07:07.787 00:05:33 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:07.787 00:05:33 -- scripts/common.sh@354 -- # echo 2 00:07:07.787 00:05:33 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:07.787 00:05:33 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:07.787 00:05:33 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:07.787 00:05:33 -- scripts/common.sh@367 -- # return 0 00:07:07.787 00:05:33 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:07.787 00:05:33 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:07.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.787 --rc genhtml_branch_coverage=1 00:07:07.787 --rc genhtml_function_coverage=1 00:07:07.787 --rc genhtml_legend=1 00:07:07.787 --rc geninfo_all_blocks=1 00:07:07.787 --rc geninfo_unexecuted_blocks=1 00:07:07.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.787 ' 00:07:07.787 00:05:33 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:07.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.787 --rc genhtml_branch_coverage=1 00:07:07.787 --rc genhtml_function_coverage=1 
00:07:07.787 --rc genhtml_legend=1 00:07:07.787 --rc geninfo_all_blocks=1 00:07:07.787 --rc geninfo_unexecuted_blocks=1 00:07:07.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.787 ' 00:07:07.787 00:05:33 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:07.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.787 --rc genhtml_branch_coverage=1 00:07:07.787 --rc genhtml_function_coverage=1 00:07:07.787 --rc genhtml_legend=1 00:07:07.787 --rc geninfo_all_blocks=1 00:07:07.787 --rc geninfo_unexecuted_blocks=1 00:07:07.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.787 ' 00:07:07.787 00:05:33 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:07.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.787 --rc genhtml_branch_coverage=1 00:07:07.787 --rc genhtml_function_coverage=1 00:07:07.787 --rc genhtml_legend=1 00:07:07.787 --rc geninfo_all_blocks=1 00:07:07.787 --rc geninfo_unexecuted_blocks=1 00:07:07.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.787 ' 00:07:07.787 00:05:33 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:07.787 00:05:33 -- ../common.sh@8 -- # pids=() 00:07:07.787 00:05:33 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:07.787 00:05:33 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:07.787 00:05:33 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:07.787 00:05:33 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:07.787 00:05:33 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:07.787 00:05:33 -- nvmf/run.sh@61 -- # mem_size=512 00:07:07.787 
00:05:33 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:07.787 00:05:33 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:07.787 00:05:33 -- ../common.sh@69 -- # local fuzz_num=25 00:07:07.787 00:05:33 -- ../common.sh@70 -- # local time=1 00:07:07.787 00:05:33 -- ../common.sh@72 -- # (( i = 0 )) 00:07:07.787 00:05:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:07.787 00:05:33 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:07.787 00:05:33 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:07.787 00:05:33 -- nvmf/run.sh@24 -- # local timen=1 00:07:07.787 00:05:33 -- nvmf/run.sh@25 -- # local core=0x1 00:07:07.787 00:05:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:07.787 00:05:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:07.787 00:05:33 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:07.787 00:05:33 -- nvmf/run.sh@29 -- # port=4400 00:07:07.787 00:05:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:07.787 00:05:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:07.787 00:05:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:07.787 00:05:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:07.787 [2024-11-30 00:05:33.198532] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:07.787 [2024-11-30 00:05:33.198640] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2720311 ] 00:07:07.787 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.046 [2024-11-30 00:05:33.462380] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.046 [2024-11-30 00:05:33.546563] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:08.046 [2024-11-30 00:05:33.546734] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.305 [2024-11-30 00:05:33.605120] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:08.305 [2024-11-30 00:05:33.621494] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:08.305 INFO: Running with entropic power schedule (0xFF, 100). 00:07:08.305 INFO: Seed: 338071910 00:07:08.305 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:08.305 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:08.305 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:08.305 INFO: A corpus is not provided, starting from an empty corpus 00:07:08.305 #2 INITED exec/s: 0 rss: 60Mb 00:07:08.305 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:08.305 This may also happen if the target rejected all inputs we tried so far 00:07:08.305 [2024-11-30 00:05:33.666580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.305 [2024-11-30 00:05:33.666616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.565 NEW_FUNC[1/671]: 0x43a858 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:08.565 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:08.565 #4 NEW cov: 11558 ft: 11547 corp: 2/116b lim: 320 exec/s: 0 rss: 69Mb L: 115/115 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:08.565 [2024-11-30 00:05:34.008388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.565 [2024-11-30 00:05:34.008454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.565 #5 NEW cov: 11671 ft: 12311 corp: 3/192b lim: 320 exec/s: 0 rss: 69Mb L: 76/115 MS: 1 EraseBytes- 00:07:08.565 [2024-11-30 00:05:34.058287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.565 [2024-11-30 00:05:34.058319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.565 #6 NEW cov: 11677 ft: 12558 corp: 4/307b lim: 320 exec/s: 0 rss: 69Mb L: 115/115 MS: 1 CopyPart- 00:07:08.565 [2024-11-30 00:05:34.098286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN 
COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:08.565 [2024-11-30 00:05:34.098317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.824 NEW_FUNC[1/1]: 0x12c5228 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2016 00:07:08.824 #8 NEW cov: 11794 ft: 13073 corp: 5/410b lim: 320 exec/s: 0 rss: 69Mb L: 103/115 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:08.824 [2024-11-30 00:05:34.138351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff41ffff 00:07:08.824 [2024-11-30 00:05:34.138378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.824 #9 NEW cov: 11794 ft: 13167 corp: 6/513b lim: 320 exec/s: 0 rss: 69Mb L: 103/115 MS: 1 ChangeByte- 00:07:08.824 [2024-11-30 00:05:34.188740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.824 [2024-11-30 00:05:34.188768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.824 #10 NEW cov: 11794 ft: 13218 corp: 7/628b lim: 320 exec/s: 0 rss: 69Mb L: 115/115 MS: 1 CopyPart- 00:07:08.824 [2024-11-30 00:05:34.228832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.824 [2024-11-30 00:05:34.228859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.824 #11 NEW cov: 11794 ft: 13282 corp: 8/743b lim: 320 exec/s: 
0 rss: 69Mb L: 115/115 MS: 1 ShuffleBytes- 00:07:08.824 [2024-11-30 00:05:34.268722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x201ff41ffff 00:07:08.824 [2024-11-30 00:05:34.268751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.824 #12 NEW cov: 11794 ft: 13313 corp: 9/846b lim: 320 exec/s: 0 rss: 69Mb L: 103/115 MS: 1 CMP- DE: "\001\002\000\000"- 00:07:08.824 [2024-11-30 00:05:34.318994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:08.824 [2024-11-30 00:05:34.319021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.824 #13 NEW cov: 11794 ft: 13323 corp: 10/941b lim: 320 exec/s: 0 rss: 69Mb L: 95/115 MS: 1 EraseBytes- 00:07:08.824 [2024-11-30 00:05:34.359154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.824 [2024-11-30 00:05:34.359182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.084 #14 NEW cov: 11794 ft: 13384 corp: 11/1056b lim: 320 exec/s: 0 rss: 69Mb L: 115/115 MS: 1 ShuffleBytes- 00:07:09.084 [2024-11-30 00:05:34.399349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.084 [2024-11-30 00:05:34.399377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.084 [2024-11-30 
00:05:34.399500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:09.084 [2024-11-30 00:05:34.399518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.084 #15 NEW cov: 11794 ft: 13599 corp: 12/1243b lim: 320 exec/s: 0 rss: 69Mb L: 187/187 MS: 1 CrossOver- 00:07:09.084 [2024-11-30 00:05:34.439502] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.084 [2024-11-30 00:05:34.439530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.084 [2024-11-30 00:05:34.439627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.084 [2024-11-30 00:05:34.439647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.084 #16 NEW cov: 11811 ft: 13674 corp: 13/1372b lim: 320 exec/s: 0 rss: 69Mb L: 129/187 MS: 1 InsertRepeatedBytes- 00:07:09.084 [2024-11-30 00:05:34.479544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.084 [2024-11-30 00:05:34.479571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.084 #17 NEW cov: 11811 ft: 13712 corp: 14/1487b lim: 320 exec/s: 0 rss: 69Mb L: 115/187 MS: 1 CopyPart- 00:07:09.084 [2024-11-30 00:05:34.519722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 
nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.084 [2024-11-30 00:05:34.519750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.084 [2024-11-30 00:05:34.519854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff41ffffffffff 00:07:09.085 [2024-11-30 00:05:34.519885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.085 #18 NEW cov: 11811 ft: 13764 corp: 15/1657b lim: 320 exec/s: 0 rss: 69Mb L: 170/187 MS: 1 CrossOver- 00:07:09.085 [2024-11-30 00:05:34.559868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.085 [2024-11-30 00:05:34.559896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.085 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:09.085 #19 NEW cov: 11834 ft: 13843 corp: 16/1772b lim: 320 exec/s: 0 rss: 70Mb L: 115/187 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:07:09.085 [2024-11-30 00:05:34.609873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.085 [2024-11-30 00:05:34.609901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.085 #20 NEW cov: 11834 ft: 13859 corp: 17/1882b lim: 320 exec/s: 0 rss: 70Mb L: 110/187 MS: 1 EraseBytes- 00:07:09.345 [2024-11-30 00:05:34.649675] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.345 [2024-11-30 00:05:34.649703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.345 [2024-11-30 00:05:34.649803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff41ffffffffff 00:07:09.345 [2024-11-30 00:05:34.649822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.345 #21 NEW cov: 11834 ft: 13882 corp: 18/2052b lim: 320 exec/s: 21 rss: 70Mb L: 170/187 MS: 1 ShuffleBytes- 00:07:09.345 [2024-11-30 00:05:34.690147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.345 [2024-11-30 00:05:34.690173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.345 #22 NEW cov: 11834 ft: 13895 corp: 19/2167b lim: 320 exec/s: 22 rss: 70Mb L: 115/187 MS: 1 ChangeByte- 00:07:09.345 [2024-11-30 00:05:34.730280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:09.345 [2024-11-30 00:05:34.730307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.345 #23 NEW cov: 11834 ft: 13907 corp: 20/2283b lim: 320 exec/s: 23 rss: 70Mb L: 116/187 MS: 1 CrossOver- 00:07:09.345 [2024-11-30 00:05:34.770660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:ffffffff cdw11:ffffffff 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.345 [2024-11-30 00:05:34.770689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.345 [2024-11-30 00:05:34.770795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:09.345 [2024-11-30 00:05:34.770812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.345 #24 NEW cov: 11834 ft: 13940 corp: 21/2450b lim: 320 exec/s: 24 rss: 70Mb L: 167/187 MS: 1 InsertRepeatedBytes- 00:07:09.345 [2024-11-30 00:05:34.810725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.345 [2024-11-30 00:05:34.810752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.345 [2024-11-30 00:05:34.810860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:09.345 [2024-11-30 00:05:34.810877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.345 #25 NEW cov: 11834 ft: 13957 corp: 22/2617b lim: 320 exec/s: 25 rss: 70Mb L: 167/187 MS: 1 ShuffleBytes- 00:07:09.345 [2024-11-30 00:05:34.860724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.345 [2024-11-30 00:05:34.860751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:09.345 [2024-11-30 00:05:34.860849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:09.345 [2024-11-30 00:05:34.860865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.345 #26 NEW cov: 11834 ft: 14047 corp: 23/2804b lim: 320 exec/s: 26 rss: 70Mb L: 187/187 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:09.604 [2024-11-30 00:05:34.900908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.605 [2024-11-30 00:05:34.900936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.605 #27 NEW cov: 11834 ft: 14136 corp: 24/2919b lim: 320 exec/s: 27 rss: 70Mb L: 115/187 MS: 1 ShuffleBytes- 00:07:09.605 [2024-11-30 00:05:34.940992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.605 [2024-11-30 00:05:34.941018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.605 [2024-11-30 00:05:34.941111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff41ffffffffff 00:07:09.605 [2024-11-30 00:05:34.941127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.605 #28 NEW cov: 11834 ft: 14138 corp: 25/3090b lim: 320 exec/s: 28 rss: 70Mb L: 171/187 MS: 1 InsertByte- 00:07:09.605 [2024-11-30 00:05:34.981212] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.605 [2024-11-30 00:05:34.981240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.605 [2024-11-30 00:05:34.981347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.605 [2024-11-30 00:05:34.981363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.605 #29 NEW cov: 11834 ft: 14208 corp: 26/3219b lim: 320 exec/s: 29 rss: 70Mb L: 129/187 MS: 1 ShuffleBytes- 00:07:09.605 [2024-11-30 00:05:35.021256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.605 [2024-11-30 00:05:35.021282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.605 [2024-11-30 00:05:35.021376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:09.605 [2024-11-30 00:05:35.021393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.605 #30 NEW cov: 11834 ft: 14267 corp: 27/3406b lim: 320 exec/s: 30 rss: 70Mb L: 187/187 MS: 1 ChangeByte- 00:07:09.605 [2024-11-30 00:05:35.061338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.605 [2024-11-30 
00:05:35.061365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.605 [2024-11-30 00:05:35.061460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff41ffffffffff 00:07:09.605 [2024-11-30 00:05:35.061478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.605 #31 NEW cov: 11834 ft: 14272 corp: 28/3576b lim: 320 exec/s: 31 rss: 70Mb L: 170/187 MS: 1 ChangeBit- 00:07:09.605 [2024-11-30 00:05:35.101560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.605 [2024-11-30 00:05:35.101586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.605 [2024-11-30 00:05:35.101667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.605 [2024-11-30 00:05:35.101686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.605 #34 NEW cov: 11834 ft: 14293 corp: 29/3761b lim: 320 exec/s: 34 rss: 70Mb L: 185/187 MS: 3 EraseBytes-ChangeByte-CrossOver- 00:07:09.605 [2024-11-30 00:05:35.141392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffff2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x201ff41ffff 00:07:09.605 [2024-11-30 00:05:35.141419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.868 #35 NEW cov: 11834 ft: 14320 corp: 
30/3865b lim: 320 exec/s: 35 rss: 70Mb L: 104/187 MS: 1 InsertByte- 00:07:09.868 [2024-11-30 00:05:35.182095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.868 [2024-11-30 00:05:35.182122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 00:05:35.182235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.868 [2024-11-30 00:05:35.182251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 00:05:35.182375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:6 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.868 [2024-11-30 00:05:35.182392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.868 #36 NEW cov: 11834 ft: 14949 corp: 31/4076b lim: 320 exec/s: 36 rss: 70Mb L: 211/211 MS: 1 InsertRepeatedBytes- 00:07:09.868 [2024-11-30 00:05:35.221969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.868 [2024-11-30 00:05:35.221995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 00:05:35.222102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.868 [2024-11-30 00:05:35.222120] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 00:05:35.222255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:09.868 [2024-11-30 00:05:35.222271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.868 #37 NEW cov: 11834 ft: 14981 corp: 32/4284b lim: 320 exec/s: 37 rss: 70Mb L: 208/211 MS: 1 InsertRepeatedBytes- 00:07:09.868 [2024-11-30 00:05:35.261964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.868 [2024-11-30 00:05:35.261991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 00:05:35.262095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:09.868 [2024-11-30 00:05:35.262111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.868 #38 NEW cov: 11834 ft: 14986 corp: 33/4475b lim: 320 exec/s: 38 rss: 70Mb L: 191/211 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:07:09.868 [2024-11-30 00:05:35.301989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.868 [2024-11-30 00:05:35.302021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 
00:05:35.302114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff41ffffffffff 00:07:09.868 [2024-11-30 00:05:35.302130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.868 #39 NEW cov: 11834 ft: 14987 corp: 34/4646b lim: 320 exec/s: 39 rss: 70Mb L: 171/211 MS: 1 ChangeBinInt- 00:07:09.868 [2024-11-30 00:05:35.342111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.868 [2024-11-30 00:05:35.342137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 00:05:35.342233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff41ffffffffff 00:07:09.868 [2024-11-30 00:05:35.342249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.868 #40 NEW cov: 11834 ft: 14994 corp: 35/4818b lim: 320 exec/s: 40 rss: 70Mb L: 172/211 MS: 1 CMP- DE: "\001\001"- 00:07:09.868 [2024-11-30 00:05:35.381828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.868 [2024-11-30 00:05:35.381854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 00:05:35.381954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x6565656565656565 00:07:09.868 [2024-11-30 00:05:35.381971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.868 #41 NEW cov: 11834 ft: 15003 corp: 36/5005b lim: 320 exec/s: 41 rss: 70Mb L: 187/211 MS: 1 ChangeBit- 00:07:09.868 [2024-11-30 00:05:35.422423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:09.868 [2024-11-30 00:05:35.422451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.868 [2024-11-30 00:05:35.422559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff41ffffffffff 00:07:09.868 [2024-11-30 00:05:35.422577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.127 #42 NEW cov: 11834 ft: 15050 corp: 37/5175b lim: 320 exec/s: 42 rss: 70Mb L: 170/211 MS: 1 ChangeByte- 00:07:10.127 [2024-11-30 00:05:35.462327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:10.127 [2024-11-30 00:05:35.462357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.127 #43 NEW cov: 11834 ft: 15055 corp: 38/5270b lim: 320 exec/s: 43 rss: 70Mb L: 95/211 MS: 1 ShuffleBytes- 00:07:10.127 [2024-11-30 00:05:35.512799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:10.127 [2024-11-30 00:05:35.512827] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.127 [2024-11-30 00:05:35.512953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:65656565 cdw11:65656565 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6565656565656565 00:07:10.127 [2024-11-30 00:05:35.512970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.127 #44 NEW cov: 11834 ft: 15086 corp: 39/5457b lim: 320 exec/s: 44 rss: 70Mb L: 187/211 MS: 1 ChangeByte- 00:07:10.127 [2024-11-30 00:05:35.552358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.127 [2024-11-30 00:05:35.552387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.127 #45 NEW cov: 11834 ft: 15107 corp: 40/5572b lim: 320 exec/s: 45 rss: 70Mb L: 115/211 MS: 1 ChangeBinInt- 00:07:10.127 [2024-11-30 00:05:35.592473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (65) qid:0 cid:4 nsid:65656565 cdw10:65656565 cdw11:65656565 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.127 [2024-11-30 00:05:35.592502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.127 #46 NEW cov: 11834 ft: 15108 corp: 41/5692b lim: 320 exec/s: 46 rss: 70Mb L: 120/211 MS: 1 InsertRepeatedBytes- 00:07:10.127 [2024-11-30 00:05:35.632471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:10.127 [2024-11-30 00:05:35.632499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.127 #47 NEW cov: 11834 ft: 15116 corp: 42/5774b lim: 320 exec/s: 23 rss: 70Mb L: 82/211 MS: 1 EraseBytes- 00:07:10.127 #47 DONE cov: 11834 ft: 15116 corp: 42/5774b lim: 320 exec/s: 23 rss: 70Mb 00:07:10.127 ###### Recommended dictionary. ###### 00:07:10.127 "\001\002\000\000" # Uses: 2 00:07:10.127 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:10.127 "\001\001" # Uses: 0 00:07:10.127 ###### End of recommended dictionary. ###### 00:07:10.127 Done 47 runs in 2 second(s) 00:07:10.386 00:05:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:10.386 00:05:35 -- ../common.sh@72 -- # (( i++ )) 00:07:10.386 00:05:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:10.386 00:05:35 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:10.386 00:05:35 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:10.386 00:05:35 -- nvmf/run.sh@24 -- # local timen=1 00:07:10.386 00:05:35 -- nvmf/run.sh@25 -- # local core=0x1 00:07:10.386 00:05:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:10.386 00:05:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:10.386 00:05:35 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:10.386 00:05:35 -- nvmf/run.sh@29 -- # port=4401 00:07:10.386 00:05:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:10.386 00:05:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:10.386 00:05:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:10.386 00:05:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 
subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:10.386 [2024-11-30 00:05:35.828659] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:10.386 [2024-11-30 00:05:35.828723] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2720781 ] 00:07:10.386 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.645 [2024-11-30 00:05:36.078733] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.645 [2024-11-30 00:05:36.162852] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:10.645 [2024-11-30 00:05:36.163016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.903 [2024-11-30 00:05:36.221066] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:10.903 [2024-11-30 00:05:36.237437] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:10.903 INFO: Running with entropic power schedule (0xFF, 100). 00:07:10.903 INFO: Seed: 2955068440 00:07:10.903 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:10.903 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:10.903 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:10.903 INFO: A corpus is not provided, starting from an empty corpus 00:07:10.903 #2 INITED exec/s: 0 rss: 60Mb 00:07:10.903 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:10.903 This may also happen if the target rejected all inputs we tried so far 00:07:10.903 [2024-11-30 00:05:36.281899] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:10.903 [2024-11-30 00:05:36.282094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.903 [2024-11-30 00:05:36.282121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.903 [2024-11-30 00:05:36.282153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.903 [2024-11-30 00:05:36.282169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.163 NEW_FUNC[1/671]: 0x43b158 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:11.163 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:11.163 #4 NEW cov: 11656 ft: 11644 corp: 2/15b lim: 30 exec/s: 0 rss: 68Mb L: 14/14 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:11.163 [2024-11-30 00:05:36.602650] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.163 [2024-11-30 00:05:36.602822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.163 [2024-11-30 00:05:36.602851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.163 [2024-11-30 00:05:36.602882] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.163 [2024-11-30 00:05:36.602897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.163 #5 NEW cov: 11769 ft: 12149 corp: 3/29b lim: 30 exec/s: 0 rss: 69Mb L: 14/14 MS: 1 CrossOver- 00:07:11.163 [2024-11-30 00:05:36.672774] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.163 [2024-11-30 00:05:36.672942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a00002d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.163 [2024-11-30 00:05:36.672967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.163 [2024-11-30 00:05:36.672999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.163 [2024-11-30 00:05:36.673015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.163 #6 NEW cov: 11775 ft: 12405 corp: 4/44b lim: 30 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 InsertByte- 00:07:11.423 [2024-11-30 00:05:36.722915] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006969 00:07:11.423 [2024-11-30 00:05:36.722988] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006969 00:07:11.423 [2024-11-30 00:05:36.723048] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006969 00:07:11.423 [2024-11-30 00:05:36.723105] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006969 00:07:11.423 [2024-11-30 00:05:36.723218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:69698169 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.723240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.423 [2024-11-30 00:05:36.723271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:69698169 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.723287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.423 [2024-11-30 00:05:36.723315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:69698169 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.723331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.423 [2024-11-30 00:05:36.723359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:69698169 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.723374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.423 #13 NEW cov: 11866 ft: 13183 corp: 5/72b lim: 30 exec/s: 0 rss: 69Mb L: 28/28 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:11.423 [2024-11-30 00:05:36.783039] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.423 [2024-11-30 00:05:36.783203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.783228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:11.423 [2024-11-30 00:05:36.783260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000bc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.783276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.423 #14 NEW cov: 11866 ft: 13283 corp: 6/87b lim: 30 exec/s: 0 rss: 69Mb L: 15/28 MS: 1 InsertByte- 00:07:11.423 [2024-11-30 00:05:36.833176] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:11.423 [2024-11-30 00:05:36.833338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.833362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.423 [2024-11-30 00:05:36.833393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.833408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.423 #16 NEW cov: 11866 ft: 13406 corp: 7/104b lim: 30 exec/s: 0 rss: 69Mb L: 17/28 MS: 2 CrossOver-CrossOver- 00:07:11.423 [2024-11-30 00:05:36.893350] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.423 [2024-11-30 00:05:36.893616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.893646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:11.423 [2024-11-30 00:05:36.893678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.893695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.423 [2024-11-30 00:05:36.893723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0000003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.893739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.423 [2024-11-30 00:05:36.893768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.423 [2024-11-30 00:05:36.893783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.424 #17 NEW cov: 11866 ft: 13533 corp: 8/132b lim: 30 exec/s: 0 rss: 69Mb L: 28/28 MS: 1 CopyPart- 00:07:11.424 [2024-11-30 00:05:36.963524] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.424 [2024-11-30 00:05:36.963702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.424 [2024-11-30 00:05:36.963728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.424 [2024-11-30 00:05:36.963760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000bc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.424 [2024-11-30 00:05:36.963776] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.683 #18 NEW cov: 11866 ft: 13630 corp: 9/145b lim: 30 exec/s: 0 rss: 69Mb L: 13/28 MS: 1 EraseBytes- 00:07:11.683 [2024-11-30 00:05:37.033695] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:11.683 [2024-11-30 00:05:37.033854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.683 [2024-11-30 00:05:37.033878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.683 [2024-11-30 00:05:37.033907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.683 [2024-11-30 00:05:37.033922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.683 #19 NEW cov: 11866 ft: 13660 corp: 10/162b lim: 30 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 CrossOver- 00:07:11.683 [2024-11-30 00:05:37.093853] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.683 [2024-11-30 00:05:37.094010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a00002d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.683 [2024-11-30 00:05:37.094034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.683 [2024-11-30 00:05:37.094063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.683 [2024-11-30 00:05:37.094078] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.683 #20 NEW cov: 11866 ft: 13727 corp: 11/177b lim: 30 exec/s: 0 rss: 69Mb L: 15/28 MS: 1 CopyPart- 00:07:11.683 [2024-11-30 00:05:37.143989] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000d2d2 00:07:11.683 [2024-11-30 00:05:37.144059] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000d2d2 00:07:11.683 [2024-11-30 00:05:37.144182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8f02d2 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.683 [2024-11-30 00:05:37.144204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.683 [2024-11-30 00:05:37.144235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d2d202d2 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.683 [2024-11-30 00:05:37.144252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.683 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:11.683 #22 NEW cov: 11883 ft: 13765 corp: 12/189b lim: 30 exec/s: 0 rss: 69Mb L: 12/28 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:11.683 [2024-11-30 00:05:37.194104] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.683 [2024-11-30 00:05:37.194262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.683 [2024-11-30 00:05:37.194286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.683 [2024-11-30 
00:05:37.194316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.683 [2024-11-30 00:05:37.194332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.683 #23 NEW cov: 11883 ft: 13783 corp: 13/203b lim: 30 exec/s: 0 rss: 69Mb L: 14/28 MS: 1 CrossOver- 00:07:11.942 [2024-11-30 00:05:37.244268] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.943 [2024-11-30 00:05:37.244339] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008e8e 00:07:11.943 [2024-11-30 00:05:37.244395] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008e8e 00:07:11.943 [2024-11-30 00:05:37.244449] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (145980) > buf size (4096) 00:07:11.943 [2024-11-30 00:05:37.244550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a00002d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.244571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.244607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8e8e028e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.244623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.244649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8e8e028e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.244664] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.244690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8e8e0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.244706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.943 #24 NEW cov: 11883 ft: 13822 corp: 14/232b lim: 30 exec/s: 24 rss: 69Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:11.943 [2024-11-30 00:05:37.314472] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:11.943 [2024-11-30 00:05:37.314550] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:11.943 [2024-11-30 00:05:37.314671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a00002d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.314694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.314725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.314740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.943 #25 NEW cov: 11883 ft: 13933 corp: 15/247b lim: 30 exec/s: 25 rss: 69Mb L: 15/29 MS: 1 ChangeBinInt- 00:07:11.943 [2024-11-30 00:05:37.364592] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aaaa 00:07:11.943 [2024-11-30 00:05:37.364720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:269402aa 
cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.364745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.943 #29 NEW cov: 11883 ft: 14309 corp: 16/258b lim: 30 exec/s: 29 rss: 69Mb L: 11/29 MS: 4 ShuffleBytes-ChangeByte-InsertByte-InsertRepeatedBytes- 00:07:11.943 [2024-11-30 00:05:37.414751] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:11.943 [2024-11-30 00:05:37.414867] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (49920) > len (4) 00:07:11.943 [2024-11-30 00:05:37.414996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.415019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.415052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.415068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.415097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.415113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.943 #30 NEW cov: 11889 ft: 14584 corp: 17/276b lim: 30 exec/s: 30 rss: 70Mb L: 18/29 MS: 1 InsertByte- 00:07:11.943 [2024-11-30 00:05:37.484902] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 
00:07:11.943 [2024-11-30 00:05:37.485148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.485172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.485202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.485217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.485243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0000003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.485258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.943 [2024-11-30 00:05:37.485308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.943 [2024-11-30 00:05:37.485323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.203 #31 NEW cov: 11889 ft: 14621 corp: 18/304b lim: 30 exec/s: 31 rss: 70Mb L: 28/29 MS: 1 ShuffleBytes- 00:07:12.203 [2024-11-30 00:05:37.555083] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200006969 00:07:12.203 [2024-11-30 00:05:37.555152] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006969 00:07:12.203 [2024-11-30 00:05:37.555207] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006969 00:07:12.203 [2024-11-30 
00:05:37.555261] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006969 00:07:12.203 [2024-11-30 00:05:37.555363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:69690269 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.555383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.555413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:69698169 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.555427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.555454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:69698169 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.555469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.555495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:69698169 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.555509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.203 #32 NEW cov: 11889 ft: 14645 corp: 19/333b lim: 30 exec/s: 32 rss: 70Mb L: 29/29 MS: 1 CrossOver- 00:07:12.203 [2024-11-30 00:05:37.625258] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000d2d2 00:07:12.203 [2024-11-30 00:05:37.625327] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xd2d2 00:07:12.203 [2024-11-30 00:05:37.625434] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8f02d2 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.625455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.625485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.625499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.203 #33 NEW cov: 11889 ft: 14656 corp: 20/349b lim: 30 exec/s: 33 rss: 70Mb L: 16/29 MS: 1 CMP- DE: "\004\000\000\000"- 00:07:12.203 [2024-11-30 00:05:37.685418] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:12.203 [2024-11-30 00:05:37.685544] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:12.203 [2024-11-30 00:05:37.685663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.685688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.685720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.685741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.685770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.685786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.203 #34 NEW cov: 11889 ft: 14743 corp: 21/369b lim: 30 exec/s: 34 rss: 70Mb L: 20/29 MS: 1 InsertRepeatedBytes- 00:07:12.203 [2024-11-30 00:05:37.735579] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:12.203 [2024-11-30 00:05:37.735659] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:12.203 [2024-11-30 00:05:37.735855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.735877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.735909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.735924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.735952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.735967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.203 [2024-11-30 00:05:37.735995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.203 [2024-11-30 00:05:37.736010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.463 #35 NEW cov: 11889 ft: 14769 corp: 22/398b lim: 30 exec/s: 35 rss: 70Mb L: 29/29 MS: 1 CrossOver- 00:07:12.463 [2024-11-30 00:05:37.786248] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:12.463 [2024-11-30 00:05:37.786648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.786676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.786743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000078 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.786760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.786827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.786843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.463 #36 NEW cov: 11889 ft: 14886 corp: 23/416b lim: 30 exec/s: 36 rss: 70Mb L: 18/29 MS: 1 InsertByte- 00:07:12.463 [2024-11-30 00:05:37.826329] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:12.463 [2024-11-30 00:05:37.826625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a0000d5 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.826651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.826789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.826873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.463 #37 NEW cov: 11889 ft: 14997 corp: 24/431b lim: 30 exec/s: 37 rss: 70Mb L: 15/29 MS: 1 ChangeBinInt- 00:07:12.463 [2024-11-30 00:05:37.886514] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (321772) > buf size (4096) 00:07:12.463 [2024-11-30 00:05:37.886724] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:12.463 [2024-11-30 00:05:37.886929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a813a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.886954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.887008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.887023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.887078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.887091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.463 #38 NEW cov: 11889 ft: 15015 corp: 25/451b lim: 30 exec/s: 38 rss: 70Mb L: 20/29 MS: 1 
ChangeByte- 00:07:12.463 [2024-11-30 00:05:37.926650] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:12.463 [2024-11-30 00:05:37.926765] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (148) > len (4) 00:07:12.463 [2024-11-30 00:05:37.926868] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000062fa 00:07:12.463 [2024-11-30 00:05:37.926970] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:12.463 [2024-11-30 00:05:37.927181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.927206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.927261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.927275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.927328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:46ed025f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.927342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.927393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.927406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:12.463 #39 NEW cov: 11889 ft: 15031 corp: 26/479b lim: 30 exec/s: 39 rss: 70Mb L: 28/29 MS: 1 CMP- DE: "\000\224F\355_zb\372"- 00:07:12.463 [2024-11-30 00:05:37.966717] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:12.463 [2024-11-30 00:05:37.967051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.967083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:37.967138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000078 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:37.967152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.463 #40 NEW cov: 11889 ft: 15048 corp: 27/492b lim: 30 exec/s: 40 rss: 70Mb L: 13/29 MS: 1 EraseBytes- 00:07:12.463 [2024-11-30 00:05:38.006902] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:12.463 [2024-11-30 00:05:38.007016] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:12.463 [2024-11-30 00:05:38.007410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:38.007434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:38.007489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3a3a003b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 
00:05:38.007504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:38.007559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:38.007572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.463 [2024-11-30 00:05:38.007625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.463 [2024-11-30 00:05:38.007639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.722 #41 NEW cov: 11889 ft: 15073 corp: 28/521b lim: 30 exec/s: 41 rss: 70Mb L: 29/29 MS: 1 ChangeBit- 00:07:12.722 [2024-11-30 00:05:38.047005] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:12.722 [2024-11-30 00:05:38.047501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.722 [2024-11-30 00:05:38.047526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.722 [2024-11-30 00:05:38.047578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.722 [2024-11-30 00:05:38.047592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.722 [2024-11-30 00:05:38.047651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 
nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.722 [2024-11-30 00:05:38.047664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.722 [2024-11-30 00:05:38.047716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.722 [2024-11-30 00:05:38.047729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.722 #42 NEW cov: 11889 ft: 15084 corp: 29/545b lim: 30 exec/s: 42 rss: 70Mb L: 24/29 MS: 1 CopyPart- 00:07:12.722 [2024-11-30 00:05:38.087086] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:12.722 [2024-11-30 00:05:38.087296] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001e49 00:07:12.723 [2024-11-30 00:05:38.087499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.087524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.723 [2024-11-30 00:05:38.087579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000bc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.087593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.723 [2024-11-30 00:05:38.087655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.087669] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.723 #43 NEW cov: 11889 ft: 15085 corp: 30/568b lim: 30 exec/s: 43 rss: 70Mb L: 23/29 MS: 1 CMP- DE: "\000\000\000\000\002\036IM"- 00:07:12.723 [2024-11-30 00:05:38.127158] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000004d2 00:07:12.723 [2024-11-30 00:05:38.127268] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xd2d2 00:07:12.723 [2024-11-30 00:05:38.127467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8f02d2 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.127491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.723 [2024-11-30 00:05:38.127545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d2000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.127559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.723 #44 NEW cov: 11889 ft: 15117 corp: 31/584b lim: 30 exec/s: 44 rss: 70Mb L: 16/29 MS: 1 ShuffleBytes- 00:07:12.723 [2024-11-30 00:05:38.167303] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59628) > buf size (4096) 00:07:12.723 [2024-11-30 00:05:38.167696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.167720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.723 [2024-11-30 00:05:38.167774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.167789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.723 [2024-11-30 00:05:38.167843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00040000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.167856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.723 #45 NEW cov: 11896 ft: 15146 corp: 32/606b lim: 30 exec/s: 45 rss: 70Mb L: 22/29 MS: 1 PersAutoDict- DE: "\004\000\000\000"- 00:07:12.723 [2024-11-30 00:05:38.207395] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:12.723 [2024-11-30 00:05:38.207505] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:12.723 [2024-11-30 00:05:38.207717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a00002d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.207741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.723 [2024-11-30 00:05:38.207797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.207811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.723 #46 NEW cov: 11896 ft: 15155 corp: 33/621b lim: 30 exec/s: 46 rss: 70Mb L: 15/29 MS: 1 CrossOver- 00:07:12.723 [2024-11-30 00:05:38.247534] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (129260) > buf size (4096) 00:07:12.723 
[2024-11-30 00:05:38.247760] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (49920) > len (4) 00:07:12.723 [2024-11-30 00:05:38.247955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:7e3a003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.247980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.723 [2024-11-30 00:05:38.248037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.248051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.723 [2024-11-30 00:05:38.248104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.723 [2024-11-30 00:05:38.248118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.723 #47 NEW cov: 11896 ft: 15160 corp: 34/639b lim: 30 exec/s: 47 rss: 70Mb L: 18/29 MS: 1 ChangeByte- 00:07:13.036 [2024-11-30 00:05:38.287614] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (59396) > buf size (4096) 00:07:13.036 [2024-11-30 00:05:38.287921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.036 [2024-11-30 00:05:38.287946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.036 [2024-11-30 00:05:38.287999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000bc 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.036 [2024-11-30 00:05:38.288013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.036 #48 NEW cov: 11896 ft: 15175 corp: 35/655b lim: 30 exec/s: 24 rss: 70Mb L: 16/29 MS: 1 InsertByte- 00:07:13.036 #48 DONE cov: 11896 ft: 15175 corp: 35/655b lim: 30 exec/s: 24 rss: 70Mb 00:07:13.036 ###### Recommended dictionary. ###### 00:07:13.036 "\004\000\000\000" # Uses: 1 00:07:13.036 "\000\224F\355_zb\372" # Uses: 0 00:07:13.036 "\000\000\000\000\002\036IM" # Uses: 0 00:07:13.036 ###### End of recommended dictionary. ###### 00:07:13.036 Done 48 runs in 2 second(s) 00:07:13.036 00:05:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:13.036 00:05:38 -- ../common.sh@72 -- # (( i++ )) 00:07:13.036 00:05:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:13.036 00:05:38 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:13.036 00:05:38 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:13.036 00:05:38 -- nvmf/run.sh@24 -- # local timen=1 00:07:13.036 00:05:38 -- nvmf/run.sh@25 -- # local core=0x1 00:07:13.036 00:05:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:13.036 00:05:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:13.036 00:05:38 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:13.036 00:05:38 -- nvmf/run.sh@29 -- # port=4402 00:07:13.036 00:05:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:13.036 00:05:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:13.036 00:05:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:13.036 00:05:38 -- nvmf/run.sh@36 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:13.036 [2024-11-30 00:05:38.474001] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:13.036 [2024-11-30 00:05:38.474088] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2721148 ] 00:07:13.036 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.295 [2024-11-30 00:05:38.739124] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.295 [2024-11-30 00:05:38.820648] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:13.295 [2024-11-30 00:05:38.820768] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.554 [2024-11-30 00:05:38.879089] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:13.554 [2024-11-30 00:05:38.895450] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:13.554 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:13.554 INFO: Seed: 1316093703 00:07:13.554 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:13.554 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:13.554 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:13.554 INFO: A corpus is not provided, starting from an empty corpus 00:07:13.554 #2 INITED exec/s: 0 rss: 60Mb 00:07:13.554 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:13.554 This may also happen if the target rejected all inputs we tried so far 00:07:13.554 [2024-11-30 00:05:38.966047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.554 [2024-11-30 00:05:38.966086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.554 [2024-11-30 00:05:38.966160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.554 [2024-11-30 00:05:38.966178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.554 [2024-11-30 00:05:38.966247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.554 [2024-11-30 00:05:38.966264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.830 NEW_FUNC[1/670]: 0x43db78 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:13.830 NEW_FUNC[2/670]: 0x476e88 in TestOneInput 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:13.830 #4 NEW cov: 11580 ft: 11574 corp: 2/22b lim: 35 exec/s: 0 rss: 68Mb L: 21/21 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:13.830 [2024-11-30 00:05:39.286209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.286247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.830 [2024-11-30 00:05:39.286367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.286384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.830 [2024-11-30 00:05:39.286506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.286522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.830 [2024-11-30 00:05:39.286641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.286660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.830 #10 NEW cov: 11693 ft: 12709 corp: 3/50b lim: 35 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:13.830 [2024-11-30 00:05:39.326050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0700ffff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.326077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.830 [2024-11-30 00:05:39.326191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.326208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.830 [2024-11-30 00:05:39.326325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.326341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.830 #11 NEW cov: 11699 ft: 13056 corp: 4/71b lim: 35 exec/s: 0 rss: 69Mb L: 21/28 MS: 1 CMP- DE: "\377\007"- 00:07:13.830 [2024-11-30 00:05:39.366215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.366240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.830 [2024-11-30 00:05:39.366362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 [2024-11-30 00:05:39.366380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.830 [2024-11-30 00:05:39.366500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00fffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.830 
[2024-11-30 00:05:39.366519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.093 #12 NEW cov: 11784 ft: 13303 corp: 5/92b lim: 35 exec/s: 0 rss: 69Mb L: 21/28 MS: 1 ChangeBit- 00:07:14.093 [2024-11-30 00:05:39.406547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.406573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.406711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.406730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.406822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.406839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.406960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.406978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.093 #13 NEW cov: 11784 ft: 13423 corp: 6/120b lim: 35 exec/s: 0 rss: 69Mb L: 28/28 MS: 1 ShuffleBytes- 00:07:14.093 [2024-11-30 00:05:39.446385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff 
cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.446414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.446526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.446542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.446668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00fffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.446684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.093 #14 NEW cov: 11784 ft: 13507 corp: 7/141b lim: 35 exec/s: 0 rss: 69Mb L: 21/28 MS: 1 PersAutoDict- DE: "\377\007"- 00:07:14.093 [2024-11-30 00:05:39.486834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.486861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.486980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.486995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.487109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac000c cdw11:ac00acac SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.487126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.487233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.487249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.093 #15 NEW cov: 11784 ft: 13551 corp: 8/169b lim: 35 exec/s: 0 rss: 69Mb L: 28/28 MS: 1 ChangeByte- 00:07:14.093 [2024-11-30 00:05:39.536914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.536941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.537059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.537074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.537184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0aac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.537202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.537321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 
00:05:39.537339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.093 #16 NEW cov: 11784 ft: 13610 corp: 9/198b lim: 35 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CrossOver- 00:07:14.093 [2024-11-30 00:05:39.577054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.577080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.577199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:001c00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.093 [2024-11-30 00:05:39.577216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.093 [2024-11-30 00:05:39.577327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.094 [2024-11-30 00:05:39.577346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.094 [2024-11-30 00:05:39.577469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.094 [2024-11-30 00:05:39.577485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.094 #17 NEW cov: 11784 ft: 13689 corp: 10/226b lim: 35 exec/s: 0 rss: 69Mb L: 28/29 MS: 1 ChangeBinInt- 00:07:14.094 [2024-11-30 00:05:39.617142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.094 [2024-11-30 00:05:39.617168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.094 [2024-11-30 00:05:39.617280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.094 [2024-11-30 00:05:39.617297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.094 [2024-11-30 00:05:39.617406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ac0c00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.094 [2024-11-30 00:05:39.617424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.094 [2024-11-30 00:05:39.617542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.094 [2024-11-30 00:05:39.617559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.094 #18 NEW cov: 11784 ft: 13735 corp: 11/256b lim: 35 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 CrossOver- 00:07:14.353 [2024-11-30 00:05:39.656663] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:14.353 [2024-11-30 00:05:39.657295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:0000acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.353 [2024-11-30 00:05:39.657322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.353 [2024-11-30 00:05:39.657440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.353 [2024-11-30 00:05:39.657460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.353 [2024-11-30 00:05:39.657584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac001c cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.353 [2024-11-30 00:05:39.657606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.353 [2024-11-30 00:05:39.657721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.353 [2024-11-30 00:05:39.657738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.353 NEW_FUNC[1/1]: 0x111af98 in nvmf_ctrlr_identify_iocs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3051 00:07:14.353 #19 NEW cov: 11812 ft: 13828 corp: 12/289b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:14.354 [2024-11-30 00:05:39.707262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.707289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.707401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:0700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.707418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.707535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.707551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.354 #20 NEW cov: 11812 ft: 13842 corp: 13/312b lim: 35 exec/s: 0 rss: 69Mb L: 23/33 MS: 1 PersAutoDict- DE: "\377\007"- 00:07:14.354 [2024-11-30 00:05:39.747321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.747348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.747481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.747500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.747621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.747638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.354 #21 NEW cov: 11812 ft: 13875 corp: 14/333b lim: 35 exec/s: 0 rss: 69Mb L: 21/33 MS: 1 CopyPart- 00:07:14.354 [2024-11-30 00:05:39.787728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.787754] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.787888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:001c00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.787923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.788035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.788055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.788173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.788191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.354 #22 NEW cov: 11812 ft: 13898 corp: 15/361b lim: 35 exec/s: 0 rss: 69Mb L: 28/33 MS: 1 ChangeBit- 00:07:14.354 [2024-11-30 00:05:39.827797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ff00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.827824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.827948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.827966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.828075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.828092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.828211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.828230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.354 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:14.354 #23 NEW cov: 11835 ft: 13934 corp: 16/391b lim: 35 exec/s: 0 rss: 69Mb L: 30/33 MS: 1 PersAutoDict- DE: "\377\007"- 00:07:14.354 [2024-11-30 00:05:39.867976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.868003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.868117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.868136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.868254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac000c cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 
[2024-11-30 00:05:39.868272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.868394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.868412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.354 #24 NEW cov: 11835 ft: 13978 corp: 17/419b lim: 35 exec/s: 0 rss: 69Mb L: 28/33 MS: 1 ChangeBit- 00:07:14.354 [2024-11-30 00:05:39.907884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.907913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.908033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.908054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.354 [2024-11-30 00:05:39.908169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.354 [2024-11-30 00:05:39.908186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.614 #25 NEW cov: 11835 ft: 13986 corp: 18/440b lim: 35 exec/s: 25 rss: 70Mb L: 21/33 MS: 1 ChangeByte- 00:07:14.614 [2024-11-30 00:05:39.947953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00bf 
cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:39.947981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:39.948101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:39.948118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:39.948241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:39.948259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.614 #26 NEW cov: 11835 ft: 14014 corp: 19/461b lim: 35 exec/s: 26 rss: 70Mb L: 21/33 MS: 1 ChangeBit- 00:07:14.614 [2024-11-30 00:05:39.988259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:2100ff21 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:39.988286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:39.988407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:21210021 cdw11:ff0021ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:39.988426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:39.988562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:14.614 [2024-11-30 00:05:39.988581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:39.988706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:39.988724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.614 #27 NEW cov: 11835 ft: 14039 corp: 20/489b lim: 35 exec/s: 27 rss: 70Mb L: 28/33 MS: 1 InsertRepeatedBytes- 00:07:14.614 [2024-11-30 00:05:40.028323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.028350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:40.028466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:ff0007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.028485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:40.028604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.028624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.614 #28 NEW cov: 11835 ft: 14098 corp: 21/514b lim: 35 exec/s: 28 rss: 70Mb L: 25/33 MS: 1 PersAutoDict- DE: "\377\007"- 00:07:14.614 [2024-11-30 00:05:40.078423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) 
qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:4100ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.078453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:40.078563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.078581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:40.078693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.078711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.614 #29 NEW cov: 11835 ft: 14125 corp: 22/536b lim: 35 exec/s: 29 rss: 70Mb L: 22/33 MS: 1 InsertByte- 00:07:14.614 [2024-11-30 00:05:40.128615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.128642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:40.128751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff0700ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.128769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.614 [2024-11-30 00:05:40.128881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3fff00ff cdw11:ff00ffff SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.614 [2024-11-30 00:05:40.128900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.614 #30 NEW cov: 11835 ft: 14136 corp: 23/559b lim: 35 exec/s: 30 rss: 70Mb L: 23/33 MS: 1 PersAutoDict- DE: "\377\007"- 00:07:14.874 [2024-11-30 00:05:40.179118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:baba000a cdw11:ba00baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.874 [2024-11-30 00:05:40.179147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.874 [2024-11-30 00:05:40.179267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:baba00ba cdw11:ba00baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.874 [2024-11-30 00:05:40.179284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.874 [2024-11-30 00:05:40.179405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:baba00ba cdw11:ba00baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.874 [2024-11-30 00:05:40.179423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.874 [2024-11-30 00:05:40.179538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:baba00ba cdw11:ba00baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.874 [2024-11-30 00:05:40.179555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.874 [2024-11-30 00:05:40.179664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:baba00ba cdw11:ba00baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:14.874 [2024-11-30 00:05:40.179685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:14.874 #31 NEW cov: 11835 ft: 14233 corp: 24/594b lim: 35 exec/s: 31 rss: 70Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:14.874 [2024-11-30 00:05:40.218853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.874 [2024-11-30 00:05:40.218880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.874 [2024-11-30 00:05:40.218994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffe70007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.874 [2024-11-30 00:05:40.219013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.219122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.219141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.875 #32 NEW cov: 11835 ft: 14269 corp: 25/615b lim: 35 exec/s: 32 rss: 70Mb L: 21/35 MS: 1 ChangeByte- 00:07:14.875 [2024-11-30 00:05:40.259167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.259195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.259312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 
cdw10:acac00ac cdw11:ab00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.259331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.259446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac000c cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.259466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.259578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.259595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.875 #33 NEW cov: 11835 ft: 14297 corp: 26/643b lim: 35 exec/s: 33 rss: 70Mb L: 28/35 MS: 1 ChangeBinInt- 00:07:14.875 [2024-11-30 00:05:40.299095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:2900ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.299122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.299231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.299252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.299363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.299380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.875 #34 NEW cov: 11835 ft: 14318 corp: 27/664b lim: 35 exec/s: 34 rss: 70Mb L: 21/35 MS: 1 ChangeByte- 00:07:14.875 [2024-11-30 00:05:40.339192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00b8 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.339219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.339336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:07ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.339355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.339472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff003f cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.339489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.875 #35 NEW cov: 11835 ft: 14357 corp: 28/686b lim: 35 exec/s: 35 rss: 70Mb L: 22/35 MS: 1 InsertByte- 00:07:14.875 [2024-11-30 00:05:40.389416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:2900ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.389444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.389556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:5 nsid:0 cdw10:ffff0007 cdw11:0000ff15 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.389575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.389697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.389715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.875 #36 NEW cov: 11835 ft: 14369 corp: 29/707b lim: 35 exec/s: 36 rss: 70Mb L: 21/35 MS: 1 ChangeBinInt- 00:07:14.875 [2024-11-30 00:05:40.429712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:d900d9d9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.429739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.875 [2024-11-30 00:05:40.429852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d9d900d9 cdw11:d900d9d9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.875 [2024-11-30 00:05:40.429869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.134 [2024-11-30 00:05:40.429989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:410700ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.134 [2024-11-30 00:05:40.430008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.134 [2024-11-30 00:05:40.430122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.430141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.135 #37 NEW cov: 11835 ft: 14422 corp: 30/739b lim: 35 exec/s: 37 rss: 70Mb L: 32/35 MS: 1 InsertRepeatedBytes- 00:07:15.135 [2024-11-30 00:05:40.479827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:aeac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.479855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.479975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.479996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.480078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0aac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.480095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.480210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.480230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.135 #38 NEW cov: 11835 ft: 14431 corp: 31/768b lim: 35 exec/s: 38 rss: 70Mb L: 29/35 MS: 1 ChangeBit- 00:07:15.135 [2024-11-30 00:05:40.529504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.529532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.529644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.529662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.135 #39 NEW cov: 11835 ft: 14646 corp: 32/787b lim: 35 exec/s: 39 rss: 70Mb L: 19/35 MS: 1 EraseBytes- 00:07:15.135 [2024-11-30 00:05:40.570165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:acac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.570193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.570312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:b300acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.570330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.570450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:acac000c cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.570469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.570588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac 
cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.570609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.135 #40 NEW cov: 11835 ft: 14671 corp: 33/815b lim: 35 exec/s: 40 rss: 70Mb L: 28/35 MS: 1 ChangeBinInt- 00:07:15.135 [2024-11-30 00:05:40.620114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff0029ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.620142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.620257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffe70007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.620274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.620387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.620408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.135 #41 NEW cov: 11835 ft: 14686 corp: 34/836b lim: 35 exec/s: 41 rss: 70Mb L: 21/35 MS: 1 ChangeByte- 00:07:15.135 [2024-11-30 00:05:40.660212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.660238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.660348] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.660365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.135 [2024-11-30 00:05:40.660479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00fffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.135 [2024-11-30 00:05:40.660498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.135 #42 NEW cov: 11835 ft: 14710 corp: 35/857b lim: 35 exec/s: 42 rss: 70Mb L: 21/35 MS: 1 CopyPart- 00:07:15.395 [2024-11-30 00:05:40.700323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.700350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.700461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.700478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.700591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.700610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.395 #43 NEW cov: 11835 ft: 14714 corp: 36/879b lim: 35 exec/s: 43 rss: 70Mb L: 22/35 MS: 1 
InsertByte- 00:07:15.395 [2024-11-30 00:05:40.740685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:aeac000a cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.740710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.740825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.740850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.740966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0aac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.740984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.741095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:acac00ac cdw11:ac00acac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.741113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.395 #44 NEW cov: 11835 ft: 14752 corp: 37/909b lim: 35 exec/s: 44 rss: 70Mb L: 30/35 MS: 1 InsertByte- 00:07:15.395 [2024-11-30 00:05:40.780788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:d90026d9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.780814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 
00:05:40.780927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d9d900d9 cdw11:d900d9d9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.780943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.781067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:410700ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.781084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.781195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.781214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.395 #45 NEW cov: 11835 ft: 14771 corp: 38/941b lim: 35 exec/s: 45 rss: 70Mb L: 32/35 MS: 1 ChangeByte- 00:07:15.395 [2024-11-30 00:05:40.820653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fff700ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.820678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.820784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.820801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.820915] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.820932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.395 #46 NEW cov: 11835 ft: 14775 corp: 39/962b lim: 35 exec/s: 46 rss: 70Mb L: 21/35 MS: 1 ChangeBit- 00:07:15.395 [2024-11-30 00:05:40.860540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.860566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.860681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff0700ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.860699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.395 #47 NEW cov: 11835 ft: 14783 corp: 40/977b lim: 35 exec/s: 47 rss: 70Mb L: 15/35 MS: 1 CrossOver- 00:07:15.395 [2024-11-30 00:05:40.901045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:07ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.901071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.901188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0007 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.901206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:15.395 [2024-11-30 00:05:40.901315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.901335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.395 #48 NEW cov: 11835 ft: 14790 corp: 41/1000b lim: 35 exec/s: 48 rss: 70Mb L: 23/35 MS: 1 PersAutoDict- DE: "\377\007"- 00:07:15.395 [2024-11-30 00:05:40.941070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.941097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.941218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff0700ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.941235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.395 [2024-11-30 00:05:40.941354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3fff00ff cdw11:1700ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.395 [2024-11-30 00:05:40.941372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.654 #49 NEW cov: 11835 ft: 14822 corp: 42/1023b lim: 35 exec/s: 24 rss: 70Mb L: 23/35 MS: 1 ChangeBinInt- 00:07:15.654 #49 DONE cov: 11835 ft: 14822 corp: 42/1023b lim: 35 exec/s: 24 rss: 70Mb 00:07:15.654 ###### Recommended dictionary. ###### 00:07:15.654 "\377\007" # Uses: 6 00:07:15.654 ###### End of recommended dictionary. 
###### 00:07:15.654 Done 49 runs in 2 second(s) 00:07:15.654 00:05:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:15.654 00:05:41 -- ../common.sh@72 -- # (( i++ )) 00:07:15.654 00:05:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:15.654 00:05:41 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:15.654 00:05:41 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:15.654 00:05:41 -- nvmf/run.sh@24 -- # local timen=1 00:07:15.654 00:05:41 -- nvmf/run.sh@25 -- # local core=0x1 00:07:15.654 00:05:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:15.654 00:05:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:15.654 00:05:41 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:15.654 00:05:41 -- nvmf/run.sh@29 -- # port=4403 00:07:15.654 00:05:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:15.654 00:05:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:15.654 00:05:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:15.654 00:05:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:15.654 [2024-11-30 00:05:41.135729] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:15.654 [2024-11-30 00:05:41.135793] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2721695 ] 00:07:15.654 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.913 [2024-11-30 00:05:41.387088] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.173 [2024-11-30 00:05:41.476997] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:16.173 [2024-11-30 00:05:41.477139] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.173 [2024-11-30 00:05:41.535169] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:16.173 [2024-11-30 00:05:41.551525] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:16.173 INFO: Running with entropic power schedule (0xFF, 100). 00:07:16.173 INFO: Seed: 3974121522 00:07:16.173 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:16.173 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:16.173 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:16.173 INFO: A corpus is not provided, starting from an empty corpus 00:07:16.173 #2 INITED exec/s: 0 rss: 60Mb 00:07:16.173 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:16.173 This may also happen if the target rejected all inputs we tried so far 00:07:16.433 NEW_FUNC[1/659]: 0x43f858 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:16.433 NEW_FUNC[2/659]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:16.433 #5 NEW cov: 11493 ft: 11491 corp: 2/15b lim: 20 exec/s: 0 rss: 68Mb L: 14/14 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:07:16.433 #12 NEW cov: 11623 ft: 12251 corp: 3/33b lim: 20 exec/s: 0 rss: 68Mb L: 18/18 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:16.692 #13 NEW cov: 11629 ft: 12579 corp: 4/52b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 InsertByte- 00:07:16.692 #14 NEW cov: 11715 ft: 13099 corp: 5/61b lim: 20 exec/s: 0 rss: 68Mb L: 9/19 MS: 1 CMP- DE: "|\026\016H\237\177\000\000"- 00:07:16.692 #15 NEW cov: 11715 ft: 13216 corp: 6/70b lim: 20 exec/s: 0 rss: 68Mb L: 9/19 MS: 1 ChangeByte- 00:07:16.692 #16 NEW cov: 11715 ft: 13313 corp: 7/80b lim: 20 exec/s: 0 rss: 68Mb L: 10/19 MS: 1 InsertByte- 00:07:16.692 #17 NEW cov: 11715 ft: 13454 corp: 8/100b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertByte- 00:07:16.692 [2024-11-30 00:05:42.168633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:16.692 [2024-11-30 00:05:42.168671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.692 NEW_FUNC[1/17]: 0x111e188 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:16.692 NEW_FUNC[2/17]: 0x111ed08 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:16.692 #23 NEW cov: 11959 ft: 13848 corp: 9/120b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:16.692 
#24 NEW cov: 11959 ft: 13898 corp: 10/138b lim: 20 exec/s: 0 rss: 68Mb L: 18/20 MS: 1 ChangeByte- 00:07:16.953 #25 NEW cov: 11959 ft: 14009 corp: 11/158b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:16.953 #26 NEW cov: 11959 ft: 14051 corp: 12/178b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:16.953 #27 NEW cov: 11959 ft: 14119 corp: 13/198b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:16.953 #28 NEW cov: 11959 ft: 14218 corp: 14/217b lim: 20 exec/s: 0 rss: 69Mb L: 19/20 MS: 1 ChangeByte- 00:07:16.953 #29 NEW cov: 11959 ft: 14257 corp: 15/235b lim: 20 exec/s: 0 rss: 69Mb L: 18/20 MS: 1 ChangeByte- 00:07:16.953 #30 NEW cov: 11959 ft: 14272 corp: 16/250b lim: 20 exec/s: 0 rss: 69Mb L: 15/20 MS: 1 CrossOver- 00:07:17.215 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:17.215 #31 NEW cov: 11982 ft: 14313 corp: 17/270b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:17.215 #32 NEW cov: 11982 ft: 14393 corp: 18/283b lim: 20 exec/s: 0 rss: 69Mb L: 13/20 MS: 1 EraseBytes- 00:07:17.215 [2024-11-30 00:05:42.589884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.215 [2024-11-30 00:05:42.589915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.215 #33 NEW cov: 11982 ft: 14415 corp: 19/303b lim: 20 exec/s: 33 rss: 69Mb L: 20/20 MS: 1 ChangeByte- 00:07:17.215 #34 NEW cov: 11982 ft: 14424 corp: 20/323b lim: 20 exec/s: 34 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:07:17.215 #35 NEW cov: 11982 ft: 14436 corp: 21/333b lim: 20 exec/s: 35 rss: 69Mb L: 10/20 MS: 1 ChangeBinInt- 00:07:17.215 #36 NEW cov: 11982 ft: 14476 corp: 22/353b lim: 20 exec/s: 36 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:17.475 #37 NEW cov: 11982 ft: 14482 corp: 23/373b lim: 20 exec/s: 37 rss: 69Mb L: 20/20 
MS: 1 ChangeByte- 00:07:17.475 #38 NEW cov: 11982 ft: 14513 corp: 24/393b lim: 20 exec/s: 38 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:17.475 #39 NEW cov: 11982 ft: 14540 corp: 25/401b lim: 20 exec/s: 39 rss: 69Mb L: 8/20 MS: 1 EraseBytes- 00:07:17.475 #40 NEW cov: 11982 ft: 14638 corp: 26/421b lim: 20 exec/s: 40 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:17.475 #41 NEW cov: 11982 ft: 14640 corp: 27/429b lim: 20 exec/s: 41 rss: 70Mb L: 8/20 MS: 1 ChangeBit- 00:07:17.475 #42 NEW cov: 11982 ft: 14655 corp: 28/445b lim: 20 exec/s: 42 rss: 70Mb L: 16/20 MS: 1 InsertByte- 00:07:17.475 #43 NEW cov: 11982 ft: 14667 corp: 29/465b lim: 20 exec/s: 43 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:17.734 #44 NEW cov: 11982 ft: 14682 corp: 30/473b lim: 20 exec/s: 44 rss: 70Mb L: 8/20 MS: 1 ShuffleBytes- 00:07:17.734 #45 NEW cov: 11982 ft: 14691 corp: 31/483b lim: 20 exec/s: 45 rss: 70Mb L: 10/20 MS: 1 CrossOver- 00:07:17.734 #46 NEW cov: 11982 ft: 14721 corp: 32/493b lim: 20 exec/s: 46 rss: 70Mb L: 10/20 MS: 1 CrossOver- 00:07:17.734 #47 NEW cov: 11982 ft: 15006 corp: 33/500b lim: 20 exec/s: 47 rss: 70Mb L: 7/20 MS: 1 CrossOver- 00:07:17.734 #48 NEW cov: 11982 ft: 15035 corp: 34/519b lim: 20 exec/s: 48 rss: 70Mb L: 19/20 MS: 1 ShuffleBytes- 00:07:17.734 #49 NEW cov: 11982 ft: 15043 corp: 35/539b lim: 20 exec/s: 49 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:17.993 #50 NEW cov: 11982 ft: 15053 corp: 36/549b lim: 20 exec/s: 50 rss: 70Mb L: 10/20 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:17.993 #51 NEW cov: 11982 ft: 15097 corp: 37/565b lim: 20 exec/s: 51 rss: 70Mb L: 16/20 MS: 1 EraseBytes- 00:07:17.993 #52 NEW cov: 11982 ft: 15109 corp: 38/583b lim: 20 exec/s: 52 rss: 70Mb L: 18/20 MS: 1 PersAutoDict- DE: "|\026\016H\237\177\000\000"- 00:07:17.993 #53 NEW cov: 11982 ft: 15124 corp: 39/603b lim: 20 exec/s: 53 rss: 70Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:17.993 #54 NEW cov: 11982 ft: 15131 corp: 40/616b lim: 20 exec/s: 54 rss: 70Mb L: 13/20 MS: 1 CrossOver- 00:07:17.993 #55 NEW cov: 11982 
ft: 15133 corp: 41/636b lim: 20 exec/s: 55 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:17.993 [2024-11-30 00:05:43.522360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.993 [2024-11-30 00:05:43.522392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.253 #56 NEW cov: 11982 ft: 15155 corp: 42/655b lim: 20 exec/s: 56 rss: 70Mb L: 19/20 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:18.253 #57 NEW cov: 11982 ft: 15161 corp: 43/671b lim: 20 exec/s: 28 rss: 70Mb L: 16/20 MS: 1 CrossOver- 00:07:18.253 #57 DONE cov: 11982 ft: 15161 corp: 43/671b lim: 20 exec/s: 28 rss: 70Mb 00:07:18.253 ###### Recommended dictionary. ###### 00:07:18.253 "|\026\016H\237\177\000\000" # Uses: 1 00:07:18.253 "\000\000\000\000" # Uses: 1 00:07:18.253 ###### End of recommended dictionary. ###### 00:07:18.253 Done 57 runs in 2 second(s) 00:07:18.253 00:05:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:18.253 00:05:43 -- ../common.sh@72 -- # (( i++ )) 00:07:18.253 00:05:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:18.253 00:05:43 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:18.253 00:05:43 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:18.253 00:05:43 -- nvmf/run.sh@24 -- # local timen=1 00:07:18.253 00:05:43 -- nvmf/run.sh@25 -- # local core=0x1 00:07:18.253 00:05:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:18.253 00:05:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:18.253 00:05:43 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:18.253 00:05:43 -- nvmf/run.sh@29 -- # port=4404 00:07:18.253 00:05:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:18.253 00:05:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 
subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:18.253 00:05:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:18.253 00:05:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:18.253 [2024-11-30 00:05:43.757866] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:18.253 [2024-11-30 00:05:43.757937] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2722235 ] 00:07:18.253 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.512 [2024-11-30 00:05:44.005793] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.770 [2024-11-30 00:05:44.092551] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:18.770 [2024-11-30 00:05:44.092701] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.770 [2024-11-30 00:05:44.150768] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:18.770 [2024-11-30 00:05:44.167147] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:18.770 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:18.770 INFO: Seed: 2293144624 00:07:18.770 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:18.770 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:18.770 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:18.770 INFO: A corpus is not provided, starting from an empty corpus 00:07:18.770 #2 INITED exec/s: 0 rss: 61Mb 00:07:18.770 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:18.770 This may also happen if the target rejected all inputs we tried so far 00:07:18.770 [2024-11-30 00:05:44.216558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.770 [2024-11-30 00:05:44.216589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.770 [2024-11-30 00:05:44.216662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.770 [2024-11-30 00:05:44.216681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.770 [2024-11-30 00:05:44.216746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.770 [2024-11-30 00:05:44.216765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.770 [2024-11-30 00:05:44.216831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.770 [2024-11-30 00:05:44.216850] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.028 NEW_FUNC[1/670]: 0x440958 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:19.029 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:19.029 #9 NEW cov: 11600 ft: 11602 corp: 2/34b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:19.029 [2024-11-30 00:05:44.537249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.029 [2024-11-30 00:05:44.537284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.029 [2024-11-30 00:05:44.537348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.029 [2024-11-30 00:05:44.537378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.029 [2024-11-30 00:05:44.537443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.029 [2024-11-30 00:05:44.537461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.029 [2024-11-30 00:05:44.537524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.029 [2024-11-30 00:05:44.537540] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.029 NEW_FUNC[1/1]: 0xedf2f8 in rte_get_tsc_cycles /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/include/rte_cycles.h:61 00:07:19.029 #10 NEW cov: 11714 ft: 12031 corp: 3/67b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBit- 00:07:19.289 [2024-11-30 00:05:44.587300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.587329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.587394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.587413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.587479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.587498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.587560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.587576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.289 #21 NEW cov: 11720 ft: 12350 corp: 4/100b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 CopyPart- 00:07:19.289 [2024-11-30 00:05:44.627377] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.627407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.627473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.627492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.627554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.627571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.627637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.627653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.289 #22 NEW cov: 11805 ft: 12517 corp: 5/133b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:19.289 [2024-11-30 00:05:44.667550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.667577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.667648] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.667667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.667731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.667748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.667811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.667827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.289 #23 NEW cov: 11805 ft: 12684 corp: 6/166b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 CrossOver- 00:07:19.289 [2024-11-30 00:05:44.707491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.707518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.707584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.707607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.707671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.707691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.289 #24 NEW cov: 11805 ft: 13025 corp: 7/193b lim: 35 exec/s: 0 rss: 69Mb L: 27/33 MS: 1 EraseBytes- 00:07:19.289 [2024-11-30 00:05:44.747726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.747754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.747820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fff70003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.747840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.747902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.747919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.747984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.748000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.289 #25 NEW cov: 11805 ft: 13079 corp: 8/226b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 ChangeBit- 00:07:19.289 
[2024-11-30 00:05:44.788034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.788063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.788130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.788151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.788214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.788234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.788299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.788317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.788381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.788399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.289 #26 NEW cov: 11805 ft: 13169 corp: 9/261b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:07:19.289 [2024-11-30 00:05:44.827936] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.827964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.828030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fff70003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.828049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.828104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.828122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.289 [2024-11-30 00:05:44.828185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.289 [2024-11-30 00:05:44.828200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.568 #32 NEW cov: 11805 ft: 13195 corp: 10/294b lim: 35 exec/s: 0 rss: 69Mb L: 33/35 MS: 1 CopyPart- 00:07:19.568 [2024-11-30 00:05:44.868077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.568 [2024-11-30 00:05:44.868104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.568 [2024-11-30 00:05:44.868166] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.568 [2024-11-30 00:05:44.868185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.568 [2024-11-30 00:05:44.868237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.568 [2024-11-30 00:05:44.868260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.568 [2024-11-30 00:05:44.868321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.568 [2024-11-30 00:05:44.868337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.568 #33 NEW cov: 11805 ft: 13262 corp: 11/327b lim: 35 exec/s: 0 rss: 69Mb L: 33/35 MS: 1 CrossOver- 00:07:19.568 [2024-11-30 00:05:44.908174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.568 [2024-11-30 00:05:44.908201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:44.908265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:44.908284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:44.908347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:44.908364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:44.908420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:73380000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:44.908436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.569 #34 NEW cov: 11805 ft: 13317 corp: 12/360b lim: 35 exec/s: 0 rss: 69Mb L: 33/35 MS: 1 CMP- DE: "s8\016\344\004\177\000\000"- 00:07:19.569 [2024-11-30 00:05:44.948147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:44.948174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:44.948239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:44.948257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:44.948318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:44.948335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.569 #35 NEW cov: 11805 ft: 13404 corp: 13/387b lim: 35 exec/s: 0 rss: 69Mb L: 27/35 MS: 1 
CrossOver- 00:07:19.569 [2024-11-30 00:05:44.988140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:44.988166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:44.988233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:44.988252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.569 #36 NEW cov: 11805 ft: 13730 corp: 14/404b lim: 35 exec/s: 0 rss: 69Mb L: 17/35 MS: 1 EraseBytes- 00:07:19.569 [2024-11-30 00:05:45.028518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:45.028547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:45.028617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:45.028637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:45.028702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:45.028722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 
00:05:45.028786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:45.028806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.569 #37 NEW cov: 11805 ft: 13752 corp: 15/437b lim: 35 exec/s: 0 rss: 69Mb L: 33/35 MS: 1 ChangeByte- 00:07:19.569 [2024-11-30 00:05:45.068619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:45.068647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:45.068714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:45.068734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:45.068798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:45.068818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.569 [2024-11-30 00:05:45.068884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.569 [2024-11-30 00:05:45.068900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.569 #38 NEW cov: 11805 ft: 13758 corp: 16/470b lim: 35 
exec/s: 0 rss: 69Mb L: 33/35 MS: 1 CopyPart- 00:07:19.886 [2024-11-30 00:05:45.108761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.108789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.108856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.108875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.108940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.108957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.109023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:73310000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.109043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.886 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:19.886 #39 NEW cov: 11828 ft: 13796 corp: 17/503b lim: 35 exec/s: 0 rss: 69Mb L: 33/35 MS: 1 ChangeASCIIInt- 00:07:19.886 [2024-11-30 00:05:45.158914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 
00:05:45.158942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.159007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.159026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.159089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.159105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.159167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:73310000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.159182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.886 #40 NEW cov: 11828 ft: 13831 corp: 18/537b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 InsertByte- 00:07:19.886 [2024-11-30 00:05:45.198874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.198901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.198967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.198985] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.199048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.199065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.886 #44 NEW cov: 11828 ft: 13852 corp: 19/562b lim: 35 exec/s: 44 rss: 69Mb L: 25/35 MS: 4 CopyPart-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:07:19.886 [2024-11-30 00:05:45.238980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21fd cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.239007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.239072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.239091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.239155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.239171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.886 #45 NEW cov: 11828 ft: 13864 corp: 20/589b lim: 35 exec/s: 45 rss: 69Mb L: 27/35 MS: 1 ChangeBit- 00:07:19.886 [2024-11-30 00:05:45.279253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 
cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.279280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.279343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.279362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.279425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.279441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.279504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:73380000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.279520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.886 #46 NEW cov: 11828 ft: 13879 corp: 21/622b lim: 35 exec/s: 46 rss: 69Mb L: 33/35 MS: 1 ShuffleBytes- 00:07:19.886 [2024-11-30 00:05:45.319308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.319334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.319401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:7338ffff cdw11:0ee40000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.319422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.319486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00ff7f00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.319506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.319569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.319585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.886 #47 NEW cov: 11828 ft: 13928 corp: 22/655b lim: 35 exec/s: 47 rss: 69Mb L: 33/35 MS: 1 PersAutoDict- DE: "s8\016\344\004\177\000\000"- 00:07:19.886 [2024-11-30 00:05:45.359444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffbf0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.359470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.359535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.359554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.359622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.359640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.359705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:73310000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.359721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.886 #48 NEW cov: 11828 ft: 13953 corp: 23/688b lim: 35 exec/s: 48 rss: 69Mb L: 33/35 MS: 1 ChangeBit- 00:07:19.886 [2024-11-30 00:05:45.399570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.399601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.399666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:7338ffff cdw11:0ee40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.399686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.399748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00ff7f00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.886 [2024-11-30 00:05:45.399765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.886 [2024-11-30 00:05:45.399830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:19.886 [2024-11-30 00:05:45.399846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.152 #49 NEW cov: 11828 ft: 14005 corp: 24/721b lim: 35 exec/s: 49 rss: 69Mb L: 33/35 MS: 1 ChangeBinInt- 00:07:20.152 [2024-11-30 00:05:45.439736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.439763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.439829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.439849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.439913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.439930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.439993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.440009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.152 #50 NEW cov: 11828 ft: 14007 corp: 25/754b lim: 35 exec/s: 50 rss: 69Mb L: 33/35 MS: 1 ChangeBit- 00:07:20.152 [2024-11-30 00:05:45.479936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 
cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.479963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.480025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.480044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.480110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.480126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.480189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff730000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.480205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.480267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:e4040e2f cdw11:7f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.480282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:20.152 #51 NEW cov: 11828 ft: 14014 corp: 26/789b lim: 35 exec/s: 51 rss: 70Mb L: 35/35 MS: 1 InsertByte- 00:07:20.152 [2024-11-30 00:05:45.519453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff 
cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.519479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.152 #52 NEW cov: 11828 ft: 14762 corp: 27/801b lim: 35 exec/s: 52 rss: 70Mb L: 12/35 MS: 1 CrossOver- 00:07:20.152 [2024-11-30 00:05:45.560016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.560043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.560107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fffffeff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.560126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.560177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.560196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.152 [2024-11-30 00:05:45.560258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.560274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.152 #53 NEW cov: 11828 ft: 14798 corp: 28/834b lim: 35 exec/s: 53 rss: 70Mb L: 33/35 MS: 1 ChangeBinInt- 00:07:20.152 [2024-11-30 00:05:45.600199] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.152 [2024-11-30 00:05:45.600226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.153 [2024-11-30 00:05:45.600291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.600310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.153 [2024-11-30 00:05:45.600372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:fffff7ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.600392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.153 [2024-11-30 00:05:45.600458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.600474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.153 #54 NEW cov: 11828 ft: 14807 corp: 29/867b lim: 35 exec/s: 54 rss: 70Mb L: 33/35 MS: 1 ShuffleBytes- 00:07:20.153 [2024-11-30 00:05:45.640380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.640407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.153 [2024-11-30 00:05:45.640471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.640490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.153 [2024-11-30 00:05:45.640552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.640569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.153 [2024-11-30 00:05:45.640637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff730000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.640654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.153 [2024-11-30 00:05:45.640716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:e4040e2f cdw11:7f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.640732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:20.153 #55 NEW cov: 11828 ft: 14826 corp: 30/902b lim: 35 exec/s: 55 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:20.153 [2024-11-30 00:05:45.679865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002100 cdw11:08ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.153 [2024-11-30 00:05:45.679892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.431 #56 NEW cov: 11828 ft: 14867 corp: 31/914b lim: 35 exec/s: 56 rss: 70Mb L: 12/35 MS: 1 ChangeBinInt- 00:07:20.431 
[2024-11-30 00:05:45.720030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ff120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.720056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.431 #57 NEW cov: 11828 ft: 14883 corp: 32/926b lim: 35 exec/s: 57 rss: 70Mb L: 12/35 MS: 1 CMP- DE: "\022\000\000\000"- 00:07:20.431 [2024-11-30 00:05:45.760442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21fd cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.760469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.760532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fdffff21 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.760552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.760618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.760643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.431 #58 NEW cov: 11828 ft: 14907 corp: 33/953b lim: 35 exec/s: 58 rss: 70Mb L: 27/35 MS: 1 CopyPart- 00:07:20.431 [2024-11-30 00:05:45.800725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.800752] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.800817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff73ffff cdw11:380e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.800835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.800900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000047f cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.800921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.800984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.801002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.431 #64 NEW cov: 11828 ft: 14920 corp: 34/987b lim: 35 exec/s: 64 rss: 70Mb L: 34/35 MS: 1 CrossOver- 00:07:20.431 [2024-11-30 00:05:45.840495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.840522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.840588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff210003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.840612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.431 #65 NEW cov: 11828 ft: 14942 corp: 35/1006b lim: 35 exec/s: 65 rss: 70Mb L: 19/35 MS: 1 CrossOver- 00:07:20.431 [2024-11-30 00:05:45.881075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.881103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.881168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff73ffff cdw11:380e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.881188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.881253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7f00e404 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.881272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.881335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.881352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.881416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffe90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.881435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:07:20.431 #66 NEW cov: 11828 ft: 14960 corp: 36/1041b lim: 35 exec/s: 66 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:20.431 [2024-11-30 00:05:45.920620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:fffd21ff cdw11:ff120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.920647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.431 #67 NEW cov: 11828 ft: 14969 corp: 37/1053b lim: 35 exec/s: 67 rss: 70Mb L: 12/35 MS: 1 ChangeBit- 00:07:20.431 [2024-11-30 00:05:45.961037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.961063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.961128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.961146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.431 [2024-11-30 00:05:45.961210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:21ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.431 [2024-11-30 00:05:45.961226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.690 #68 NEW cov: 11828 ft: 15015 corp: 38/1074b lim: 35 exec/s: 68 rss: 70Mb L: 21/35 MS: 1 CrossOver- 00:07:20.690 [2024-11-30 00:05:46.001322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffff0003 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:20.690 [2024-11-30 00:05:46.001348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.690 [2024-11-30 00:05:46.001414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.690 [2024-11-30 00:05:46.001434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.690 [2024-11-30 00:05:46.001498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.690 [2024-11-30 00:05:46.001515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.690 [2024-11-30 00:05:46.001578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:73c80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.690 [2024-11-30 00:05:46.001594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.690 #69 NEW cov: 11828 ft: 15023 corp: 39/1107b lim: 35 exec/s: 69 rss: 70Mb L: 33/35 MS: 1 ChangeBinInt- 00:07:20.690 [2024-11-30 00:05:46.040976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:fff70003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.690 [2024-11-30 00:05:46.041004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.690 #71 NEW cov: 11828 ft: 15035 corp: 40/1120b lim: 35 exec/s: 71 rss: 70Mb L: 13/35 MS: 2 CrossOver-CrossOver- 00:07:20.690 [2024-11-30 00:05:46.081246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.690 [2024-11-30 00:05:46.081273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.081341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffb90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.081362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.691 #72 NEW cov: 11828 ft: 15074 corp: 41/1134b lim: 35 exec/s: 72 rss: 70Mb L: 14/35 MS: 1 EraseBytes- 00:07:20.691 [2024-11-30 00:05:46.121630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.121656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.121720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fffffeff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.121739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.121804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.121821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.121883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 
cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.121899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.691 #73 NEW cov: 11828 ft: 15081 corp: 42/1167b lim: 35 exec/s: 73 rss: 70Mb L: 33/35 MS: 1 ShuffleBytes- 00:07:20.691 [2024-11-30 00:05:46.161914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff21ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.161940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.162005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fff70003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.162041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.162104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.162125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.162187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.162203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.162267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff 
cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.162284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:20.691 #74 NEW cov: 11828 ft: 15087 corp: 43/1202b lim: 35 exec/s: 74 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:07:20.691 [2024-11-30 00:05:46.201770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff210a cdw11:ffbf0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.201797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.201865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.201886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.691 [2024-11-30 00:05:46.201947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.691 [2024-11-30 00:05:46.201963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.691 #75 NEW cov: 11828 ft: 15105 corp: 44/1225b lim: 35 exec/s: 37 rss: 70Mb L: 23/35 MS: 1 EraseBytes- 00:07:20.691 #75 DONE cov: 11828 ft: 15105 corp: 44/1225b lim: 35 exec/s: 37 rss: 70Mb 00:07:20.691 ###### Recommended dictionary. ###### 00:07:20.691 "s8\016\344\004\177\000\000" # Uses: 1 00:07:20.691 "\022\000\000\000" # Uses: 0 00:07:20.691 ###### End of recommended dictionary. 
###### 00:07:20.691 Done 75 runs in 2 second(s) 00:07:20.950 00:05:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:20.950 00:05:46 -- ../common.sh@72 -- # (( i++ )) 00:07:20.950 00:05:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:20.950 00:05:46 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:20.950 00:05:46 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:20.950 00:05:46 -- nvmf/run.sh@24 -- # local timen=1 00:07:20.950 00:05:46 -- nvmf/run.sh@25 -- # local core=0x1 00:07:20.950 00:05:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:20.950 00:05:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:20.950 00:05:46 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:20.950 00:05:46 -- nvmf/run.sh@29 -- # port=4405 00:07:20.950 00:05:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:20.950 00:05:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:20.950 00:05:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:20.950 00:05:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:20.950 [2024-11-30 00:05:46.390306] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:20.950 [2024-11-30 00:05:46.390384] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2722563 ] 00:07:20.950 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.209 [2024-11-30 00:05:46.639521] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.209 [2024-11-30 00:05:46.720886] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:21.209 [2024-11-30 00:05:46.721009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.469 [2024-11-30 00:05:46.779370] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:21.469 [2024-11-30 00:05:46.795751] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:21.469 INFO: Running with entropic power schedule (0xFF, 100). 00:07:21.469 INFO: Seed: 629164961 00:07:21.469 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:21.469 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:21.469 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:21.469 INFO: A corpus is not provided, starting from an empty corpus 00:07:21.469 #2 INITED exec/s: 0 rss: 60Mb 00:07:21.469 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:21.469 This may also happen if the target rejected all inputs we tried so far 00:07:21.469 [2024-11-30 00:05:46.851169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.469 [2024-11-30 00:05:46.851200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.469 [2024-11-30 00:05:46.851267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.469 [2024-11-30 00:05:46.851286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.469 [2024-11-30 00:05:46.851349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.469 [2024-11-30 00:05:46.851366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.729 NEW_FUNC[1/670]: 0x442af8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:21.729 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:21.729 #20 NEW cov: 11586 ft: 11610 corp: 2/36b lim: 45 exec/s: 0 rss: 68Mb L: 35/35 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:21.729 [2024-11-30 00:05:47.152057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.152092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.729 [2024-11-30 00:05:47.152159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.152179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.729 [2024-11-30 00:05:47.152246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.152267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.729 NEW_FUNC[1/1]: 0x4823e8 in malloc_completion_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/bdev/malloc/bdev_malloc.c:849 00:07:21.729 #21 NEW cov: 11725 ft: 12065 corp: 3/71b lim: 45 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:21.729 [2024-11-30 00:05:47.202240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.202267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.729 [2024-11-30 00:05:47.202336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.202356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.729 [2024-11-30 00:05:47.202420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:21.729 [2024-11-30 00:05:47.202436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.729 [2024-11-30 00:05:47.202499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.202514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.729 #22 NEW cov: 11731 ft: 12630 corp: 4/110b lim: 45 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 CMP- DE: "\000\002\000\000"- 00:07:21.729 [2024-11-30 00:05:47.242032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.242063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.729 [2024-11-30 00:05:47.242131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.242152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.729 #23 NEW cov: 11816 ft: 13326 corp: 5/135b lim: 45 exec/s: 0 rss: 69Mb L: 25/39 MS: 1 EraseBytes- 00:07:21.729 [2024-11-30 00:05:47.282321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.282350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.729 [2024-11-30 00:05:47.282419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.282441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.729 [2024-11-30 00:05:47.282508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.729 [2024-11-30 00:05:47.282524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.989 #24 NEW cov: 11816 ft: 13384 corp: 6/170b lim: 45 exec/s: 0 rss: 69Mb L: 35/39 MS: 1 CrossOver- 00:07:21.989 [2024-11-30 00:05:47.322593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.989 [2024-11-30 00:05:47.322624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.989 [2024-11-30 00:05:47.322693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.989 [2024-11-30 00:05:47.322714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.989 [2024-11-30 00:05:47.322779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff270000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.989 [2024-11-30 00:05:47.322795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.989 [2024-11-30 00:05:47.322859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 
cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.989 [2024-11-30 00:05:47.322875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.989 #25 NEW cov: 11816 ft: 13576 corp: 7/209b lim: 45 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:21.989 [2024-11-30 00:05:47.362708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.989 [2024-11-30 00:05:47.362735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.989 [2024-11-30 00:05:47.362802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.989 [2024-11-30 00:05:47.362827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.989 [2024-11-30 00:05:47.362893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.989 [2024-11-30 00:05:47.362909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.989 [2024-11-30 00:05:47.362974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.989 [2024-11-30 00:05:47.362990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.989 #26 NEW cov: 11816 ft: 13700 corp: 8/251b lim: 45 exec/s: 0 rss: 69Mb L: 42/42 MS: 1 CrossOver- 00:07:21.989 [2024-11-30 00:05:47.412862] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.412888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.990 [2024-11-30 00:05:47.412957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.412978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.990 [2024-11-30 00:05:47.413043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff270000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.413063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.990 [2024-11-30 00:05:47.413131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.413150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.990 #32 NEW cov: 11816 ft: 13746 corp: 9/290b lim: 45 exec/s: 0 rss: 69Mb L: 39/42 MS: 1 ShuffleBytes- 00:07:21.990 [2024-11-30 00:05:47.452526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff27ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.452552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.990 #35 NEW cov: 11816 ft: 14515 corp: 10/299b lim: 45 exec/s: 0 rss: 69Mb L: 9/42 
MS: 3 ShuffleBytes-ChangeByte-CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:21.990 [2024-11-30 00:05:47.492959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.492986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.990 [2024-11-30 00:05:47.493054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.493073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.990 [2024-11-30 00:05:47.493138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.493170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.990 #36 NEW cov: 11816 ft: 14542 corp: 11/334b lim: 45 exec/s: 0 rss: 69Mb L: 35/42 MS: 1 ChangeBit- 00:07:21.990 [2024-11-30 00:05:47.533098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:96964a96 cdw11:96960004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.533124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.990 [2024-11-30 00:05:47.533191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:96969696 cdw11:96960004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.533211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.990 [2024-11-30 00:05:47.533276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:96969696 cdw11:96960004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.990 [2024-11-30 00:05:47.533293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.249 #39 NEW cov: 11816 ft: 14571 corp: 12/367b lim: 45 exec/s: 0 rss: 69Mb L: 33/42 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:07:22.249 [2024-11-30 00:05:47.573028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.249 [2024-11-30 00:05:47.573055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.249 [2024-11-30 00:05:47.573123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffff19ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.249 [2024-11-30 00:05:47.573142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.249 #40 NEW cov: 11816 ft: 14618 corp: 13/392b lim: 45 exec/s: 0 rss: 69Mb L: 25/42 MS: 1 ChangeBinInt- 00:07:22.249 [2024-11-30 00:05:47.612957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.249 [2024-11-30 00:05:47.612984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.249 #45 NEW cov: 11816 ft: 14690 corp: 14/401b lim: 45 exec/s: 0 rss: 69Mb L: 9/42 MS: 5 ShuffleBytes-PersAutoDict-ChangeBit-CopyPart-CrossOver- DE: "\000\002\000\000"- 00:07:22.249 [2024-11-30 00:05:47.653390] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.249 [2024-11-30 00:05:47.653417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.249 [2024-11-30 00:05:47.653486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00ff0700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.249 [2024-11-30 00:05:47.653506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.249 [2024-11-30 00:05:47.653573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.249 [2024-11-30 00:05:47.653589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.249 #46 NEW cov: 11816 ft: 14710 corp: 15/431b lim: 45 exec/s: 0 rss: 69Mb L: 30/42 MS: 1 EraseBytes- 00:07:22.249 [2024-11-30 00:05:47.693386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.249 [2024-11-30 00:05:47.693415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.250 [2024-11-30 00:05:47.693482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.250 [2024-11-30 00:05:47.693521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.250 #47 NEW cov: 11816 ft: 14718 corp: 16/457b lim: 45 exec/s: 0 rss: 
69Mb L: 26/42 MS: 1 InsertByte- 00:07:22.250 [2024-11-30 00:05:47.733835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.250 [2024-11-30 00:05:47.733862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.250 [2024-11-30 00:05:47.733931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.250 [2024-11-30 00:05:47.733952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.250 [2024-11-30 00:05:47.734016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.250 [2024-11-30 00:05:47.734032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.250 [2024-11-30 00:05:47.734096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff0fff cdw11:ffff0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.250 [2024-11-30 00:05:47.734111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.250 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:22.250 #48 NEW cov: 11839 ft: 14772 corp: 17/493b lim: 45 exec/s: 0 rss: 69Mb L: 36/42 MS: 1 InsertByte- 00:07:22.250 [2024-11-30 00:05:47.783828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.250 [2024-11-30 00:05:47.783856] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.250 [2024-11-30 00:05:47.783925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.250 [2024-11-30 00:05:47.783945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.250 [2024-11-30 00:05:47.784009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.250 [2024-11-30 00:05:47.784026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.250 #49 NEW cov: 11839 ft: 14780 corp: 18/528b lim: 45 exec/s: 0 rss: 69Mb L: 35/42 MS: 1 ChangeBinInt- 00:07:22.509 [2024-11-30 00:05:47.823751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.509 [2024-11-30 00:05:47.823778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.509 [2024-11-30 00:05:47.823846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.509 [2024-11-30 00:05:47.823865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.509 #50 NEW cov: 11839 ft: 14837 corp: 19/553b lim: 45 exec/s: 50 rss: 69Mb L: 25/42 MS: 1 EraseBytes- 00:07:22.510 [2024-11-30 00:05:47.864189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.864216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:47.864286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.864307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:47.864372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff270000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.864388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:47.864451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.864466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.510 #51 NEW cov: 11839 ft: 14846 corp: 20/592b lim: 45 exec/s: 51 rss: 69Mb L: 39/42 MS: 1 ChangeByte- 00:07:22.510 [2024-11-30 00:05:47.904288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.904315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:47.904382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:22.510 [2024-11-30 00:05:47.904401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:47.904458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.904476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:47.904540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff5effff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.904556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.510 #52 NEW cov: 11839 ft: 14876 corp: 21/628b lim: 45 exec/s: 52 rss: 69Mb L: 36/42 MS: 1 InsertByte- 00:07:22.510 [2024-11-30 00:05:47.944275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.944301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:47.944370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.944389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:47.944454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 
00:05:47.944472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.510 #53 NEW cov: 11839 ft: 14929 corp: 22/663b lim: 45 exec/s: 53 rss: 69Mb L: 35/42 MS: 1 ChangeBinInt- 00:07:22.510 [2024-11-30 00:05:47.984046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:47.984073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.510 #54 NEW cov: 11839 ft: 14935 corp: 23/673b lim: 45 exec/s: 54 rss: 70Mb L: 10/42 MS: 1 CrossOver- 00:07:22.510 [2024-11-30 00:05:48.024676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:48.024703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:48.024770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:48.024789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:48.024854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:48.024870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:48.024933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff 
cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:48.024950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.510 #55 NEW cov: 11839 ft: 14954 corp: 24/717b lim: 45 exec/s: 55 rss: 70Mb L: 44/44 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:22.510 [2024-11-30 00:05:48.064842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff270000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:48.064869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.510 [2024-11-30 00:05:48.064936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.510 [2024-11-30 00:05:48.064954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.065023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.065040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.065106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.065122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.770 #56 NEW cov: 11839 ft: 14983 corp: 25/756b lim: 45 exec/s: 56 rss: 70Mb L: 39/44 MS: 1 ChangeBinInt- 00:07:22.770 [2024-11-30 
00:05:48.104912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.104939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.105006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.105026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.105092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.105109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.105176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff0fff cdw11:ffff0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.105192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.770 #57 NEW cov: 11839 ft: 14994 corp: 26/792b lim: 45 exec/s: 57 rss: 70Mb L: 36/44 MS: 1 ChangeByte- 00:07:22.770 [2024-11-30 00:05:48.144872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.144900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.144969] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.144988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.145052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.145069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.770 #58 NEW cov: 11839 ft: 15041 corp: 27/821b lim: 45 exec/s: 58 rss: 70Mb L: 29/44 MS: 1 PersAutoDict- DE: "\000\002\000\000"- 00:07:22.770 [2024-11-30 00:05:48.185147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.185175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.185243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffff2e cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.185263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.185329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.185345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.185412] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.185428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.770 #59 NEW cov: 11839 ft: 15049 corp: 28/857b lim: 45 exec/s: 59 rss: 70Mb L: 36/44 MS: 1 InsertByte- 00:07:22.770 [2024-11-30 00:05:48.224770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:25020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.224798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.770 #60 NEW cov: 11839 ft: 15066 corp: 29/867b lim: 45 exec/s: 60 rss: 70Mb L: 10/44 MS: 1 InsertByte- 00:07:22.770 [2024-11-30 00:05:48.265367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.265394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.265464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.265490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.265555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffff26 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.265571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.265646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.265666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.770 #61 NEW cov: 11839 ft: 15088 corp: 30/908b lim: 45 exec/s: 61 rss: 70Mb L: 41/44 MS: 1 CopyPart- 00:07:22.770 [2024-11-30 00:05:48.305494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.305521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.305589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.305614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.305681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.305701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.770 [2024-11-30 00:05:48.305766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.770 [2024-11-30 00:05:48.305782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.030 
#62 NEW cov: 11839 ft: 15111 corp: 31/947b lim: 45 exec/s: 62 rss: 70Mb L: 39/44 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:23.030 [2024-11-30 00:05:48.345497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:02ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.345523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.345591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.345614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.345681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.345698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.030 #63 NEW cov: 11839 ft: 15126 corp: 32/980b lim: 45 exec/s: 63 rss: 70Mb L: 33/44 MS: 1 CrossOver- 00:07:23.030 [2024-11-30 00:05:48.385758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.385784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.385852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.385876] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.385941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.385958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.386021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.386036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.030 #64 NEW cov: 11839 ft: 15155 corp: 33/1022b lim: 45 exec/s: 64 rss: 70Mb L: 42/44 MS: 1 ShuffleBytes- 00:07:23.030 [2024-11-30 00:05:48.425877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.425903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.425972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.425991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.426061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.426079] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.426145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffdfff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.426164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.030 #65 NEW cov: 11839 ft: 15163 corp: 34/1066b lim: 45 exec/s: 65 rss: 70Mb L: 44/44 MS: 1 ChangeBit- 00:07:23.030 [2024-11-30 00:05:48.465843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.465869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.465938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.465958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.466023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:f346097e cdw11:94000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.466041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.506110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.506136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.506205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.506228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.506295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffff09ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.506311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.506375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:46947ef3 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.506391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.030 #67 NEW cov: 11839 ft: 15171 corp: 35/1109b lim: 45 exec/s: 67 rss: 70Mb L: 43/44 MS: 2 CMP-PersAutoDict- DE: "Sz\011~\363F\224\000"-"\377\377\377\377\377\377\377\377"- 00:07:23.030 [2024-11-30 00:05:48.546233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.546260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.546329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.546348] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.546414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.546430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.030 [2024-11-30 00:05:48.546494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:feff0000 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.030 [2024-11-30 00:05:48.546511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.030 #68 NEW cov: 11839 ft: 15180 corp: 36/1148b lim: 45 exec/s: 68 rss: 70Mb L: 39/44 MS: 1 ChangeBinInt- 00:07:23.290 [2024-11-30 00:05:48.586369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.586396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.586466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0000ff07 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.586484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.586551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fffffeff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.586568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.586641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffff0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.586659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.290 #70 NEW cov: 11839 ft: 15184 corp: 37/1185b lim: 45 exec/s: 70 rss: 70Mb L: 37/44 MS: 2 CopyPart-CrossOver- 00:07:23.290 [2024-11-30 00:05:48.625983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.626012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.290 #71 NEW cov: 11839 ft: 15198 corp: 38/1195b lim: 45 exec/s: 71 rss: 70Mb L: 10/44 MS: 1 ShuffleBytes- 00:07:23.290 [2024-11-30 00:05:48.666403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.666430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.666497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.666517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.666582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 
00:05:48.666602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.290 #72 NEW cov: 11839 ft: 15204 corp: 39/1228b lim: 45 exec/s: 72 rss: 70Mb L: 33/44 MS: 1 EraseBytes- 00:07:23.290 [2024-11-30 00:05:48.706201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff27 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.706227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.290 #73 NEW cov: 11839 ft: 15210 corp: 40/1237b lim: 45 exec/s: 73 rss: 70Mb L: 9/44 MS: 1 ShuffleBytes- 00:07:23.290 [2024-11-30 00:05:48.746794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.746820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.746905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000700 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.746925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.746992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.747008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.747073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff 
cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.747089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.290 #74 NEW cov: 11839 ft: 15212 corp: 41/1281b lim: 45 exec/s: 74 rss: 70Mb L: 44/44 MS: 1 ChangeBit- 00:07:23.290 [2024-11-30 00:05:48.787105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.787131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.787202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0000ff07 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.787222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.787292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff00ffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.787309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.787373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.787389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.787455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.787470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.290 #75 NEW cov: 11839 ft: 15289 corp: 42/1326b lim: 45 exec/s: 75 rss: 70Mb L: 45/45 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:23.290 [2024-11-30 00:05:48.826886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.826913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.826980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.827001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.290 [2024-11-30 00:05:48.827066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.290 [2024-11-30 00:05:48.827082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.550 #76 NEW cov: 11839 ft: 15291 corp: 43/1361b lim: 45 exec/s: 38 rss: 70Mb L: 35/45 MS: 1 CrossOver- 00:07:23.550 #76 DONE cov: 11839 ft: 15291 corp: 43/1361b lim: 45 exec/s: 38 rss: 70Mb 00:07:23.550 ###### Recommended dictionary. ###### 00:07:23.550 "\000\002\000\000" # Uses: 3 00:07:23.550 "\377\377\377\377\377\377\377\377" # Uses: 4 00:07:23.550 "Sz\011~\363F\224\000" # Uses: 0 00:07:23.550 ###### End of recommended dictionary. 
###### 00:07:23.550 Done 76 runs in 2 second(s) 00:07:23.550 00:05:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:23.550 00:05:48 -- ../common.sh@72 -- # (( i++ )) 00:07:23.550 00:05:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:23.550 00:05:48 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:23.550 00:05:48 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:23.550 00:05:48 -- nvmf/run.sh@24 -- # local timen=1 00:07:23.550 00:05:48 -- nvmf/run.sh@25 -- # local core=0x1 00:07:23.550 00:05:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:23.550 00:05:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:23.550 00:05:48 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:23.550 00:05:48 -- nvmf/run.sh@29 -- # port=4406 00:07:23.550 00:05:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:23.550 00:05:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:23.550 00:05:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:23.550 00:05:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:23.550 [2024-11-30 00:05:49.014267] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:23.550 [2024-11-30 00:05:49.014336] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2723071 ] 00:07:23.550 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.829 [2024-11-30 00:05:49.275501] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.829 [2024-11-30 00:05:49.353246] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:23.829 [2024-11-30 00:05:49.353366] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.097 [2024-11-30 00:05:49.411970] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:24.097 [2024-11-30 00:05:49.428346] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:24.097 INFO: Running with entropic power schedule (0xFF, 100). 00:07:24.097 INFO: Seed: 3261187536 00:07:24.097 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:24.097 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:24.097 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:24.097 INFO: A corpus is not provided, starting from an empty corpus 00:07:24.097 #2 INITED exec/s: 0 rss: 60Mb 00:07:24.097 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:24.097 This may also happen if the target rejected all inputs we tried so far 00:07:24.097 [2024-11-30 00:05:49.504987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:24.097 [2024-11-30 00:05:49.505026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.356 NEW_FUNC[1/669]: 0x445308 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:24.356 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:24.356 #3 NEW cov: 11523 ft: 11530 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:24.356 [2024-11-30 00:05:49.835477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000430a cdw11:00000000 00:07:24.356 [2024-11-30 00:05:49.835526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.356 #6 NEW cov: 11642 ft: 12157 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 3 CopyPart-CopyPart-InsertByte- 00:07:24.356 [2024-11-30 00:05:49.875332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000040cc cdw11:00000000 00:07:24.356 [2024-11-30 00:05:49.875361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.356 #10 NEW cov: 11648 ft: 12425 corp: 4/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 4 ChangeBinInt-ChangeBit-ChangeByte-InsertByte- 00:07:24.618 [2024-11-30 00:05:49.915489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b07a cdw11:00000000 00:07:24.618 [2024-11-30 
00:05:49.915517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.618 #11 NEW cov: 11733 ft: 12631 corp: 5/9b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeByte- 00:07:24.618 [2024-11-30 00:05:49.955579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:24.618 [2024-11-30 00:05:49.955609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.618 #12 NEW cov: 11733 ft: 12736 corp: 6/11b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeByte- 00:07:24.618 [2024-11-30 00:05:49.995765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b07a cdw11:00000000 00:07:24.618 [2024-11-30 00:05:49.995798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.618 #13 NEW cov: 11733 ft: 12795 corp: 7/13b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:24.618 [2024-11-30 00:05:50.035980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:24.618 [2024-11-30 00:05:50.036007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.618 #14 NEW cov: 11733 ft: 12884 corp: 8/15b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:07:24.618 [2024-11-30 00:05:50.076078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:24.618 [2024-11-30 00:05:50.076106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.618 #16 NEW cov: 11733 ft: 12964 corp: 9/18b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 2 
ChangeBit-CrossOver- 00:07:24.618 [2024-11-30 00:05:50.116176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:24.618 [2024-11-30 00:05:50.116203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.618 #17 NEW cov: 11733 ft: 13068 corp: 10/20b lim: 10 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 EraseBytes- 00:07:24.618 [2024-11-30 00:05:50.156318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003b7a cdw11:00000000 00:07:24.618 [2024-11-30 00:05:50.156346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.877 #18 NEW cov: 11733 ft: 13105 corp: 11/22b lim: 10 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeByte- 00:07:24.877 [2024-11-30 00:05:50.196434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f80a cdw11:00000000 00:07:24.877 [2024-11-30 00:05:50.196462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.877 #23 NEW cov: 11733 ft: 13130 corp: 12/25b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 5 CrossOver-ShuffleBytes-ChangeByte-ShuffleBytes-CrossOver- 00:07:24.877 [2024-11-30 00:05:50.226920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:24.877 [2024-11-30 00:05:50.226946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.877 [2024-11-30 00:05:50.227063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:24.877 [2024-11-30 00:05:50.227081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.877 [2024-11-30 00:05:50.227193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffcc cdw11:00000000 00:07:24.877 [2024-11-30 00:05:50.227210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.877 #24 NEW cov: 11733 ft: 13476 corp: 13/31b lim: 10 exec/s: 0 rss: 68Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:24.877 [2024-11-30 00:05:50.267076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:24.877 [2024-11-30 00:05:50.267102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.877 [2024-11-30 00:05:50.267224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:24.877 [2024-11-30 00:05:50.267242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.877 [2024-11-30 00:05:50.267354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:24.877 [2024-11-30 00:05:50.267374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.877 #26 NEW cov: 11733 ft: 13485 corp: 14/38b lim: 10 exec/s: 0 rss: 68Mb L: 7/7 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:24.877 [2024-11-30 00:05:50.307035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f80a cdw11:00000000 00:07:24.877 [2024-11-30 00:05:50.307061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.878 [2024-11-30 
00:05:50.307179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a2c cdw11:00000000 00:07:24.878 [2024-11-30 00:05:50.307197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.878 #27 NEW cov: 11733 ft: 13659 corp: 15/42b lim: 10 exec/s: 0 rss: 68Mb L: 4/7 MS: 1 InsertByte- 00:07:24.878 [2024-11-30 00:05:50.347161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f80a cdw11:00000000 00:07:24.878 [2024-11-30 00:05:50.347188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.878 [2024-11-30 00:05:50.347300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007aff cdw11:00000000 00:07:24.878 [2024-11-30 00:05:50.347317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.878 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:24.878 #28 NEW cov: 11756 ft: 13707 corp: 16/47b lim: 10 exec/s: 0 rss: 69Mb L: 5/7 MS: 1 InsertByte- 00:07:24.878 [2024-11-30 00:05:50.397201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:24.878 [2024-11-30 00:05:50.397227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.878 [2024-11-30 00:05:50.397343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.878 [2024-11-30 00:05:50.397360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.878 
[2024-11-30 00:05:50.397479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.878 [2024-11-30 00:05:50.397497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.878 #29 NEW cov: 11756 ft: 13728 corp: 17/54b lim: 10 exec/s: 0 rss: 69Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:07:25.137 [2024-11-30 00:05:50.437142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f87a cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.437170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.137 #30 NEW cov: 11756 ft: 13741 corp: 18/57b lim: 10 exec/s: 0 rss: 69Mb L: 3/7 MS: 1 EraseBytes- 00:07:25.137 [2024-11-30 00:05:50.477706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.477735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.477847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.477865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.477983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.478000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.137 #31 NEW cov: 11756 ft: 13766 corp: 19/64b lim: 10 exec/s: 31 rss: 69Mb L: 7/7 MS: 1 ShuffleBytes- 
00:07:25.137 [2024-11-30 00:05:50.527548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f80a cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.527575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.527665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000096ff cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.527681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.137 #32 NEW cov: 11756 ft: 13805 corp: 20/69b lim: 10 exec/s: 32 rss: 70Mb L: 5/7 MS: 1 ChangeByte- 00:07:25.137 [2024-11-30 00:05:50.567973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.567999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.568117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.568133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.568252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.568268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.137 #33 NEW cov: 11756 ft: 13814 corp: 21/76b lim: 10 exec/s: 33 rss: 70Mb L: 7/7 MS: 1 ShuffleBytes- 00:07:25.137 [2024-11-30 00:05:50.607999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ 
(04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.608025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.608136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.608153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.608259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000020 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.608278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.137 #34 NEW cov: 11756 ft: 13847 corp: 22/83b lim: 10 exec/s: 34 rss: 70Mb L: 7/7 MS: 1 ChangeBit- 00:07:25.137 [2024-11-30 00:05:50.648201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a8c cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.648228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.648340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.648358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.648468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.648487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:25.137 #35 NEW cov: 11756 ft: 13913 corp: 23/90b lim: 10 exec/s: 35 rss: 70Mb L: 7/7 MS: 1 ChangeBit- 00:07:25.137 [2024-11-30 00:05:50.688336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.688366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.688475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.688490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.137 [2024-11-30 00:05:50.688609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.137 [2024-11-30 00:05:50.688625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.396 #36 NEW cov: 11756 ft: 13942 corp: 24/97b lim: 10 exec/s: 36 rss: 70Mb L: 7/7 MS: 1 ChangeByte- 00:07:25.396 [2024-11-30 00:05:50.728116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:25.396 [2024-11-30 00:05:50.728143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.396 #37 NEW cov: 11756 ft: 13952 corp: 25/99b lim: 10 exec/s: 37 rss: 70Mb L: 2/7 MS: 1 ShuffleBytes- 00:07:25.396 [2024-11-30 00:05:50.768178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:25.396 [2024-11-30 00:05:50.768204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:25.396 #38 NEW cov: 11756 ft: 13966 corp: 26/102b lim: 10 exec/s: 38 rss: 70Mb L: 3/7 MS: 1 CrossOver- 00:07:25.396 [2024-11-30 00:05:50.808438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:07:25.396 [2024-11-30 00:05:50.808465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.396 [2024-11-30 00:05:50.808570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.396 [2024-11-30 00:05:50.808586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.396 #39 NEW cov: 11756 ft: 13974 corp: 27/107b lim: 10 exec/s: 39 rss: 70Mb L: 5/7 MS: 1 EraseBytes- 00:07:25.396 [2024-11-30 00:05:50.848586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:25.396 [2024-11-30 00:05:50.848615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.396 [2024-11-30 00:05:50.848735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000acc cdw11:00000000 00:07:25.396 [2024-11-30 00:05:50.848757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.397 #40 NEW cov: 11756 ft: 13987 corp: 28/111b lim: 10 exec/s: 40 rss: 70Mb L: 4/7 MS: 1 CrossOver- 00:07:25.397 [2024-11-30 00:05:50.888516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:25.397 [2024-11-30 00:05:50.888543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:25.397 #41 NEW cov: 11756 ft: 14007 corp: 29/113b lim: 10 exec/s: 41 rss: 70Mb L: 2/7 MS: 1 ChangeByte- 00:07:25.397 [2024-11-30 00:05:50.929025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000acc cdw11:00000000 00:07:25.397 [2024-11-30 00:05:50.929052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.397 [2024-11-30 00:05:50.929165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.397 [2024-11-30 00:05:50.929183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.397 [2024-11-30 00:05:50.929303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002000 cdw11:00000000 00:07:25.397 [2024-11-30 00:05:50.929319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.397 #42 NEW cov: 11756 ft: 14009 corp: 30/120b lim: 10 exec/s: 42 rss: 70Mb L: 7/7 MS: 1 ChangeBit- 00:07:25.656 [2024-11-30 00:05:50.968724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:25.656 [2024-11-30 00:05:50.968752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.656 #43 NEW cov: 11756 ft: 14059 corp: 31/123b lim: 10 exec/s: 43 rss: 70Mb L: 3/7 MS: 1 CrossOver- 00:07:25.656 [2024-11-30 00:05:51.008829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a840 cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.008857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:25.656 #44 NEW cov: 11756 ft: 14069 corp: 32/126b lim: 10 exec/s: 44 rss: 70Mb L: 3/7 MS: 1 InsertByte- 00:07:25.656 [2024-11-30 00:05:51.049041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b07a cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.049069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.656 #45 NEW cov: 11756 ft: 14076 corp: 33/128b lim: 10 exec/s: 45 rss: 70Mb L: 2/7 MS: 1 ShuffleBytes- 00:07:25.656 [2024-11-30 00:05:51.089732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.089760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.656 [2024-11-30 00:05:51.089877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007ad6 cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.089894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.656 [2024-11-30 00:05:51.090005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d6d6 cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.090022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.656 [2024-11-30 00:05:51.090129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d6d6 cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.090144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.656 #46 NEW cov: 11756 ft: 14281 corp: 34/137b lim: 10 exec/s: 46 rss: 70Mb L: 9/9 MS: 1 
InsertRepeatedBytes- 00:07:25.656 [2024-11-30 00:05:51.139918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.139945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.656 [2024-11-30 00:05:51.140057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007ad6 cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.140076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.656 [2024-11-30 00:05:51.140186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d6d6 cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.140203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.656 [2024-11-30 00:05:51.140293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d6d6 cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.140313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.656 #47 NEW cov: 11756 ft: 14292 corp: 35/146b lim: 10 exec/s: 47 rss: 70Mb L: 9/9 MS: 1 CrossOver- 00:07:25.656 [2024-11-30 00:05:51.189464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005087 cdw11:00000000 00:07:25.656 [2024-11-30 00:05:51.189492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.916 #48 NEW cov: 11756 ft: 14293 corp: 36/148b lim: 10 exec/s: 48 rss: 70Mb L: 2/9 MS: 1 ChangeBinInt- 00:07:25.916 [2024-11-30 00:05:51.229989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ecff cdw11:00000000 00:07:25.916 [2024-11-30 00:05:51.230019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.916 [2024-11-30 00:05:51.230126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.916 [2024-11-30 00:05:51.230145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.916 [2024-11-30 00:05:51.230261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.916 [2024-11-30 00:05:51.230279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.916 #51 NEW cov: 11756 ft: 14301 corp: 37/155b lim: 10 exec/s: 51 rss: 70Mb L: 7/9 MS: 3 EraseBytes-ChangeBit-InsertRepeatedBytes- 00:07:25.916 [2024-11-30 00:05:51.280170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.916 [2024-11-30 00:05:51.280198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.916 [2024-11-30 00:05:51.280313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.916 [2024-11-30 00:05:51.280331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.280451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000430a cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.280467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.917 #52 NEW cov: 11756 ft: 14305 corp: 38/161b lim: 10 exec/s: 52 rss: 70Mb L: 6/9 MS: 1 InsertRepeatedBytes- 00:07:25.917 [2024-11-30 00:05:51.330424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.330451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.330566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000045e1 cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.330586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.330714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.330731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.330846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e140 cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.330863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.917 #53 NEW cov: 11756 ft: 14310 corp: 39/169b lim: 10 exec/s: 53 rss: 70Mb L: 8/9 MS: 1 InsertByte- 00:07:25.917 [2024-11-30 00:05:51.380442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.380469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.380588] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007ad6 cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.380610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.380725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d6d6 cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.380743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.917 #54 NEW cov: 11756 ft: 14329 corp: 40/176b lim: 10 exec/s: 54 rss: 70Mb L: 7/9 MS: 1 EraseBytes- 00:07:25.917 [2024-11-30 00:05:51.430943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ef cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.430969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.431079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000efef cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.431097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.431199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ef0a cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.431215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.431327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.431344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.431455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffcc cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.431471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:25.917 #55 NEW cov: 11756 ft: 14371 corp: 41/186b lim: 10 exec/s: 55 rss: 70Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:25.917 [2024-11-30 00:05:51.471183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.471209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.471330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.471347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.471458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffcc cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.471478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.471586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.471607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.917 [2024-11-30 00:05:51.471721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 
cdw10:00000000 cdw11:00000000 00:07:25.917 [2024-11-30 00:05:51.471741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.175 #56 NEW cov: 11756 ft: 14380 corp: 42/196b lim: 10 exec/s: 28 rss: 70Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:26.175 #56 DONE cov: 11756 ft: 14380 corp: 42/196b lim: 10 exec/s: 28 rss: 70Mb 00:07:26.175 Done 56 runs in 2 second(s) 00:07:26.175 00:05:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:26.175 00:05:51 -- ../common.sh@72 -- # (( i++ )) 00:07:26.175 00:05:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:26.175 00:05:51 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:26.176 00:05:51 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:26.176 00:05:51 -- nvmf/run.sh@24 -- # local timen=1 00:07:26.176 00:05:51 -- nvmf/run.sh@25 -- # local core=0x1 00:07:26.176 00:05:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:26.176 00:05:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:26.176 00:05:51 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:26.176 00:05:51 -- nvmf/run.sh@29 -- # port=4407 00:07:26.176 00:05:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:26.176 00:05:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:26.176 00:05:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:26.176 00:05:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c 
/tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:26.176 [2024-11-30 00:05:51.660000] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:26.176 [2024-11-30 00:05:51.660079] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2723620 ] 00:07:26.176 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.435 [2024-11-30 00:05:51.911672] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.695 [2024-11-30 00:05:51.996660] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:26.695 [2024-11-30 00:05:51.996782] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.695 [2024-11-30 00:05:52.055083] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:26.695 [2024-11-30 00:05:52.071446] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:26.695 INFO: Running with entropic power schedule (0xFF, 100). 00:07:26.695 INFO: Seed: 1608228104 00:07:26.695 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:26.695 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:26.695 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:26.695 INFO: A corpus is not provided, starting from an empty corpus 00:07:26.695 #2 INITED exec/s: 0 rss: 60Mb 00:07:26.695 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:26.695 This may also happen if the target rejected all inputs we tried so far 00:07:26.695 [2024-11-30 00:05:52.116576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ab3 cdw11:00000000 00:07:26.695 [2024-11-30 00:05:52.116607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.953 NEW_FUNC[1/669]: 0x445d08 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:26.953 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:26.953 #3 NEW cov: 11525 ft: 11526 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:26.953 [2024-11-30 00:05:52.437408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000320a cdw11:00000000 00:07:26.953 [2024-11-30 00:05:52.437438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.953 #4 NEW cov: 11642 ft: 11849 corp: 3/6b lim: 10 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:07:26.953 [2024-11-30 00:05:52.477850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:26.953 [2024-11-30 00:05:52.477878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.953 [2024-11-30 00:05:52.477932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:26.953 [2024-11-30 00:05:52.477947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.954 [2024-11-30 
00:05:52.478001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:26.954 [2024-11-30 00:05:52.478015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.954 [2024-11-30 00:05:52.478067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:26.954 [2024-11-30 00:05:52.478081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.954 #6 NEW cov: 11648 ft: 12384 corp: 4/15b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:27.212 [2024-11-30 00:05:52.517666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000320a cdw11:00000000 00:07:27.212 [2024-11-30 00:05:52.517692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.212 [2024-11-30 00:05:52.517745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.212 [2024-11-30 00:05:52.517758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.212 #7 NEW cov: 11733 ft: 12879 corp: 5/19b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 InsertByte- 00:07:27.212 [2024-11-30 00:05:52.557797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000320a cdw11:00000000 00:07:27.212 [2024-11-30 00:05:52.557823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.212 [2024-11-30 00:05:52.557879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:5 nsid:0 cdw10:0000b35f cdw11:00000000 00:07:27.212 [2024-11-30 00:05:52.557893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.212 #8 NEW cov: 11733 ft: 13008 corp: 6/23b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 ChangeBit- 00:07:27.212 [2024-11-30 00:05:52.597797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.212 [2024-11-30 00:05:52.597822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.212 #9 NEW cov: 11733 ft: 13157 corp: 7/25b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 CopyPart- 00:07:27.212 [2024-11-30 00:05:52.637892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.212 [2024-11-30 00:05:52.637918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.212 #10 NEW cov: 11733 ft: 13259 corp: 8/28b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 CrossOver- 00:07:27.212 [2024-11-30 00:05:52.678430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff7a cdw11:00000000 00:07:27.212 [2024-11-30 00:05:52.678459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.213 [2024-11-30 00:05:52.678513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.213 [2024-11-30 00:05:52.678526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.213 [2024-11-30 00:05:52.678578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 
cdw10:0000ffff cdw11:00000000 00:07:27.213 [2024-11-30 00:05:52.678591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.213 [2024-11-30 00:05:52.678665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.213 [2024-11-30 00:05:52.678679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.213 #11 NEW cov: 11733 ft: 13360 corp: 9/37b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 ChangeByte- 00:07:27.213 [2024-11-30 00:05:52.718147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000230a cdw11:00000000 00:07:27.213 [2024-11-30 00:05:52.718173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.213 #12 NEW cov: 11733 ft: 13375 corp: 10/39b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 InsertByte- 00:07:27.213 [2024-11-30 00:05:52.748337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000330a cdw11:00000000 00:07:27.213 [2024-11-30 00:05:52.748362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.213 [2024-11-30 00:05:52.748416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.213 [2024-11-30 00:05:52.748430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.472 #13 NEW cov: 11733 ft: 13453 corp: 11/43b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 ChangeBit- 00:07:27.472 [2024-11-30 00:05:52.788477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:0000720a cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.788503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.472 [2024-11-30 00:05:52.788556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.788569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.472 #14 NEW cov: 11733 ft: 13545 corp: 12/47b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 ChangeBit- 00:07:27.472 [2024-11-30 00:05:52.828614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.828639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.472 [2024-11-30 00:05:52.828692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.828705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.472 #15 NEW cov: 11733 ft: 13569 corp: 13/51b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 ChangeByte- 00:07:27.472 [2024-11-30 00:05:52.868710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007207 cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.868735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.472 [2024-11-30 00:05:52.868789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.868803] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.472 #16 NEW cov: 11733 ft: 13677 corp: 14/55b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 ChangeByte- 00:07:27.472 [2024-11-30 00:05:52.909070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.909095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.472 [2024-11-30 00:05:52.909149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.909163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.472 [2024-11-30 00:05:52.909215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007207 cdw11:00000000 00:07:27.472 [2024-11-30 00:05:52.909229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.473 [2024-11-30 00:05:52.909282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.473 [2024-11-30 00:05:52.909295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.473 #17 NEW cov: 11733 ft: 13771 corp: 15/63b lim: 10 exec/s: 0 rss: 69Mb L: 8/9 MS: 1 InsertRepeatedBytes- 00:07:27.473 [2024-11-30 00:05:52.948824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aa3 cdw11:00000000 00:07:27.473 [2024-11-30 00:05:52.948850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.473 #18 NEW cov: 11733 
ft: 13796 corp: 16/65b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 ChangeBit- 00:07:27.473 [2024-11-30 00:05:52.989189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000320a cdw11:00000000 00:07:27.473 [2024-11-30 00:05:52.989215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.473 [2024-11-30 00:05:52.989275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.473 [2024-11-30 00:05:52.989295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.473 [2024-11-30 00:05:52.989349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.473 [2024-11-30 00:05:52.989365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.473 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:27.473 #19 NEW cov: 11756 ft: 13991 corp: 17/72b lim: 10 exec/s: 0 rss: 69Mb L: 7/9 MS: 1 InsertRepeatedBytes- 00:07:27.732 [2024-11-30 00:05:53.029446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000330a cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.029471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.732 [2024-11-30 00:05:53.029523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.029536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.732 
[2024-11-30 00:05:53.029591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000330a cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.029612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.732 [2024-11-30 00:05:53.029662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.029693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.732 #20 NEW cov: 11756 ft: 14076 corp: 18/80b lim: 10 exec/s: 0 rss: 70Mb L: 8/9 MS: 1 CopyPart- 00:07:27.732 [2024-11-30 00:05:53.069430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ab3 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.069454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.732 [2024-11-30 00:05:53.069507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005733 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.069521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.732 [2024-11-30 00:05:53.069573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000ab3 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.069586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.732 #21 NEW cov: 11756 ft: 14087 corp: 19/87b lim: 10 exec/s: 0 rss: 70Mb L: 7/9 MS: 1 EraseBytes- 00:07:27.732 [2024-11-30 00:05:53.109696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 
nsid:0 cdw10:0000330a cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.109721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.732 [2024-11-30 00:05:53.109778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.109791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.732 [2024-11-30 00:05:53.109843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000330a cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.109856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.732 [2024-11-30 00:05:53.109910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b335 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.109923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.732 #22 NEW cov: 11756 ft: 14124 corp: 20/96b lim: 10 exec/s: 22 rss: 70Mb L: 9/9 MS: 1 InsertByte- 00:07:27.732 [2024-11-30 00:05:53.149520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003260 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.149544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.732 [2024-11-30 00:05:53.149602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000ab3 cdw11:00000000 00:07:27.732 [2024-11-30 00:05:53.149616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.732 #23 NEW 
cov: 11756 ft: 14225 corp: 21/100b lim: 10 exec/s: 23 rss: 70Mb L: 4/9 MS: 1 InsertByte- 00:07:27.733 [2024-11-30 00:05:53.190026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000330a cdw11:00000000 00:07:27.733 [2024-11-30 00:05:53.190050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.733 [2024-11-30 00:05:53.190104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000033b3 cdw11:00000000 00:07:27.733 [2024-11-30 00:05:53.190120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.733 [2024-11-30 00:05:53.190171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005733 cdw11:00000000 00:07:27.733 [2024-11-30 00:05:53.190183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.733 [2024-11-30 00:05:53.190236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000ab3 cdw11:00000000 00:07:27.733 [2024-11-30 00:05:53.190249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.733 [2024-11-30 00:05:53.190298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00003557 cdw11:00000000 00:07:27.733 [2024-11-30 00:05:53.190312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.733 #24 NEW cov: 11756 ft: 14283 corp: 22/110b lim: 10 exec/s: 24 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:07:27.733 [2024-11-30 00:05:53.229680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 
nsid:0 cdw10:0000320a cdw11:00000000 00:07:27.733 [2024-11-30 00:05:53.229705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.733 #25 NEW cov: 11756 ft: 14292 corp: 23/113b lim: 10 exec/s: 25 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:27.733 [2024-11-30 00:05:53.269868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007207 cdw11:00000000 00:07:27.733 [2024-11-30 00:05:53.269894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.733 [2024-11-30 00:05:53.269954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b30a cdw11:00000000 00:07:27.733 [2024-11-30 00:05:53.269973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.992 #26 NEW cov: 11756 ft: 14316 corp: 24/118b lim: 10 exec/s: 26 rss: 70Mb L: 5/10 MS: 1 CrossOver- 00:07:27.992 [2024-11-30 00:05:53.309967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002304 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.309993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.310048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.310062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.992 #28 NEW cov: 11756 ft: 14331 corp: 25/123b lim: 10 exec/s: 28 rss: 70Mb L: 5/10 MS: 2 EraseBytes-CMP- DE: "\004\000\000\000"- 00:07:27.992 [2024-11-30 00:05:53.350396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ab3 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.350420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.350473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005733 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.350486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.350537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000ab3 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.350550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.350606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000057b3 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.350623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.992 #29 NEW cov: 11756 ft: 14344 corp: 26/132b lim: 10 exec/s: 29 rss: 70Mb L: 9/10 MS: 1 CopyPart- 00:07:27.992 [2024-11-30 00:05:53.390250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000032b3 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.390274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.390326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.390340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:27.992 #30 NEW cov: 11756 ft: 14353 corp: 27/136b lim: 10 exec/s: 30 rss: 70Mb L: 4/10 MS: 1 CrossOver- 00:07:27.992 [2024-11-30 00:05:53.420549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.420574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.420624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.420637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.420690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007207 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.420704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.420755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.420768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.992 #31 NEW cov: 11756 ft: 14384 corp: 28/144b lim: 10 exec/s: 31 rss: 70Mb L: 8/10 MS: 1 ChangeBinInt- 00:07:27.992 [2024-11-30 00:05:53.460650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000330a cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.460675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.460729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.460742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.460796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000330a cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.460809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.460862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b334 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.460876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.992 #32 NEW cov: 11756 ft: 14406 corp: 29/153b lim: 10 exec/s: 32 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:27.992 [2024-11-30 00:05:53.500484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007d0a cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.500509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.992 [2024-11-30 00:05:53.500564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b35f cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.500580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.992 #33 NEW cov: 11756 ft: 14422 corp: 30/157b lim: 10 exec/s: 33 rss: 70Mb L: 4/10 MS: 1 ChangeByte- 00:07:27.992 [2024-11-30 00:05:53.540619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003260 cdw11:00000000 00:07:27.992 [2024-11-30 00:05:53.540644] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.993 [2024-11-30 00:05:53.540707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a32 cdw11:00000000 00:07:27.993 [2024-11-30 00:05:53.540723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.252 #34 NEW cov: 11756 ft: 14429 corp: 31/161b lim: 10 exec/s: 34 rss: 70Mb L: 4/10 MS: 1 CopyPart- 00:07:28.252 [2024-11-30 00:05:53.581117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000330a cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.581142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.252 [2024-11-30 00:05:53.581196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.581210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.252 [2024-11-30 00:05:53.581263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000330a cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.581277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.252 [2024-11-30 00:05:53.581330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b334 cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.581344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.252 [2024-11-30 00:05:53.581397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00005700 cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.581410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:28.252 #35 NEW cov: 11756 ft: 14450 corp: 32/171b lim: 10 exec/s: 35 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:07:28.252 [2024-11-30 00:05:53.620873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007d0a cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.620898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.252 [2024-11-30 00:05:53.620954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.620967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.252 #36 NEW cov: 11756 ft: 14463 corp: 33/175b lim: 10 exec/s: 36 rss: 70Mb L: 4/10 MS: 1 ChangeBit- 00:07:28.252 [2024-11-30 00:05:53.661348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000093ff cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.661372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.252 [2024-11-30 00:05:53.661427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.661442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.252 [2024-11-30 00:05:53.661496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.252 [2024-11-30 00:05:53.661512] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.253 [2024-11-30 00:05:53.661563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.253 [2024-11-30 00:05:53.661576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.253 [2024-11-30 00:05:53.661634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.253 [2024-11-30 00:05:53.661648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:28.253 #40 NEW cov: 11756 ft: 14540 corp: 34/185b lim: 10 exec/s: 40 rss: 70Mb L: 10/10 MS: 4 ChangeBit-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:28.253 [2024-11-30 00:05:53.701070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007d0a cdw11:00000000 00:07:28.253 [2024-11-30 00:05:53.701095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.253 [2024-11-30 00:05:53.701148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b35f cdw11:00000000 00:07:28.253 [2024-11-30 00:05:53.701161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.253 #41 NEW cov: 11756 ft: 14594 corp: 35/190b lim: 10 exec/s: 41 rss: 70Mb L: 5/10 MS: 1 InsertByte- 00:07:28.253 [2024-11-30 00:05:53.741193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000320a cdw11:00000000 00:07:28.253 [2024-11-30 00:05:53.741217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.253 [2024-11-30 00:05:53.741272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000600a cdw11:00000000 00:07:28.253 [2024-11-30 00:05:53.741285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.253 #42 NEW cov: 11756 ft: 14603 corp: 36/195b lim: 10 exec/s: 42 rss: 70Mb L: 5/10 MS: 1 CrossOver- 00:07:28.253 [2024-11-30 00:05:53.781288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007d0a cdw11:00000000 00:07:28.253 [2024-11-30 00:05:53.781313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.253 [2024-11-30 00:05:53.781365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b35f cdw11:00000000 00:07:28.253 [2024-11-30 00:05:53.781379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.512 #43 NEW cov: 11756 ft: 14605 corp: 37/200b lim: 10 exec/s: 43 rss: 70Mb L: 5/10 MS: 1 ChangeBit- 00:07:28.512 [2024-11-30 00:05:53.821424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000400 cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.821448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.821511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.821529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.512 #44 NEW cov: 11756 ft: 14613 corp: 38/204b 
lim: 10 exec/s: 44 rss: 70Mb L: 4/10 MS: 1 PersAutoDict- DE: "\004\000\000\000"- 00:07:28.512 [2024-11-30 00:05:53.861801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.861826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.861885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.861899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.861952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002304 cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.861965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.862018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.862032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.512 #45 NEW cov: 11756 ft: 14666 corp: 39/213b lim: 10 exec/s: 45 rss: 70Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:28.512 [2024-11-30 00:05:53.901687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007d0a cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.901711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.901766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 
nsid:0 cdw10:00007a57 cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.901779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.512 #46 NEW cov: 11756 ft: 14679 corp: 40/217b lim: 10 exec/s: 46 rss: 70Mb L: 4/10 MS: 1 ChangeByte- 00:07:28.512 [2024-11-30 00:05:53.941794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002323 cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.941818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.941873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.941886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.512 #47 NEW cov: 11756 ft: 14686 corp: 41/221b lim: 10 exec/s: 47 rss: 70Mb L: 4/10 MS: 1 CopyPart- 00:07:28.512 [2024-11-30 00:05:53.982215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000093ff cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.982241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.982296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.982310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.982364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.982378] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.982433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.982446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:53.982499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000c0ff cdw11:00000000 00:07:28.512 [2024-11-30 00:05:53.982512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:28.512 #48 NEW cov: 11756 ft: 14716 corp: 42/231b lim: 10 exec/s: 48 rss: 70Mb L: 10/10 MS: 1 ChangeByte- 00:07:28.512 [2024-11-30 00:05:54.022034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000720a cdw11:00000000 00:07:28.512 [2024-11-30 00:05:54.022060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.512 [2024-11-30 00:05:54.022116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b372 cdw11:00000000 00:07:28.512 [2024-11-30 00:05:54.022131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.512 #49 NEW cov: 11756 ft: 14726 corp: 43/235b lim: 10 exec/s: 49 rss: 70Mb L: 4/10 MS: 1 CopyPart- 00:07:28.512 [2024-11-30 00:05:54.052385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000730a cdw11:00000000 00:07:28.512 [2024-11-30 00:05:54.052411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:28.513 [2024-11-30 00:05:54.052465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b357 cdw11:00000000 00:07:28.513 [2024-11-30 00:05:54.052479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.513 [2024-11-30 00:05:54.052533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000330a cdw11:00000000 00:07:28.513 [2024-11-30 00:05:54.052548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.513 [2024-11-30 00:05:54.052605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b334 cdw11:00000000 00:07:28.513 [2024-11-30 00:05:54.052620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.773 #50 NEW cov: 11756 ft: 14737 corp: 44/244b lim: 10 exec/s: 50 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:28.773 [2024-11-30 00:05:54.092100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a32 cdw11:00000000 00:07:28.773 [2024-11-30 00:05:54.092125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.773 #51 NEW cov: 11756 ft: 14757 corp: 45/247b lim: 10 exec/s: 25 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:28.773 #51 DONE cov: 11756 ft: 14757 corp: 45/247b lim: 10 exec/s: 25 rss: 70Mb 00:07:28.773 ###### Recommended dictionary. ###### 00:07:28.773 "\004\000\000\000" # Uses: 1 00:07:28.773 ###### End of recommended dictionary. 
###### 00:07:28.773 Done 51 runs in 2 second(s) 00:07:28.773 00:05:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:28.773 00:05:54 -- ../common.sh@72 -- # (( i++ )) 00:07:28.773 00:05:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:28.773 00:05:54 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:28.773 00:05:54 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:28.773 00:05:54 -- nvmf/run.sh@24 -- # local timen=1 00:07:28.773 00:05:54 -- nvmf/run.sh@25 -- # local core=0x1 00:07:28.773 00:05:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:28.773 00:05:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:28.773 00:05:54 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:28.773 00:05:54 -- nvmf/run.sh@29 -- # port=4408 00:07:28.773 00:05:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:28.773 00:05:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:28.773 00:05:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:28.773 00:05:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:28.773 [2024-11-30 00:05:54.284731] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:28.773 [2024-11-30 00:05:54.284803] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2724040 ] 00:07:28.773 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.041 [2024-11-30 00:05:54.536266] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.306 [2024-11-30 00:05:54.623282] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:29.306 [2024-11-30 00:05:54.623444] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.306 [2024-11-30 00:05:54.681492] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:29.306 [2024-11-30 00:05:54.697872] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:29.306 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:29.306 INFO: Seed: 4235225228 00:07:29.306 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:29.306 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:29.306 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:29.306 INFO: A corpus is not provided, starting from an empty corpus 00:07:29.306 [2024-11-30 00:05:54.742957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.306 [2024-11-30 00:05:54.742986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.306 #2 INITED cov: 11552 ft: 11552 corp: 1/1b exec/s: 0 rss: 67Mb 00:07:29.306 [2024-11-30 00:05:54.772905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.306 [2024-11-30 00:05:54.772930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.566 NEW_FUNC[1/1]: 0x16a9038 in _nvme_qpair_complete_abort_queued_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:593 00:07:29.566 #3 NEW cov: 11670 ft: 11906 corp: 2/2b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ShuffleBytes- 00:07:29.566 [2024-11-30 00:05:55.083854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.566 [2024-11-30 00:05:55.083886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.566 #4 NEW cov: 11676 ft: 12166 corp: 3/3b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ChangeBit- 00:07:29.826 
[2024-11-30 00:05:55.123853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.826 [2024-11-30 00:05:55.123879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.826 #5 NEW cov: 11761 ft: 12371 corp: 4/4b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ChangeByte- 00:07:29.826 [2024-11-30 00:05:55.164111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.826 [2024-11-30 00:05:55.164136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.826 [2024-11-30 00:05:55.164187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.827 [2024-11-30 00:05:55.164201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.827 #6 NEW cov: 11761 ft: 13194 corp: 5/6b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:07:29.827 [2024-11-30 00:05:55.204059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.827 [2024-11-30 00:05:55.204084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.827 #7 NEW cov: 11761 ft: 13294 corp: 6/7b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ChangeBit- 00:07:29.827 [2024-11-30 00:05:55.244151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:29.827 [2024-11-30 00:05:55.244175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.827 #8 NEW cov: 11761 ft: 13338 corp: 7/8b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:29.827 [2024-11-30 00:05:55.284433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.827 [2024-11-30 00:05:55.284458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.827 [2024-11-30 00:05:55.284509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.827 [2024-11-30 00:05:55.284522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.827 #9 NEW cov: 11761 ft: 13371 corp: 8/10b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:07:29.827 [2024-11-30 00:05:55.324550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.827 [2024-11-30 00:05:55.324575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.827 [2024-11-30 00:05:55.324627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.827 [2024-11-30 00:05:55.324641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.827 #10 NEW cov: 11761 ft: 13395 corp: 9/12b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 
1 CrossOver- 00:07:29.827 [2024-11-30 00:05:55.364692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.827 [2024-11-30 00:05:55.364716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.828 [2024-11-30 00:05:55.364771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.828 [2024-11-30 00:05:55.364784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.092 #11 NEW cov: 11761 ft: 13439 corp: 10/14b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:30.092 [2024-11-30 00:05:55.404786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.092 [2024-11-30 00:05:55.404811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.092 [2024-11-30 00:05:55.404859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.092 [2024-11-30 00:05:55.404872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.092 #12 NEW cov: 11761 ft: 13490 corp: 11/16b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 CrossOver- 00:07:30.092 [2024-11-30 00:05:55.444985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.092 [2024-11-30 00:05:55.445009] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.092 [2024-11-30 00:05:55.445060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.092 [2024-11-30 00:05:55.445073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.092 #13 NEW cov: 11761 ft: 13594 corp: 12/18b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 CopyPart- 00:07:30.092 [2024-11-30 00:05:55.485050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.092 [2024-11-30 00:05:55.485074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.092 [2024-11-30 00:05:55.485123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.092 [2024-11-30 00:05:55.485136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.092 #14 NEW cov: 11761 ft: 13630 corp: 13/20b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 InsertByte- 00:07:30.093 [2024-11-30 00:05:55.524996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.093 [2024-11-30 00:05:55.525022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.093 #15 NEW cov: 11761 ft: 13707 corp: 14/21b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeBinInt- 00:07:30.093 [2024-11-30 00:05:55.555101] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.093 [2024-11-30 00:05:55.555127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.093 #16 NEW cov: 11761 ft: 13721 corp: 15/22b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeBinInt- 00:07:30.093 [2024-11-30 00:05:55.585326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.093 [2024-11-30 00:05:55.585352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.093 [2024-11-30 00:05:55.585402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.093 [2024-11-30 00:05:55.585417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.093 #17 NEW cov: 11761 ft: 13759 corp: 16/24b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 ChangeByte- 00:07:30.093 [2024-11-30 00:05:55.625293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.093 [2024-11-30 00:05:55.625318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.352 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:30.352 #18 NEW cov: 11784 ft: 13841 corp: 17/25b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 EraseBytes- 00:07:30.352 [2024-11-30 00:05:55.665408] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.665437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.352 #19 NEW cov: 11784 ft: 13856 corp: 18/26b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeByte- 00:07:30.352 [2024-11-30 00:05:55.695497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.695523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.352 #20 NEW cov: 11784 ft: 13866 corp: 19/27b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeBit- 00:07:30.352 [2024-11-30 00:05:55.735642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.735667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.352 #21 NEW cov: 11784 ft: 13889 corp: 20/28b lim: 5 exec/s: 21 rss: 69Mb L: 1/2 MS: 1 ChangeBinInt- 00:07:30.352 [2024-11-30 00:05:55.776070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.776095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.352 [2024-11-30 00:05:55.776145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:30.352 [2024-11-30 00:05:55.776158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.352 [2024-11-30 00:05:55.776205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.776218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.352 #22 NEW cov: 11784 ft: 14073 corp: 21/31b lim: 5 exec/s: 22 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:07:30.352 [2024-11-30 00:05:55.816002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.816026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.352 [2024-11-30 00:05:55.816076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.816090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.352 #23 NEW cov: 11784 ft: 14132 corp: 22/33b lim: 5 exec/s: 23 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:07:30.352 [2024-11-30 00:05:55.856126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.856151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.352 [2024-11-30 00:05:55.856202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.856215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.352 #24 NEW cov: 11784 ft: 14148 corp: 23/35b lim: 5 exec/s: 24 rss: 69Mb L: 2/3 MS: 1 ShuffleBytes- 00:07:30.352 [2024-11-30 00:05:55.896223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.896252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.352 [2024-11-30 00:05:55.896311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.352 [2024-11-30 00:05:55.896331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.612 #25 NEW cov: 11784 ft: 14197 corp: 24/37b lim: 5 exec/s: 25 rss: 69Mb L: 2/3 MS: 1 ChangeBit- 00:07:30.612 [2024-11-30 00:05:55.936346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:55.936371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.612 [2024-11-30 00:05:55.936423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:55.936436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:30.612 #26 NEW cov: 11784 ft: 14239 corp: 25/39b lim: 5 exec/s: 26 rss: 69Mb L: 2/3 MS: 1 ShuffleBytes- 00:07:30.612 [2024-11-30 00:05:55.976465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:55.976490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.612 [2024-11-30 00:05:55.976544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:55.976557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.612 #27 NEW cov: 11784 ft: 14254 corp: 26/41b lim: 5 exec/s: 27 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:07:30.612 [2024-11-30 00:05:56.016580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.016610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.612 [2024-11-30 00:05:56.016659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.016673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.612 #28 NEW cov: 11784 ft: 14266 corp: 27/43b lim: 5 exec/s: 28 rss: 69Mb L: 2/3 MS: 1 ChangeBit- 00:07:30.612 [2024-11-30 00:05:56.056691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 
cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.056716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.612 [2024-11-30 00:05:56.056765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.056779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.612 #29 NEW cov: 11784 ft: 14340 corp: 28/45b lim: 5 exec/s: 29 rss: 70Mb L: 2/3 MS: 1 CopyPart- 00:07:30.612 [2024-11-30 00:05:56.097219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.097247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.612 [2024-11-30 00:05:56.097299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.097313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.612 [2024-11-30 00:05:56.097362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.097375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.612 [2024-11-30 00:05:56.097423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 
cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.097437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.612 [2024-11-30 00:05:56.097488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.097501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:30.612 #30 NEW cov: 11784 ft: 14665 corp: 29/50b lim: 5 exec/s: 30 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:30.612 [2024-11-30 00:05:56.146791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.612 [2024-11-30 00:05:56.146816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.872 #31 NEW cov: 11784 ft: 14702 corp: 30/51b lim: 5 exec/s: 31 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:07:30.872 [2024-11-30 00:05:56.187008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.872 [2024-11-30 00:05:56.187034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.872 [2024-11-30 00:05:56.187063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.872 [2024-11-30 00:05:56.187076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.872 #32 NEW cov: 11784 
ft: 14711 corp: 31/53b lim: 5 exec/s: 32 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:30.872 [2024-11-30 00:05:56.227110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.872 [2024-11-30 00:05:56.227136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.872 [2024-11-30 00:05:56.227188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.872 [2024-11-30 00:05:56.227200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.872 #33 NEW cov: 11784 ft: 14767 corp: 32/55b lim: 5 exec/s: 33 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:30.872 [2024-11-30 00:05:56.267390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.872 [2024-11-30 00:05:56.267414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.872 [2024-11-30 00:05:56.267464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.872 [2024-11-30 00:05:56.267480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.872 [2024-11-30 00:05:56.267529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.872 [2024-11-30 00:05:56.267542] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.872 #34 NEW cov: 11784 ft: 14781 corp: 33/58b lim: 5 exec/s: 34 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:30.872 [2024-11-30 00:05:56.307509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.872 [2024-11-30 00:05:56.307534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.873 [2024-11-30 00:05:56.307586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.873 [2024-11-30 00:05:56.307603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.873 [2024-11-30 00:05:56.307654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.873 [2024-11-30 00:05:56.307667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.873 #35 NEW cov: 11784 ft: 14794 corp: 34/61b lim: 5 exec/s: 35 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:30.873 [2024-11-30 00:05:56.347471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.873 [2024-11-30 00:05:56.347497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.873 [2024-11-30 00:05:56.347550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:30.873 [2024-11-30 00:05:56.347563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.873 #36 NEW cov: 11784 ft: 14799 corp: 35/63b lim: 5 exec/s: 36 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:30.873 [2024-11-30 00:05:56.387584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.873 [2024-11-30 00:05:56.387613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.873 [2024-11-30 00:05:56.387666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.873 [2024-11-30 00:05:56.387679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.873 #37 NEW cov: 11784 ft: 14833 corp: 36/65b lim: 5 exec/s: 37 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:30.873 [2024-11-30 00:05:56.427570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.873 [2024-11-30 00:05:56.427595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.133 #38 NEW cov: 11784 ft: 14838 corp: 37/66b lim: 5 exec/s: 38 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:31.133 [2024-11-30 00:05:56.467823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.467850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:31.133 [2024-11-30 00:05:56.467903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.467916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.133 [2024-11-30 00:05:56.507800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.507825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.133 #40 NEW cov: 11784 ft: 14840 corp: 38/67b lim: 5 exec/s: 40 rss: 70Mb L: 1/5 MS: 2 CrossOver-EraseBytes- 00:07:31.133 [2024-11-30 00:05:56.548075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.548099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.133 [2024-11-30 00:05:56.548150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.548163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.133 #41 NEW cov: 11784 ft: 14845 corp: 39/69b lim: 5 exec/s: 41 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:31.133 [2024-11-30 00:05:56.588330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 
00:05:56.588354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.133 [2024-11-30 00:05:56.588407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.588420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.133 [2024-11-30 00:05:56.588468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.588480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.133 #42 NEW cov: 11784 ft: 14858 corp: 40/72b lim: 5 exec/s: 42 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:31.133 [2024-11-30 00:05:56.628439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.628463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.133 [2024-11-30 00:05:56.628515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.628529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.133 [2024-11-30 00:05:56.628576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 
00:05:56.628589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.133 #43 NEW cov: 11784 ft: 14864 corp: 41/75b lim: 5 exec/s: 43 rss: 70Mb L: 3/5 MS: 1 ChangeByte- 00:07:31.133 [2024-11-30 00:05:56.668445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.668469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.133 [2024-11-30 00:05:56.668519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.133 [2024-11-30 00:05:56.668532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.133 #44 NEW cov: 11784 ft: 14869 corp: 42/77b lim: 5 exec/s: 44 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:31.392 [2024-11-30 00:05:56.708658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.392 [2024-11-30 00:05:56.708682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.392 [2024-11-30 00:05:56.708742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.392 [2024-11-30 00:05:56.708759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.392 [2024-11-30 00:05:56.708825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) 
qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.392 [2024-11-30 00:05:56.708838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.392 #45 NEW cov: 11784 ft: 14874 corp: 43/80b lim: 5 exec/s: 45 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:31.392 [2024-11-30 00:05:56.748782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.392 [2024-11-30 00:05:56.748807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.392 [2024-11-30 00:05:56.748857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.392 [2024-11-30 00:05:56.748871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.392 [2024-11-30 00:05:56.748918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.392 [2024-11-30 00:05:56.748931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.392 #46 NEW cov: 11784 ft: 14888 corp: 44/83b lim: 5 exec/s: 23 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:31.392 #46 DONE cov: 11784 ft: 14888 corp: 44/83b lim: 5 exec/s: 23 rss: 70Mb 00:07:31.392 Done 46 runs in 2 second(s) 00:07:31.392 00:05:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:31.392 00:05:56 -- ../common.sh@72 -- # (( i++ )) 00:07:31.392 00:05:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:31.392 00:05:56 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 
00:07:31.392 00:05:56 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:31.392 00:05:56 -- nvmf/run.sh@24 -- # local timen=1 00:07:31.392 00:05:56 -- nvmf/run.sh@25 -- # local core=0x1 00:07:31.392 00:05:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:31.392 00:05:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:31.392 00:05:56 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:31.392 00:05:56 -- nvmf/run.sh@29 -- # port=4409 00:07:31.392 00:05:56 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:31.392 00:05:56 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:31.392 00:05:56 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:31.392 00:05:56 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:31.392 [2024-11-30 00:05:56.941409] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:31.392 [2024-11-30 00:05:56.941480] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2724455 ] 00:07:31.651 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.651 [2024-11-30 00:05:57.194066] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.910 [2024-11-30 00:05:57.279619] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:31.910 [2024-11-30 00:05:57.279800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.910 [2024-11-30 00:05:57.338168] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:31.910 [2024-11-30 00:05:57.354554] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:31.910 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:31.910 INFO: Seed: 2597269335 00:07:31.910 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:31.910 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:31.910 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:31.910 INFO: A corpus is not provided, starting from an empty corpus 00:07:31.910 [2024-11-30 00:05:57.399156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.910 [2024-11-30 00:05:57.399192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.910 #2 INITED cov: 11549 ft: 11558 corp: 1/1b exec/s: 0 rss: 67Mb 00:07:31.910 [2024-11-30 00:05:57.449083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:31.910 [2024-11-30 00:05:57.449115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.428 NEW_FUNC[1/1]: 0x1277658 in nvmf_transport_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:723 00:07:32.428 #3 NEW cov: 11670 ft: 12284 corp: 2/2b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ChangeByte- 00:07:32.428 [2024-11-30 00:05:57.770088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.428 [2024-11-30 00:05:57.770126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.428 [2024-11-30 00:05:57.770158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.428 [2024-11-30 00:05:57.770174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.428 #4 NEW cov: 11676 ft: 13205 corp: 3/4b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:32.428 [2024-11-30 00:05:57.830077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.428 [2024-11-30 00:05:57.830108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.428 #5 NEW cov: 11761 ft: 13512 corp: 4/5b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 CrossOver- 00:07:32.428 [2024-11-30 00:05:57.900248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.428 [2024-11-30 00:05:57.900278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.428 #6 NEW cov: 11761 ft: 13550 corp: 5/6b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeBit- 00:07:32.428 [2024-11-30 00:05:57.950315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.428 [2024-11-30 00:05:57.950345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.688 #7 NEW cov: 11761 ft: 13622 corp: 6/7b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeBit- 00:07:32.688 [2024-11-30 00:05:58.010577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:32.688 [2024-11-30 00:05:58.010613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.688 [2024-11-30 00:05:58.010646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.010661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.688 #8 NEW cov: 11761 ft: 13652 corp: 7/9b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 InsertByte- 00:07:32.688 [2024-11-30 00:05:58.070739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.070770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.688 [2024-11-30 00:05:58.070802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.070818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.688 #9 NEW cov: 11761 ft: 13781 corp: 8/11b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 ChangeBit- 00:07:32.688 [2024-11-30 00:05:58.141076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.141107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.688 [2024-11-30 00:05:58.141140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.141156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.688 [2024-11-30 00:05:58.141184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.141199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.688 [2024-11-30 00:05:58.141228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.141243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.688 #10 NEW cov: 11761 ft: 14148 corp: 9/15b lim: 5 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 CopyPart- 00:07:32.688 [2024-11-30 00:05:58.201080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.201111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.688 [2024-11-30 00:05:58.201144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.688 [2024-11-30 00:05:58.201160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.947 #11 NEW cov: 11761 ft: 14239 corp: 10/17b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 CopyPart- 
00:07:32.947 [2024-11-30 00:05:58.271220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.947 [2024-11-30 00:05:58.271251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.947 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:32.947 #12 NEW cov: 11778 ft: 14290 corp: 11/18b lim: 5 exec/s: 0 rss: 69Mb L: 1/4 MS: 1 ChangeByte- 00:07:32.947 [2024-11-30 00:05:58.321323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.947 [2024-11-30 00:05:58.321353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.947 #13 NEW cov: 11778 ft: 14335 corp: 12/19b lim: 5 exec/s: 0 rss: 69Mb L: 1/4 MS: 1 CopyPart- 00:07:32.947 [2024-11-30 00:05:58.371477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.947 [2024-11-30 00:05:58.371508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.947 [2024-11-30 00:05:58.371540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.947 [2024-11-30 00:05:58.371555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.947 #14 NEW cov: 11778 ft: 14436 corp: 13/21b lim: 5 exec/s: 14 rss: 69Mb L: 2/4 MS: 1 CopyPart- 00:07:32.947 [2024-11-30 00:05:58.441711] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.947 [2024-11-30 00:05:58.441742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.947 [2024-11-30 00:05:58.441774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.947 [2024-11-30 00:05:58.441789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.947 #15 NEW cov: 11778 ft: 14440 corp: 14/23b lim: 5 exec/s: 15 rss: 69Mb L: 2/4 MS: 1 ChangeByte- 00:07:32.947 [2024-11-30 00:05:58.491836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.947 [2024-11-30 00:05:58.491867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.948 [2024-11-30 00:05:58.491900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.948 [2024-11-30 00:05:58.491916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.207 #16 NEW cov: 11778 ft: 14517 corp: 15/25b lim: 5 exec/s: 16 rss: 69Mb L: 2/4 MS: 1 ChangeBit- 00:07:33.207 [2024-11-30 00:05:58.561955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.207 [2024-11-30 00:05:58.561984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.207 #17 NEW cov: 11778 ft: 14531 corp: 16/26b lim: 5 exec/s: 17 rss: 69Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:33.207 [2024-11-30 00:05:58.622252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.207 [2024-11-30 00:05:58.622283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.207 [2024-11-30 00:05:58.622314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.207 [2024-11-30 00:05:58.622329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.207 [2024-11-30 00:05:58.622356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.207 [2024-11-30 00:05:58.622370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.207 #18 NEW cov: 11778 ft: 14746 corp: 17/29b lim: 5 exec/s: 18 rss: 69Mb L: 3/4 MS: 1 InsertByte- 00:07:33.207 [2024-11-30 00:05:58.672210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.207 [2024-11-30 00:05:58.672241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.207 #19 NEW cov: 11778 ft: 14819 corp: 18/30b lim: 5 exec/s: 19 rss: 69Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:33.207 [2024-11-30 00:05:58.732413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.207 [2024-11-30 00:05:58.732444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.467 #20 NEW cov: 11778 ft: 14825 corp: 19/31b lim: 5 exec/s: 20 rss: 69Mb L: 1/4 MS: 1 EraseBytes- 00:07:33.467 [2024-11-30 00:05:58.792893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.792924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.467 [2024-11-30 00:05:58.792957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.792973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.467 [2024-11-30 00:05:58.793001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.793016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.467 [2024-11-30 00:05:58.793044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.793060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.467 [2024-11-30 00:05:58.793093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.793108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.467 #21 NEW cov: 11778 ft: 14921 corp: 20/36b lim: 5 exec/s: 21 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:33.467 [2024-11-30 00:05:58.862766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.862796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.467 #22 NEW cov: 11778 ft: 14936 corp: 21/37b lim: 5 exec/s: 22 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:33.467 [2024-11-30 00:05:58.923173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.923203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.467 [2024-11-30 00:05:58.923234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.923265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.467 [2024-11-30 00:05:58.923293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.923308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:33.467 [2024-11-30 00:05:58.923336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.923352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.467 [2024-11-30 00:05:58.923380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.923395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.467 #23 NEW cov: 11778 ft: 14949 corp: 22/42b lim: 5 exec/s: 23 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:33.467 [2024-11-30 00:05:58.993162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.467 [2024-11-30 00:05:58.993195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.727 #24 NEW cov: 11778 ft: 14962 corp: 23/43b lim: 5 exec/s: 24 rss: 70Mb L: 1/5 MS: 1 CrossOver- 00:07:33.727 [2024-11-30 00:05:59.053269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.727 [2024-11-30 00:05:59.053300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.728 #25 NEW cov: 11778 ft: 14977 corp: 24/44b lim: 5 exec/s: 25 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:33.728 [2024-11-30 00:05:59.113721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 
nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.113762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.113794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.113812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.113840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.113854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.113880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.113895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.113921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.113936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.728 #26 NEW cov: 11778 ft: 14992 corp: 25/49b lim: 5 exec/s: 26 rss: 70Mb L: 5/5 MS: 1 CMP- DE: "\006\000\000\000"- 00:07:33.728 [2024-11-30 00:05:59.183706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.183738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.183770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.183786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.728 #27 NEW cov: 11778 ft: 15072 corp: 26/51b lim: 5 exec/s: 27 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:33.728 [2024-11-30 00:05:59.233958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.233987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.234018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.234033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.234060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.234075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.234102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.234116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.728 [2024-11-30 00:05:59.234143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.728 [2024-11-30 00:05:59.234157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.728 #28 NEW cov: 11778 ft: 15097 corp: 27/56b lim: 5 exec/s: 28 rss: 70Mb L: 5/5 MS: 1 PersAutoDict- DE: "\006\000\000\000"- 00:07:33.988 [2024-11-30 00:05:59.284160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.284192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.988 [2024-11-30 00:05:59.284226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.284242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.988 [2024-11-30 00:05:59.284271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.284287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.988 [2024-11-30 00:05:59.284315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.284331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.988 [2024-11-30 00:05:59.284360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.284375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.988 #29 NEW cov: 11785 ft: 15130 corp: 28/61b lim: 5 exec/s: 29 rss: 70Mb L: 5/5 MS: 1 PersAutoDict- DE: "\006\000\000\000"- 00:07:33.988 [2024-11-30 00:05:59.354344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.354374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.988 [2024-11-30 00:05:59.354405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.354420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.988 [2024-11-30 00:05:59.354446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.354461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.988 [2024-11-30 00:05:59.354488] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.354502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.988 [2024-11-30 00:05:59.354528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.988 [2024-11-30 00:05:59.354543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.988 #30 NEW cov: 11785 ft: 15168 corp: 29/66b lim: 5 exec/s: 15 rss: 70Mb L: 5/5 MS: 1 ChangeByte- 00:07:33.988 #30 DONE cov: 11785 ft: 15168 corp: 29/66b lim: 5 exec/s: 15 rss: 70Mb 00:07:33.988 ###### Recommended dictionary. ###### 00:07:33.988 "\006\000\000\000" # Uses: 2 00:07:33.988 ###### End of recommended dictionary. 
###### 00:07:33.988 Done 30 runs in 2 second(s) 00:07:33.988 00:05:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:33.988 00:05:59 -- ../common.sh@72 -- # (( i++ )) 00:07:33.988 00:05:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:33.988 00:05:59 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:33.988 00:05:59 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:33.988 00:05:59 -- nvmf/run.sh@24 -- # local timen=1 00:07:33.988 00:05:59 -- nvmf/run.sh@25 -- # local core=0x1 00:07:33.988 00:05:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:33.988 00:05:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:33.988 00:05:59 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:33.988 00:05:59 -- nvmf/run.sh@29 -- # port=4410 00:07:33.988 00:05:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:33.988 00:05:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:33.988 00:05:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.247 00:05:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:34.247 [2024-11-30 00:05:59.569349] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:34.247 [2024-11-30 00:05:59.569416] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2724999 ] 00:07:34.247 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.506 [2024-11-30 00:05:59.818183] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.506 [2024-11-30 00:05:59.909356] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:34.506 [2024-11-30 00:05:59.909492] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.506 [2024-11-30 00:05:59.967713] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:34.506 [2024-11-30 00:05:59.984066] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:34.506 INFO: Running with entropic power schedule (0xFF, 100). 00:07:34.506 INFO: Seed: 932281931 00:07:34.506 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:34.506 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:34.506 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:34.506 INFO: A corpus is not provided, starting from an empty corpus 00:07:34.507 #2 INITED exec/s: 0 rss: 60Mb 00:07:34.507 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:34.507 This may also happen if the target rejected all inputs we tried so far 00:07:34.507 [2024-11-30 00:06:00.029442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.507 [2024-11-30 00:06:00.029475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.507 [2024-11-30 00:06:00.029535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.507 [2024-11-30 00:06:00.029549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.507 [2024-11-30 00:06:00.029612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00002a01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.507 [2024-11-30 00:06:00.029626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.075 NEW_FUNC[1/670]: 0x447688 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:35.075 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:35.075 #7 NEW cov: 11576 ft: 11577 corp: 2/26b lim: 40 exec/s: 0 rss: 68Mb L: 25/25 MS: 5 ChangeBit-ShuffleBytes-CMP-ShuffleBytes-InsertRepeatedBytes- DE: "\001@"- 00:07:35.075 [2024-11-30 00:06:00.361931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.361989] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.362138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.362165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.362319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.362344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.362496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.362523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.362675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.362700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.075 #8 NEW cov: 11693 ft: 12786 corp: 3/66b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:35.075 [2024-11-30 00:06:00.411865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.411894] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.412015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.412035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.412169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.412186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.412317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72727272 cdw11:72727200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.412335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.412462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.412482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.075 #9 NEW cov: 11699 ft: 13012 corp: 4/106b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:07:35.075 [2024-11-30 00:06:00.462034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.462062] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.462189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.462208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.462337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.462354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.462479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.462497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.462627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.462644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.075 #15 NEW cov: 11784 ft: 13297 corp: 5/146b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:07:35.075 [2024-11-30 00:06:00.502212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.502240] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.502359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.502376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.502502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.502518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.502641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.502659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.075 [2024-11-30 00:06:00.502781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0172 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.502798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.075 #16 NEW cov: 11784 ft: 13363 corp: 6/186b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CopyPart- 00:07:35.075 [2024-11-30 00:06:00.542301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.075 [2024-11-30 00:06:00.542329] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.076 [2024-11-30 00:06:00.542458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.076 [2024-11-30 00:06:00.542480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.076 [2024-11-30 00:06:00.542584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.076 [2024-11-30 00:06:00.542603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.076 [2024-11-30 00:06:00.542731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.076 [2024-11-30 00:06:00.542748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.076 [2024-11-30 00:06:00.542882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.076 [2024-11-30 00:06:00.542899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.076 #17 NEW cov: 11784 ft: 13446 corp: 7/226b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 ChangeBit- 00:07:35.076 [2024-11-30 00:06:00.592187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.076 [2024-11-30 00:06:00.592215] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.076 [2024-11-30 00:06:00.592346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.076 [2024-11-30 00:06:00.592363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.076 [2024-11-30 00:06:00.592497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.076 [2024-11-30 00:06:00.592514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.076 [2024-11-30 00:06:00.592639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.076 [2024-11-30 00:06:00.592658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.076 #18 NEW cov: 11784 ft: 13533 corp: 8/258b lim: 40 exec/s: 0 rss: 68Mb L: 32/40 MS: 1 CopyPart- 00:07:35.335 [2024-11-30 00:06:00.632577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.632609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.632731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:72720000 cdw11:00007272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.632750] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.632881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.632899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.633029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.633050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.633180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.633197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.335 #19 NEW cov: 11784 ft: 13558 corp: 9/298b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:07:35.335 [2024-11-30 00:06:00.671980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.672007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.672138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.672157] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.335 #20 NEW cov: 11784 ft: 13806 corp: 10/320b lim: 40 exec/s: 0 rss: 68Mb L: 22/40 MS: 1 EraseBytes- 00:07:35.335 [2024-11-30 00:06:00.722630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.722658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.722797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.722814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.722945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.722962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.723091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.723108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.335 #21 NEW cov: 11784 ft: 13883 corp: 11/356b lim: 40 exec/s: 0 rss: 68Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:07:35.335 [2024-11-30 00:06:00.762301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.762328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.762455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.762472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.335 #22 NEW cov: 11784 ft: 13899 corp: 12/379b lim: 40 exec/s: 0 rss: 68Mb L: 23/40 MS: 1 EraseBytes- 00:07:35.335 [2024-11-30 00:06:00.803031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.335 [2024-11-30 00:06:00.803058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.335 [2024-11-30 00:06:00.803194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.336 [2024-11-30 00:06:00.803212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.336 [2024-11-30 00:06:00.803341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.336 [2024-11-30 00:06:00.803360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.336 [2024-11-30 00:06:00.803485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 
cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.336 [2024-11-30 00:06:00.803501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.336 [2024-11-30 00:06:00.803629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:402a0172 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.336 [2024-11-30 00:06:00.803648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.336 #23 NEW cov: 11784 ft: 13930 corp: 13/419b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:07:35.336 [2024-11-30 00:06:00.852799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.336 [2024-11-30 00:06:00.852826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.336 [2024-11-30 00:06:00.852957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.336 [2024-11-30 00:06:00.852974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.336 [2024-11-30 00:06:00.853102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.336 [2024-11-30 00:06:00.853120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.336 #24 NEW cov: 11784 ft: 13956 corp: 14/449b lim: 40 exec/s: 0 rss: 69Mb L: 30/40 MS: 1 EraseBytes- 00:07:35.595 [2024-11-30 00:06:00.892761] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a007272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.892788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.595 [2024-11-30 00:06:00.892921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.892939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.595 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:35.595 #25 NEW cov: 11807 ft: 14028 corp: 15/471b lim: 40 exec/s: 0 rss: 69Mb L: 22/40 MS: 1 CrossOver- 00:07:35.595 [2024-11-30 00:06:00.933612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:019446fa cdw11:ba7dd566 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.933640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.595 [2024-11-30 00:06:00.933758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.933779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.595 [2024-11-30 00:06:00.933914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.933932] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.595 [2024-11-30 00:06:00.934067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.934083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.595 [2024-11-30 00:06:00.934210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0172 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.934228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.595 #26 NEW cov: 11807 ft: 14076 corp: 16/511b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 CMP- DE: "\001\224F\372\272}\325f"- 00:07:35.595 [2024-11-30 00:06:00.973344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.973371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.595 [2024-11-30 00:06:00.973496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.973514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.595 [2024-11-30 00:06:00.973649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72720000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:00.973667] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.595 #27 NEW cov: 11807 ft: 14116 corp: 17/542b lim: 40 exec/s: 0 rss: 69Mb L: 31/40 MS: 1 CopyPart- 00:07:35.595 [2024-11-30 00:06:01.013826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.595 [2024-11-30 00:06:01.013852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.013983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:72720000 cdw11:00007272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.014001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.014127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.014144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.014271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72727272 cdw11:72000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.014290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.014426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.014454] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.596 #28 NEW cov: 11807 ft: 14199 corp: 18/582b lim: 40 exec/s: 28 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:35.596 [2024-11-30 00:06:01.063571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.063602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.063737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00720072 cdw11:72720072 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.063754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.063881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.063900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.064024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.064040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.064166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.064182] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.596 #29 NEW cov: 11807 ft: 14235 corp: 19/622b lim: 40 exec/s: 29 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:35.596 [2024-11-30 00:06:01.104117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.104145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.104276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.104297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.104413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.104431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.104554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.104573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.104696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.104713] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.596 #30 NEW cov: 11807 ft: 14258 corp: 20/662b lim: 40 exec/s: 30 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:07:35.596 [2024-11-30 00:06:01.143535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.143565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.596 [2024-11-30 00:06:01.143708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.596 [2024-11-30 00:06:01.143726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.855 #31 NEW cov: 11807 ft: 14285 corp: 21/685b lim: 40 exec/s: 31 rss: 69Mb L: 23/40 MS: 1 EraseBytes- 00:07:35.855 [2024-11-30 00:06:01.184333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.184363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.184491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.184509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.184642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 
cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.184659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.184794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720047 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.184811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.184943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.184960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.855 #32 NEW cov: 11807 ft: 14298 corp: 22/725b lim: 40 exec/s: 32 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:07:35.855 [2024-11-30 00:06:01.234035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.234062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.234202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.234219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.234341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 
cdw11:00007200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.234358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.855 #33 NEW cov: 11807 ft: 14308 corp: 23/755b lim: 40 exec/s: 33 rss: 69Mb L: 30/40 MS: 1 ShuffleBytes- 00:07:35.855 [2024-11-30 00:06:01.274109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.274136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.274262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.274279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.274398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00002a01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.274415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.855 #34 NEW cov: 11807 ft: 14322 corp: 24/780b lim: 40 exec/s: 34 rss: 69Mb L: 25/40 MS: 1 ChangeBit- 00:07:35.855 [2024-11-30 00:06:01.314634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.314661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.314793] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:72720000 cdw11:00007272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.314811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.314936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:28000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.314953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.315072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72727272 cdw11:72000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.315090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.315221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.315239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:35.855 #35 NEW cov: 11807 ft: 14337 corp: 25/820b lim: 40 exec/s: 35 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:35.855 [2024-11-30 00:06:01.364201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0000faff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.364228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.855 [2024-11-30 00:06:01.364352] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.855 [2024-11-30 00:06:01.364371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.855 #36 NEW cov: 11807 ft: 14343 corp: 26/843b lim: 40 exec/s: 36 rss: 69Mb L: 23/40 MS: 1 ChangeBinInt- 00:07:36.113 [2024-11-30 00:06:01.414748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00008800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.414785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.414910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.414928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.415085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.415103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.415232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.415250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.113 #37 NEW cov: 11807 ft: 14354 corp: 
27/875b lim: 40 exec/s: 37 rss: 69Mb L: 32/40 MS: 1 ChangeByte- 00:07:36.113 [2024-11-30 00:06:01.455164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.455191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.455303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.455321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.455455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.455474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.455603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.455619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.455749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.455767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.113 #38 NEW cov: 11807 ft: 14364 corp: 28/915b 
lim: 40 exec/s: 38 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:07:36.113 [2024-11-30 00:06:01.494854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.494882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.495012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.495031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.495152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:7272722a cdw11:01400140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.495169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.113 #39 NEW cov: 11807 ft: 14370 corp: 29/939b lim: 40 exec/s: 39 rss: 69Mb L: 24/40 MS: 1 PersAutoDict- DE: "\001@"- 00:07:36.113 [2024-11-30 00:06:01.545395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.545422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.545553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.545572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.545704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.545721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.545844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720047 cdw11:00003f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.545862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.545987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.546006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.113 #40 NEW cov: 11807 ft: 14383 corp: 30/979b lim: 40 exec/s: 40 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:07:36.113 [2024-11-30 00:06:01.594704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.594732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.113 #41 NEW cov: 11807 ft: 14716 corp: 31/993b lim: 40 exec/s: 41 rss: 70Mb L: 14/40 MS: 1 EraseBytes- 00:07:36.113 [2024-11-30 00:06:01.635695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.635721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.113 [2024-11-30 00:06:01.635849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.113 [2024-11-30 00:06:01.635867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.114 [2024-11-30 00:06:01.635992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72720140 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.114 [2024-11-30 00:06:01.636010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.114 [2024-11-30 00:06:01.636144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720047 cdw11:00003f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.114 [2024-11-30 00:06:01.636162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.114 [2024-11-30 00:06:01.636290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.114 [2024-11-30 00:06:01.636308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.114 #42 NEW cov: 11807 ft: 14730 corp: 32/1033b lim: 40 exec/s: 42 rss: 70Mb L: 40/40 MS: 1 PersAutoDict- DE: "\001@"- 00:07:36.372 [2024-11-30 00:06:01.685429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00727272 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.685459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.685583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:72720000 cdw11:00007272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.685621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.685750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.685767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.372 #43 NEW cov: 11807 ft: 14754 corp: 33/1064b lim: 40 exec/s: 43 rss: 70Mb L: 31/40 MS: 1 CrossOver- 00:07:36.372 [2024-11-30 00:06:01.735560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:9446faba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.735589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.735727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:7dd56672 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.735745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.735868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:00007200 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.735885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.372 #44 NEW cov: 11807 ft: 14796 corp: 34/1094b lim: 40 exec/s: 44 rss: 70Mb L: 30/40 MS: 1 PersAutoDict- DE: "\001\224F\372\272}\325f"- 00:07:36.372 [2024-11-30 00:06:01.786176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.786204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.786333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.786350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.786474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72007272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.786491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.786618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72727247 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.786637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.786770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 
cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.786788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.372 #45 NEW cov: 11807 ft: 14834 corp: 35/1134b lim: 40 exec/s: 45 rss: 70Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:36.372 [2024-11-30 00:06:01.826063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.826095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.826220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.826239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.826369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.826386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.826516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.826533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.372 #46 NEW cov: 11807 ft: 14838 corp: 36/1169b lim: 40 exec/s: 46 rss: 70Mb L: 35/40 MS: 1 CrossOver- 00:07:36.372 [2024-11-30 00:06:01.866414] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.866441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.866563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.866580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.866713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.866730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.866856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:79720000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.866874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.867005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.867024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.372 #47 NEW cov: 11807 ft: 14845 corp: 37/1209b lim: 40 exec/s: 47 rss: 70Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:36.372 [2024-11-30 00:06:01.906628] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.372 [2024-11-30 00:06:01.906656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.372 [2024-11-30 00:06:01.906797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.373 [2024-11-30 00:06:01.906816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.373 [2024-11-30 00:06:01.906896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.373 [2024-11-30 00:06:01.906917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.373 [2024-11-30 00:06:01.907043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:72728047 cdw11:00003f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.373 [2024-11-30 00:06:01.907060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.373 [2024-11-30 00:06:01.907193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:002a0140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.373 [2024-11-30 00:06:01.907210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.632 #48 NEW cov: 11807 ft: 14880 corp: 38/1249b lim: 40 exec/s: 48 rss: 70Mb L: 40/40 MS: 1 ChangeBit- 00:07:36.632 [2024-11-30 00:06:01.946309] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000072 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.632 [2024-11-30 00:06:01.946338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.632 [2024-11-30 00:06:01.946474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.632 [2024-11-30 00:06:01.946492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.632 [2024-11-30 00:06:01.946621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72720000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.632 [2024-11-30 00:06:01.946639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.632 #49 NEW cov: 11807 ft: 14906 corp: 39/1280b lim: 40 exec/s: 49 rss: 70Mb L: 31/40 MS: 1 CopyPart- 00:07:36.632 [2024-11-30 00:06:01.985953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.632 [2024-11-30 00:06:01.985981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.632 [2024-11-30 00:06:02.025996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.632 [2024-11-30 00:06:02.026023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.632 #51 NEW cov: 11807 ft: 14955 corp: 
40/1294b lim: 40 exec/s: 25 rss: 70Mb L: 14/40 MS: 2 CopyPart-CopyPart- 00:07:36.632 #51 DONE cov: 11807 ft: 14955 corp: 40/1294b lim: 40 exec/s: 25 rss: 70Mb 00:07:36.632 ###### Recommended dictionary. ###### 00:07:36.632 "\001@" # Uses: 4 00:07:36.632 "\001\224F\372\272}\325f" # Uses: 1 00:07:36.632 ###### End of recommended dictionary. ###### 00:07:36.632 Done 51 runs in 2 second(s) 00:07:36.632 00:06:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:36.632 00:06:02 -- ../common.sh@72 -- # (( i++ )) 00:07:36.632 00:06:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:36.632 00:06:02 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:36.632 00:06:02 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:36.632 00:06:02 -- nvmf/run.sh@24 -- # local timen=1 00:07:36.632 00:06:02 -- nvmf/run.sh@25 -- # local core=0x1 00:07:36.632 00:06:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:36.632 00:06:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:36.632 00:06:02 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:36.632 00:06:02 -- nvmf/run.sh@29 -- # port=4411 00:07:36.632 00:06:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:36.632 00:06:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:36.632 00:06:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:36.891 00:06:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:36.891 [2024-11-30 00:06:02.215576] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:36.891 [2024-11-30 00:06:02.215684] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2725643 ] 00:07:36.891 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.150 [2024-11-30 00:06:02.465355] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.150 [2024-11-30 00:06:02.557959] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.151 [2024-11-30 00:06:02.558095] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.151 [2024-11-30 00:06:02.616267] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.151 [2024-11-30 00:06:02.632654] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:37.151 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.151 INFO: Seed: 3578271274 00:07:37.151 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:37.151 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:37.151 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:37.151 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.151 #2 INITED exec/s: 0 rss: 60Mb 00:07:37.151 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:37.151 This may also happen if the target rejected all inputs we tried so far 00:07:37.151 [2024-11-30 00:06:02.702527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.151 [2024-11-30 00:06:02.702565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.669 NEW_FUNC[1/671]: 0x4493f8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:37.669 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:37.669 #5 NEW cov: 11592 ft: 11593 corp: 2/11b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 3 ChangeByte-InsertByte-CMP- DE: "o\000\000\000\000\000\000\000"- 00:07:37.669 [2024-11-30 00:06:03.022590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02a50000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.669 [2024-11-30 00:06:03.022632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.669 #8 NEW cov: 11705 ft: 12345 corp: 3/24b lim: 40 exec/s: 0 rss: 68Mb L: 13/13 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:37.669 [2024-11-30 00:06:03.062689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.670 [2024-11-30 00:06:03.062718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.670 #14 NEW cov: 11711 ft: 12568 corp: 4/35b lim: 40 exec/s: 0 rss: 68Mb L: 11/13 MS: 1 CrossOver- 00:07:37.670 [2024-11-30 00:06:03.102796] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2f7a6f0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.670 [2024-11-30 00:06:03.102839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.670 #15 NEW cov: 11796 ft: 12825 corp: 5/47b lim: 40 exec/s: 0 rss: 68Mb L: 12/13 MS: 1 InsertByte- 00:07:37.670 [2024-11-30 00:06:03.142915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.670 [2024-11-30 00:06:03.142942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.670 #19 NEW cov: 11796 ft: 12910 corp: 6/58b lim: 40 exec/s: 0 rss: 68Mb L: 11/13 MS: 4 ChangeByte-CopyPart-CopyPart-CMP- DE: "\000\000\000\000\000\000\004\000"- 00:07:37.670 [2024-11-30 00:06:03.183036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2f7a6f0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.670 [2024-11-30 00:06:03.183063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.670 #20 NEW cov: 11796 ft: 13005 corp: 7/70b lim: 40 exec/s: 0 rss: 68Mb L: 12/13 MS: 1 CopyPart- 00:07:37.670 [2024-11-30 00:06:03.223163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2f7a6f0a cdw11:00510000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.670 [2024-11-30 00:06:03.223188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.928 #21 NEW cov: 11796 ft: 13059 corp: 8/83b lim: 40 exec/s: 0 rss: 68Mb L: 13/13 MS: 1 InsertByte- 00:07:37.928 [2024-11-30 00:06:03.263209] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02a50000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.928 [2024-11-30 00:06:03.263239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.928 #22 NEW cov: 11796 ft: 13100 corp: 9/96b lim: 40 exec/s: 0 rss: 68Mb L: 13/13 MS: 1 ChangeByte- 00:07:37.928 [2024-11-30 00:06:03.303666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.928 [2024-11-30 00:06:03.303695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.928 [2024-11-30 00:06:03.303832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff94 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.928 [2024-11-30 00:06:03.303851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.928 #23 NEW cov: 11796 ft: 13890 corp: 10/114b lim: 40 exec/s: 0 rss: 69Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:07:37.928 [2024-11-30 00:06:03.353564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.928 [2024-11-30 00:06:03.353591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.928 #26 NEW cov: 11796 ft: 13947 corp: 11/123b lim: 40 exec/s: 0 rss: 69Mb L: 9/18 MS: 3 CrossOver-CrossOver-CrossOver- 00:07:37.928 [2024-11-30 00:06:03.393669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000040 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:37.928 [2024-11-30 00:06:03.393696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.928 #27 NEW cov: 11796 ft: 13968 corp: 12/134b lim: 40 exec/s: 0 rss: 69Mb L: 11/18 MS: 1 ChangeBit- 00:07:37.928 [2024-11-30 00:06:03.434113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.928 [2024-11-30 00:06:03.434140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.928 [2024-11-30 00:06:03.434250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.928 [2024-11-30 00:06:03.434267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.928 #28 NEW cov: 11796 ft: 14010 corp: 13/152b lim: 40 exec/s: 0 rss: 69Mb L: 18/18 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:07:38.187 [2024-11-30 00:06:03.484213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.484241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.187 [2024-11-30 00:06:03.484347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:006f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.484364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.187 #29 NEW cov: 11796 ft: 14034 corp: 14/170b lim: 40 exec/s: 0 rss: 69Mb L: 
18/18 MS: 1 PersAutoDict- DE: "o\000\000\000\000\000\000\000"- 00:07:38.187 [2024-11-30 00:06:03.524284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02a502a5 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.524312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.187 [2024-11-30 00:06:03.524420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.524438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.187 #30 NEW cov: 11796 ft: 14041 corp: 15/193b lim: 40 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 CrossOver- 00:07:38.187 [2024-11-30 00:06:03.564186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02a50000 cdw11:b6000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.564215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.187 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.187 #31 NEW cov: 11819 ft: 14146 corp: 16/206b lim: 40 exec/s: 0 rss: 69Mb L: 13/23 MS: 1 ChangeByte- 00:07:38.187 [2024-11-30 00:06:03.614095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.614123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.187 #32 NEW cov: 11819 ft: 14173 corp: 17/214b lim: 40 exec/s: 0 rss: 69Mb L: 8/23 MS: 1 
EraseBytes- 00:07:38.187 [2024-11-30 00:06:03.654757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.654784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.187 [2024-11-30 00:06:03.654881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:006f0000 cdw11:00270000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.654901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.187 #33 NEW cov: 11819 ft: 14201 corp: 18/232b lim: 40 exec/s: 33 rss: 69Mb L: 18/23 MS: 1 ChangeByte- 00:07:38.187 [2024-11-30 00:06:03.704865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.704909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.187 [2024-11-30 00:06:03.705022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:006f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.187 [2024-11-30 00:06:03.705040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.187 #34 NEW cov: 11819 ft: 14215 corp: 19/250b lim: 40 exec/s: 34 rss: 69Mb L: 18/23 MS: 1 PersAutoDict- DE: "o\000\000\000\000\000\000\000"- 00:07:38.447 [2024-11-30 00:06:03.744832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2f7a6f0a cdw11:00510000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 
00:06:03.744861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.447 [2024-11-30 00:06:03.744973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:a16f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.744990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.447 #35 NEW cov: 11819 ft: 14243 corp: 20/271b lim: 40 exec/s: 35 rss: 69Mb L: 21/23 MS: 1 PersAutoDict- DE: "o\000\000\000\000\000\000\000"- 00:07:38.447 [2024-11-30 00:06:03.794460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0a00 cdw11:00000010 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.794486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.447 #36 NEW cov: 11819 ft: 14263 corp: 21/279b lim: 40 exec/s: 36 rss: 69Mb L: 8/23 MS: 1 ChangeBit- 00:07:38.447 [2024-11-30 00:06:03.835284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.835311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.447 [2024-11-30 00:06:03.835418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:006f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.835436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.447 [2024-11-30 00:06:03.835566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND 
(81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000000a1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.835581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.447 #37 NEW cov: 11819 ft: 14551 corp: 22/303b lim: 40 exec/s: 37 rss: 69Mb L: 24/24 MS: 1 CopyPart- 00:07:38.447 [2024-11-30 00:06:03.885168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0000 cdw11:11000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.885196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.447 #38 NEW cov: 11819 ft: 14618 corp: 23/313b lim: 40 exec/s: 38 rss: 69Mb L: 10/24 MS: 1 ChangeByte- 00:07:38.447 [2024-11-30 00:06:03.925252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a2c6f0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.925280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.447 #39 NEW cov: 11819 ft: 14628 corp: 24/322b lim: 40 exec/s: 39 rss: 69Mb L: 9/24 MS: 1 InsertByte- 00:07:38.447 [2024-11-30 00:06:03.966133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.966165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.447 [2024-11-30 00:06:03.966254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff94 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.966271] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.447 [2024-11-30 00:06:03.966395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:94000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.966412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.447 [2024-11-30 00:06:03.966544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.447 [2024-11-30 00:06:03.966559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.447 #40 NEW cov: 11819 ft: 14920 corp: 25/361b lim: 40 exec/s: 40 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:38.706 [2024-11-30 00:06:04.005742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0000 cdw11:1100006f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.005769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.706 [2024-11-30 00:06:04.005877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00001100 cdw11:000000a1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.005894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.706 #41 NEW cov: 11819 ft: 14976 corp: 26/377b lim: 40 exec/s: 41 rss: 69Mb L: 16/39 MS: 1 CopyPart- 00:07:38.706 [2024-11-30 00:06:04.055900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:a5000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:38.706 [2024-11-30 00:06:04.055927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.706 [2024-11-30 00:06:04.056019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000002a5 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.056036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.706 #42 NEW cov: 11819 ft: 14995 corp: 27/400b lim: 40 exec/s: 42 rss: 70Mb L: 23/39 MS: 1 CopyPart- 00:07:38.706 [2024-11-30 00:06:04.095747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.095774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.706 #43 NEW cov: 11819 ft: 15008 corp: 28/408b lim: 40 exec/s: 43 rss: 70Mb L: 8/39 MS: 1 ShuffleBytes- 00:07:38.706 [2024-11-30 00:06:04.135921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2f7a7a6f cdw11:0a006f0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.135948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.706 #47 NEW cov: 11819 ft: 15016 corp: 29/421b lim: 40 exec/s: 47 rss: 70Mb L: 13/39 MS: 4 EraseBytes-ShuffleBytes-EraseBytes-CrossOver- 00:07:38.706 [2024-11-30 00:06:04.176307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:a5000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.176337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:38.706 [2024-11-30 00:06:04.176423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000002a5 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.176440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.706 #48 NEW cov: 11819 ft: 15049 corp: 30/444b lim: 40 exec/s: 48 rss: 70Mb L: 23/39 MS: 1 CopyPart- 00:07:38.706 [2024-11-30 00:06:04.216926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.216952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.706 [2024-11-30 00:06:04.217054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff94 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.217071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.706 [2024-11-30 00:06:04.217189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:94000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.217205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.706 [2024-11-30 00:06:04.217317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.706 [2024-11-30 00:06:04.217334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:38.706 #49 NEW cov: 11819 ft: 15055 corp: 31/483b lim: 40 exec/s: 49 rss: 70Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:38.966 [2024-11-30 00:06:04.266834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.266863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.966 [2024-11-30 00:06:04.266962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:006f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.266980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.966 [2024-11-30 00:06:04.267087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:7a6f0a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.267103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.966 #50 NEW cov: 11819 ft: 15061 corp: 32/508b lim: 40 exec/s: 50 rss: 70Mb L: 25/39 MS: 1 CrossOver- 00:07:38.966 [2024-11-30 00:06:04.306959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2f7a7a6f cdw11:0a006f0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.306988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.966 [2024-11-30 00:06:04.307079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:9d9d9d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.307097] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.966 [2024-11-30 00:06:04.307212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:9d9d9d9d cdw11:9da10000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.307234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.966 #51 NEW cov: 11819 ft: 15070 corp: 33/534b lim: 40 exec/s: 51 rss: 70Mb L: 26/39 MS: 1 InsertRepeatedBytes- 00:07:38.966 [2024-11-30 00:06:04.356741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.356767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.966 [2024-11-30 00:06:04.356873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.356891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.966 #52 NEW cov: 11819 ft: 15101 corp: 34/552b lim: 40 exec/s: 52 rss: 70Mb L: 18/39 MS: 1 ShuffleBytes- 00:07:38.966 [2024-11-30 00:06:04.396667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a6f0a00 cdw11:00000026 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.396694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.966 #53 NEW cov: 11819 ft: 15162 corp: 35/563b lim: 40 exec/s: 53 rss: 70Mb L: 11/39 MS: 1 ChangeByte- 00:07:38.966 [2024-11-30 00:06:04.436820] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00960000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.436847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.966 #54 NEW cov: 11819 ft: 15173 corp: 36/573b lim: 40 exec/s: 54 rss: 70Mb L: 10/39 MS: 1 InsertByte- 00:07:38.966 [2024-11-30 00:06:04.476901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00001100 cdw11:000000a1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.966 [2024-11-30 00:06:04.476928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.967 #55 NEW cov: 11819 ft: 15189 corp: 37/581b lim: 40 exec/s: 55 rss: 70Mb L: 8/39 MS: 1 EraseBytes- 00:07:38.967 [2024-11-30 00:06:04.517100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000000ad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-30 00:06:04.517127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.227 #56 NEW cov: 11819 ft: 15217 corp: 38/592b lim: 40 exec/s: 56 rss: 70Mb L: 11/39 MS: 1 EraseBytes- 00:07:39.227 [2024-11-30 00:06:04.557704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2f7a6f0a cdw11:0051000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.227 [2024-11-30 00:06:04.557731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.227 [2024-11-30 00:06:04.557825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.227 [2024-11-30 
00:06:04.557841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.227 [2024-11-30 00:06:04.557954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:a16f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.227 [2024-11-30 00:06:04.557971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.227 #57 NEW cov: 11819 ft: 15232 corp: 39/617b lim: 40 exec/s: 57 rss: 70Mb L: 25/39 MS: 1 CMP- DE: "\012\000\000\000"- 00:07:39.227 [2024-11-30 00:06:04.607325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02a500fe cdw11:b5000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.227 [2024-11-30 00:06:04.607356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.227 #58 NEW cov: 11819 ft: 15252 corp: 40/630b lim: 40 exec/s: 58 rss: 70Mb L: 13/39 MS: 1 ChangeBinInt- 00:07:39.227 [2024-11-30 00:06:04.647443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a2c6fff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.227 [2024-11-30 00:06:04.647471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.227 #59 NEW cov: 11819 ft: 15260 corp: 41/643b lim: 40 exec/s: 59 rss: 70Mb L: 13/39 MS: 1 InsertRepeatedBytes- 00:07:39.227 #59 DONE cov: 11819 ft: 15260 corp: 41/643b lim: 40 exec/s: 29 rss: 70Mb 00:07:39.227 ###### Recommended dictionary. ###### 00:07:39.227 "o\000\000\000\000\000\000\000" # Uses: 3 00:07:39.227 "\000\000\000\000\000\000\004\000" # Uses: 1 00:07:39.227 "\012\000\000\000" # Uses: 0 00:07:39.227 ###### End of recommended dictionary. 
###### 00:07:39.227 Done 59 runs in 2 second(s) 00:07:39.487 00:06:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:39.487 00:06:04 -- ../common.sh@72 -- # (( i++ )) 00:07:39.487 00:06:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.487 00:06:04 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:39.487 00:06:04 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:39.487 00:06:04 -- nvmf/run.sh@24 -- # local timen=1 00:07:39.487 00:06:04 -- nvmf/run.sh@25 -- # local core=0x1 00:07:39.487 00:06:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:39.487 00:06:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:39.487 00:06:04 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:39.487 00:06:04 -- nvmf/run.sh@29 -- # port=4412 00:07:39.487 00:06:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:39.487 00:06:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:39.487 00:06:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:39.487 00:06:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:39.487 [2024-11-30 00:06:04.842098] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:39.487 [2024-11-30 00:06:04.842163] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2726241 ] 00:07:39.487 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.746 [2024-11-30 00:06:05.097577] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.746 [2024-11-30 00:06:05.176173] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:39.746 [2024-11-30 00:06:05.176365] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.746 [2024-11-30 00:06:05.235018] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:39.746 [2024-11-30 00:06:05.251385] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:39.746 INFO: Running with entropic power schedule (0xFF, 100). 00:07:39.746 INFO: Seed: 1903325468 00:07:39.746 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:39.746 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:39.746 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:39.746 INFO: A corpus is not provided, starting from an empty corpus 00:07:39.746 #2 INITED exec/s: 0 rss: 60Mb 00:07:39.746 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:39.746 This may also happen if the target rejected all inputs we tried so far 00:07:40.004 [2024-11-30 00:06:05.318675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.004 [2024-11-30 00:06:05.318715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.004 [2024-11-30 00:06:05.318778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.004 [2024-11-30 00:06:05.318794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.004 [2024-11-30 00:06:05.318861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.004 [2024-11-30 00:06:05.318877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.264 NEW_FUNC[1/671]: 0x44b168 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:40.264 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.264 #13 NEW cov: 11590 ft: 11591 corp: 2/29b lim: 40 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:40.264 [2024-11-30 00:06:05.617803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.617838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.617894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.617908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.617961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.617974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.264 #14 NEW cov: 11703 ft: 12250 corp: 3/57b lim: 40 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 ChangeBit- 00:07:40.264 [2024-11-30 00:06:05.667716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.667743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.667791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.667805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.264 #20 NEW cov: 11709 ft: 12690 corp: 4/75b lim: 40 exec/s: 0 rss: 68Mb L: 18/28 MS: 1 EraseBytes- 00:07:40.264 [2024-11-30 00:06:05.707937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.707965] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.708019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.708034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.708091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.708105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.264 #21 NEW cov: 11794 ft: 12915 corp: 5/104b lim: 40 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 InsertByte- 00:07:40.264 [2024-11-30 00:06:05.748051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.748077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.748112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.748125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.748176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff60 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.748189] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.264 #24 NEW cov: 11794 ft: 13046 corp: 6/129b lim: 40 exec/s: 0 rss: 68Mb L: 25/29 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:40.264 [2024-11-30 00:06:05.788177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.788202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.788247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.788261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.264 [2024-11-30 00:06:05.788311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.264 [2024-11-30 00:06:05.788324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.264 #25 NEW cov: 11794 ft: 13164 corp: 7/157b lim: 40 exec/s: 0 rss: 68Mb L: 28/29 MS: 1 ShuffleBytes- 00:07:40.524 [2024-11-30 00:06:05.828431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.828456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.828511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 
cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.828524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.828575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e405 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.828587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.828642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:000000e4 cdw11:e4e4e40a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.828658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.524 #26 NEW cov: 11794 ft: 13567 corp: 8/189b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 CMP- DE: "\005\000\000\000"- 00:07:40.524 [2024-11-30 00:06:05.868397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.868423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.868486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.868506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.868580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 
cdw10:e4e4e4e4 cdw11:00e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.868605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.524 #27 NEW cov: 11794 ft: 13590 corp: 9/218b lim: 40 exec/s: 0 rss: 68Mb L: 29/32 MS: 1 InsertByte- 00:07:40.524 [2024-11-30 00:06:05.908546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.908571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.908627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.908641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.908694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.908707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.524 #28 NEW cov: 11794 ft: 13665 corp: 10/247b lim: 40 exec/s: 0 rss: 68Mb L: 29/32 MS: 1 ShuffleBytes- 00:07:40.524 [2024-11-30 00:06:05.948638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.948663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.948718] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.948732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.948783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.948797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.524 #29 NEW cov: 11794 ft: 13729 corp: 11/275b lim: 40 exec/s: 0 rss: 68Mb L: 28/32 MS: 1 CopyPart- 00:07:40.524 [2024-11-30 00:06:05.988607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.988633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.524 [2024-11-30 00:06:05.988686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.524 [2024-11-30 00:06:05.988703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.525 #30 NEW cov: 11794 ft: 13876 corp: 12/298b lim: 40 exec/s: 0 rss: 68Mb L: 23/32 MS: 1 EraseBytes- 00:07:40.525 [2024-11-30 00:06:06.028871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffb8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.525 [2024-11-30 00:06:06.028895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:40.525 [2024-11-30 00:06:06.028952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.525 [2024-11-30 00:06:06.028965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.525 [2024-11-30 00:06:06.029016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff60 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.525 [2024-11-30 00:06:06.029029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.525 #31 NEW cov: 11794 ft: 13929 corp: 13/323b lim: 40 exec/s: 0 rss: 68Mb L: 25/32 MS: 1 ChangeByte- 00:07:40.525 [2024-11-30 00:06:06.068963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.525 [2024-11-30 00:06:06.068988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.525 [2024-11-30 00:06:06.069042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.525 [2024-11-30 00:06:06.069055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.525 [2024-11-30 00:06:06.069106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.525 [2024-11-30 00:06:06.069120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:40.785 #32 NEW cov: 11794 ft: 13974 corp: 14/352b lim: 40 exec/s: 0 rss: 68Mb L: 29/32 MS: 1 ChangeBit- 00:07:40.785 [2024-11-30 00:06:06.109240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.109265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.109318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.109331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.109383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:05000000 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.109397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.109450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e40a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.109463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.785 #33 NEW cov: 11794 ft: 13987 corp: 15/384b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 PersAutoDict- DE: "\005\000\000\000"- 00:07:40.785 [2024-11-30 00:06:06.149053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.149078] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.149141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.149161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.785 #39 NEW cov: 11794 ft: 14036 corp: 16/407b lim: 40 exec/s: 0 rss: 68Mb L: 23/32 MS: 1 EraseBytes- 00:07:40.785 [2024-11-30 00:06:06.189324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.189349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.189404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.189418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.189469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:00e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.189483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.785 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:40.785 #40 NEW cov: 11817 ft: 14111 corp: 17/437b lim: 40 exec/s: 0 rss: 69Mb L: 30/32 MS: 1 InsertByte- 00:07:40.785 [2024-11-30 00:06:06.229453] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e40094 cdw11:46fdcea6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.229478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.229533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:3f58e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.229547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.229602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.229614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.785 #41 NEW cov: 11817 ft: 14139 corp: 18/465b lim: 40 exec/s: 0 rss: 69Mb L: 28/32 MS: 1 CMP- DE: "\000\224F\375\316\246?X"- 00:07:40.785 [2024-11-30 00:06:06.269384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.269410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.269466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9446fdce cdw11:a63f58e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.269480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.785 #42 NEW cov: 11817 ft: 14174 
corp: 19/488b lim: 40 exec/s: 42 rss: 69Mb L: 23/32 MS: 1 PersAutoDict- DE: "\000\224F\375\316\246?X"- 00:07:40.785 [2024-11-30 00:06:06.309865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.309891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.309946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.309960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.310013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.310027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.785 [2024-11-30 00:06:06.310079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e4e400 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.785 [2024-11-30 00:06:06.310093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.785 #43 NEW cov: 11817 ft: 14193 corp: 20/525b lim: 40 exec/s: 43 rss: 69Mb L: 37/37 MS: 1 CrossOver- 00:07:41.045 [2024-11-30 00:06:06.349818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e40094 cdw11:46fdcea6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.349843] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.349897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:3f58e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.349910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.349962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4f4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.349991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.045 #44 NEW cov: 11817 ft: 14232 corp: 21/553b lim: 40 exec/s: 44 rss: 69Mb L: 28/37 MS: 1 ChangeBit- 00:07:41.045 [2024-11-30 00:06:06.389775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.389801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.389855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.389869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.045 #45 NEW cov: 11817 ft: 14249 corp: 22/574b lim: 40 exec/s: 45 rss: 69Mb L: 21/37 MS: 1 EraseBytes- 00:07:41.045 [2024-11-30 00:06:06.429876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:41.045 [2024-11-30 00:06:06.429901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.429965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e430e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.429984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.045 #46 NEW cov: 11817 ft: 14264 corp: 23/592b lim: 40 exec/s: 46 rss: 69Mb L: 18/37 MS: 1 ChangeByte- 00:07:41.045 [2024-11-30 00:06:06.470325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.470350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.470405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.470419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.470469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4050000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.470482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.470535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 
00:06:06.470549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.045 #47 NEW cov: 11817 ft: 14280 corp: 24/629b lim: 40 exec/s: 47 rss: 69Mb L: 37/37 MS: 1 CrossOver- 00:07:41.045 [2024-11-30 00:06:06.510416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.510441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.510498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.510512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.510566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.510580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.045 [2024-11-30 00:06:06.510629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.045 [2024-11-30 00:06:06.510643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.045 #48 NEW cov: 11817 ft: 14296 corp: 25/665b lim: 40 exec/s: 48 rss: 69Mb L: 36/37 MS: 1 CrossOver- 00:07:41.046 [2024-11-30 00:06:06.550343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 
cdw10:e4e4e4e4 cdw11:e4e4e400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.046 [2024-11-30 00:06:06.550369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.046 [2024-11-30 00:06:06.550424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9446fdce cdw11:a63f58e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.046 [2024-11-30 00:06:06.550437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.046 [2024-11-30 00:06:06.550489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.046 [2024-11-30 00:06:06.550503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.046 #49 NEW cov: 11817 ft: 14351 corp: 26/690b lim: 40 exec/s: 49 rss: 69Mb L: 25/37 MS: 1 CMP- DE: "\000\005"- 00:07:41.046 [2024-11-30 00:06:06.590461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.046 [2024-11-30 00:06:06.590487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.046 [2024-11-30 00:06:06.590544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.046 [2024-11-30 00:06:06.590559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.046 [2024-11-30 00:06:06.590610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000ffff 
cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.046 [2024-11-30 00:06:06.590624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.305 #50 NEW cov: 11817 ft: 14395 corp: 27/719b lim: 40 exec/s: 50 rss: 69Mb L: 29/37 MS: 1 InsertRepeatedBytes- 00:07:41.305 [2024-11-30 00:06:06.630569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.305 [2024-11-30 00:06:06.630595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.305 [2024-11-30 00:06:06.630655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.305 [2024-11-30 00:06:06.630669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.305 [2024-11-30 00:06:06.630721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.305 [2024-11-30 00:06:06.630734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.305 #51 NEW cov: 11817 ft: 14482 corp: 28/747b lim: 40 exec/s: 51 rss: 69Mb L: 28/37 MS: 1 CrossOver- 00:07:41.305 [2024-11-30 00:06:06.670871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.305 [2024-11-30 00:06:06.670897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.305 [2024-11-30 00:06:06.670953] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e464 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.305 [2024-11-30 00:06:06.670967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.671017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.671031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.671081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e400e4 cdw11:e4e4e40a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.671095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.306 #52 NEW cov: 11817 ft: 14495 corp: 29/779b lim: 40 exec/s: 52 rss: 70Mb L: 32/37 MS: 1 CopyPart- 00:07:41.306 [2024-11-30 00:06:06.710983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.711011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.711076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.711095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.711151] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000018 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.711167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.711223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.711239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.306 #53 NEW cov: 11817 ft: 14497 corp: 30/812b lim: 40 exec/s: 53 rss: 70Mb L: 33/37 MS: 1 CMP- DE: "\000\000\000\030"- 00:07:41.306 [2024-11-30 00:06:06.751089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffb8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.751114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.751169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:009446fd cdw11:cea63f58 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.751182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.751236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.751248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.751302] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff60 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.751316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.306 #54 NEW cov: 11817 ft: 14507 corp: 31/845b lim: 40 exec/s: 54 rss: 70Mb L: 33/37 MS: 1 PersAutoDict- DE: "\000\224F\375\316\246?X"- 00:07:41.306 [2024-11-30 00:06:06.790886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.790911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.790966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e430e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.790980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.306 #55 NEW cov: 11817 ft: 14512 corp: 32/863b lim: 40 exec/s: 55 rss: 70Mb L: 18/37 MS: 1 CrossOver- 00:07:41.306 [2024-11-30 00:06:06.831341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffb8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.831366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.831423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:009446fd cdw11:cea63f58 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.831439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.831492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.831506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.306 [2024-11-30 00:06:06.831559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff7fff60 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.306 [2024-11-30 00:06:06.831573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.306 #56 NEW cov: 11817 ft: 14527 corp: 33/896b lim: 40 exec/s: 56 rss: 70Mb L: 33/37 MS: 1 ChangeBit- 00:07:41.566 [2024-11-30 00:06:06.871281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e47a cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.871305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.871363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.871376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.871427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.871440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.566 #57 NEW cov: 11817 ft: 14539 corp: 34/926b lim: 40 exec/s: 57 rss: 70Mb L: 30/37 MS: 1 InsertByte- 00:07:41.566 [2024-11-30 00:06:06.911373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e40094 cdw11:944646fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.911398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.911453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:cea63f58 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.911467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.911521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.911533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.566 #58 NEW cov: 11817 ft: 14554 corp: 35/956b lim: 40 exec/s: 58 rss: 70Mb L: 30/37 MS: 1 CopyPart- 00:07:41.566 [2024-11-30 00:06:06.951643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e40094 cdw11:46fdcea6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.951667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.951722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:3f58e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.951736] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.951787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.951804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.951858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e4e400 cdw11:9446fdce SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.951871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.566 #59 NEW cov: 11817 ft: 14577 corp: 36/992b lim: 40 exec/s: 59 rss: 70Mb L: 36/37 MS: 1 PersAutoDict- DE: "\000\224F\375\316\246?X"- 00:07:41.566 [2024-11-30 00:06:06.991795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffb8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.991820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.991881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:009446fd cdw11:ce000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.991900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.991956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:a63f58ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 
00:06:06.991975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:06.992029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:06.992045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.566 #60 NEW cov: 11817 ft: 14585 corp: 37/1028b lim: 40 exec/s: 60 rss: 70Mb L: 36/37 MS: 1 InsertRepeatedBytes- 00:07:41.566 [2024-11-30 00:06:07.031780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:07.031805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:07.031862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:07.031876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:07.031928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:00e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:07.031942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.566 #61 NEW cov: 11817 ft: 14604 corp: 38/1057b lim: 40 exec/s: 61 rss: 70Mb L: 29/37 MS: 1 ShuffleBytes- 00:07:41.566 [2024-11-30 00:06:07.062019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 
nsid:0 cdw10:e4e40094 cdw11:46fdcea6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:07.062044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-11-30 00:06:07.062097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:3f58e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.566 [2024-11-30 00:06:07.062110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.567 [2024-11-30 00:06:07.062161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.567 [2024-11-30 00:06:07.062177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.567 [2024-11-30 00:06:07.062228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e4e400 cdw11:9446fdce SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.567 [2024-11-30 00:06:07.062241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.567 #62 NEW cov: 11817 ft: 14617 corp: 39/1093b lim: 40 exec/s: 62 rss: 70Mb L: 36/37 MS: 1 CrossOver- 00:07:41.567 [2024-11-30 00:06:07.101963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e40094 cdw11:46fdcea6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.567 [2024-11-30 00:06:07.101988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.567 [2024-11-30 00:06:07.102042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:c1a2e4e4 
cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.567 [2024-11-30 00:06:07.102056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.567 [2024-11-30 00:06:07.102107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.567 [2024-11-30 00:06:07.102121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.567 #63 NEW cov: 11817 ft: 14618 corp: 40/1121b lim: 40 exec/s: 63 rss: 70Mb L: 28/37 MS: 1 ChangeBinInt- 00:07:41.826 [2024-11-30 00:06:07.142242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.826 [2024-11-30 00:06:07.142267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.826 [2024-11-30 00:06:07.142321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e464 cdw11:01000009 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.826 [2024-11-30 00:06:07.142335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.826 [2024-11-30 00:06:07.142390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.826 [2024-11-30 00:06:07.142405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.826 [2024-11-30 00:06:07.142457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e400e4 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:41.826 [2024-11-30 00:06:07.142471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.826 #64 NEW cov: 11817 ft: 14633 corp: 41/1157b lim: 40 exec/s: 64 rss: 70Mb L: 36/37 MS: 1 CMP- DE: "\001\000\000\011"- 00:07:41.826 [2024-11-30 00:06:07.181821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2ee40000 cdw11:0018e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.826 [2024-11-30 00:06:07.181846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.826 #67 NEW cov: 11817 ft: 15341 corp: 42/1165b lim: 40 exec/s: 67 rss: 70Mb L: 8/37 MS: 3 CrossOver-InsertByte-PersAutoDict- DE: "\000\000\000\030"- 00:07:41.826 [2024-11-30 00:06:07.222455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.826 [2024-11-30 00:06:07.222482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.826 [2024-11-30 00:06:07.222541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e4e4e464 cdw11:e4e4e4e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.826 [2024-11-30 00:06:07.222555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.826 [2024-11-30 00:06:07.222609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.222622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.827 [2024-11-30 
00:06:07.222676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e400e4 cdw11:e4e4e40a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.222689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.827 #68 NEW cov: 11817 ft: 15371 corp: 43/1197b lim: 40 exec/s: 68 rss: 70Mb L: 32/37 MS: 1 ChangeByte- 00:07:41.827 [2024-11-30 00:06:07.262590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e4e47d94 cdw11:46fdcea6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.262618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.827 [2024-11-30 00:06:07.262683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:3f58e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.262700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.827 [2024-11-30 00:06:07.262771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.262785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.827 [2024-11-30 00:06:07.262840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e4e4e400 cdw11:9446fdce SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.262853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.827 #69 NEW cov: 11817 ft: 15444 corp: 
44/1233b lim: 40 exec/s: 69 rss: 70Mb L: 36/37 MS: 1 ChangeByte- 00:07:41.827 [2024-11-30 00:06:07.302705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffb8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.302730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.827 [2024-11-30 00:06:07.302786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0094ffff cdw11:ffa63f58 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.302798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.827 [2024-11-30 00:06:07.302852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.302866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.827 [2024-11-30 00:06:07.302918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff60 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-11-30 00:06:07.302931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.827 #70 NEW cov: 11817 ft: 15478 corp: 45/1266b lim: 40 exec/s: 35 rss: 70Mb L: 33/37 MS: 1 CopyPart- 00:07:41.827 #70 DONE cov: 11817 ft: 15478 corp: 45/1266b lim: 40 exec/s: 35 rss: 70Mb 00:07:41.827 ###### Recommended dictionary. 
###### 00:07:41.827 "\005\000\000\000" # Uses: 1 00:07:41.827 "\000\224F\375\316\246?X" # Uses: 3 00:07:41.827 "\000\005" # Uses: 0 00:07:41.827 "\000\000\000\030" # Uses: 1 00:07:41.827 "\001\000\000\011" # Uses: 0 00:07:41.827 ###### End of recommended dictionary. ###### 00:07:41.827 Done 70 runs in 2 second(s) 00:07:42.086 00:06:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:42.086 00:06:07 -- ../common.sh@72 -- # (( i++ )) 00:07:42.086 00:06:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.086 00:06:07 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:42.086 00:06:07 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:42.086 00:06:07 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.086 00:06:07 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.086 00:06:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:42.086 00:06:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:42.086 00:06:07 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:42.086 00:06:07 -- nvmf/run.sh@29 -- # port=4413 00:07:42.086 00:06:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:42.086 00:06:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:42.086 00:06:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.086 00:06:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r 
/var/tmp/spdk13.sock 00:07:42.086 [2024-11-30 00:06:07.488286] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:42.086 [2024-11-30 00:06:07.488354] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2726929 ] 00:07:42.086 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.345 [2024-11-30 00:06:07.743078] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.345 [2024-11-30 00:06:07.827057] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:42.345 [2024-11-30 00:06:07.827218] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.345 [2024-11-30 00:06:07.885391] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:42.604 [2024-11-30 00:06:07.901766] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:42.604 INFO: Running with entropic power schedule (0xFF, 100). 00:07:42.604 INFO: Seed: 260337202 00:07:42.604 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:42.604 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:42.604 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:42.604 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.604 #2 INITED exec/s: 0 rss: 61Mb 00:07:42.604 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:42.604 This may also happen if the target rejected all inputs we tried so far 00:07:42.604 [2024-11-30 00:06:07.946421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.604 [2024-11-30 00:06:07.946456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.604 [2024-11-30 00:06:07.946490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.604 [2024-11-30 00:06:07.946510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.604 [2024-11-30 00:06:07.946538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.604 [2024-11-30 00:06:07.946553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.864 NEW_FUNC[1/670]: 0x44cd38 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:42.864 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:42.864 #11 NEW cov: 11578 ft: 11577 corp: 2/27b lim: 40 exec/s: 0 rss: 68Mb L: 26/26 MS: 4 ShuffleBytes-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:42.864 [2024-11-30 00:06:08.267171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.864 [2024-11-30 00:06:08.267209] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.864 [2024-11-30 00:06:08.267242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.864 [2024-11-30 00:06:08.267257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.864 [2024-11-30 00:06:08.267285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.864 [2024-11-30 00:06:08.267300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.864 #18 NEW cov: 11691 ft: 12077 corp: 3/52b lim: 40 exec/s: 0 rss: 68Mb L: 25/26 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:42.864 [2024-11-30 00:06:08.317155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.864 [2024-11-30 00:06:08.317188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.864 [2024-11-30 00:06:08.317222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.864 [2024-11-30 00:06:08.317238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.864 #19 NEW cov: 11697 ft: 12516 corp: 4/73b lim: 40 exec/s: 0 rss: 68Mb L: 21/26 MS: 1 EraseBytes- 00:07:42.864 [2024-11-30 00:06:08.377300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 
cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.864 [2024-11-30 00:06:08.377330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.864 [2024-11-30 00:06:08.377362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.864 [2024-11-30 00:06:08.377377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.123 #30 NEW cov: 11782 ft: 12892 corp: 5/94b lim: 40 exec/s: 0 rss: 68Mb L: 21/26 MS: 1 ShuffleBytes- 00:07:43.123 [2024-11-30 00:06:08.447631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.447661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.447699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.447715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.447744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:646464ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.447759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.447788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 
cdw10:ffffffff cdw11:ffff6464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.447803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.123 #31 NEW cov: 11782 ft: 13485 corp: 6/131b lim: 40 exec/s: 0 rss: 69Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:43.123 [2024-11-30 00:06:08.517813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.517843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.517877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64640a64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.517893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.517923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.517939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.517968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.517983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.123 #32 NEW cov: 11782 ft: 13610 corp: 7/169b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CrossOver- 00:07:43.123 [2024-11-30 
00:06:08.587952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.587981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.588015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.588030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.588058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b0a7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.588073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 00:06:08.588100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.588115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.123 #33 NEW cov: 11782 ft: 13656 corp: 8/208b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:07:43.123 [2024-11-30 00:06:08.658025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a6464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.658054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.123 [2024-11-30 
00:06:08.658086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.123 [2024-11-30 00:06:08.658101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.383 #34 NEW cov: 11782 ft: 13707 corp: 9/226b lim: 40 exec/s: 0 rss: 69Mb L: 18/39 MS: 1 CrossOver- 00:07:43.383 [2024-11-30 00:06:08.718221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.718251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 00:06:08.718283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.718298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.383 #35 NEW cov: 11782 ft: 13783 corp: 10/248b lim: 40 exec/s: 0 rss: 69Mb L: 22/39 MS: 1 CrossOver- 00:07:43.383 [2024-11-30 00:06:08.769268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.769297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 00:06:08.769370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64640a64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.769391] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 00:06:08.769462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.769483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 00:06:08.769551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.769568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.383 #36 NEW cov: 11782 ft: 13899 corp: 11/286b lim: 40 exec/s: 0 rss: 69Mb L: 38/39 MS: 1 ChangeByte- 00:07:43.383 [2024-11-30 00:06:08.809335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.809362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 00:06:08.809435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.809455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 00:06:08.809523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b0a7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.809542] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 00:06:08.809615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.809631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.383 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.383 #37 NEW cov: 11799 ft: 13936 corp: 12/325b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:43.383 [2024-11-30 00:06:08.859237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.859265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 00:06:08.859335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.859357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.383 #38 NEW cov: 11799 ft: 13964 corp: 13/346b lim: 40 exec/s: 0 rss: 69Mb L: 21/39 MS: 1 ChangeByte- 00:07:43.383 [2024-11-30 00:06:08.899358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.899385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.383 [2024-11-30 
00:06:08.899455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:15007b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.383 [2024-11-30 00:06:08.899476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.383 #39 NEW cov: 11799 ft: 14009 corp: 14/367b lim: 40 exec/s: 0 rss: 69Mb L: 21/39 MS: 1 ChangeBinInt- 00:07:43.644 [2024-11-30 00:06:08.939767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:08.939795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:08.939868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:08.939899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:08.939967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:08.939983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:08.940049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:08.940065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.644 #40 NEW cov: 
11799 ft: 14036 corp: 15/405b lim: 40 exec/s: 40 rss: 69Mb L: 38/39 MS: 1 CopyPart- 00:07:43.644 [2024-11-30 00:06:08.979638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b307b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:08.979665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:08.979740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b0a7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:08.979774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.644 #41 NEW cov: 11799 ft: 14055 corp: 16/428b lim: 40 exec/s: 41 rss: 69Mb L: 23/39 MS: 1 InsertByte- 00:07:43.644 [2024-11-30 00:06:09.019903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.019928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.019959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.019976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.020003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b0a7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.020036] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.020104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.020124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.644 #42 NEW cov: 11799 ft: 14184 corp: 17/467b lim: 40 exec/s: 42 rss: 69Mb L: 39/39 MS: 1 ChangeByte- 00:07:43.644 [2024-11-30 00:06:09.059727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a6464 cdw11:64640a64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.059754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.644 #43 NEW cov: 11799 ft: 14589 corp: 18/479b lim: 40 exec/s: 43 rss: 69Mb L: 12/39 MS: 1 CrossOver- 00:07:43.644 [2024-11-30 00:06:09.100209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b0a cdw11:0a646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.100235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.100306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.100325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.100392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 
cdw11:647b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.100408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.100475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.100491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.644 #49 NEW cov: 11799 ft: 14627 corp: 19/518b lim: 40 exec/s: 49 rss: 69Mb L: 39/39 MS: 1 CrossOver- 00:07:43.644 [2024-11-30 00:06:09.140230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a640000 cdw11:00006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.140260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.140332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.140352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.140418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.140434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.644 #50 NEW cov: 11799 ft: 14633 corp: 20/548b lim: 40 exec/s: 50 rss: 69Mb L: 30/39 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:43.644 [2024-11-30 
00:06:09.180196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.180222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.644 [2024-11-30 00:06:09.180293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:15007b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.644 [2024-11-30 00:06:09.180315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.905 #51 NEW cov: 11799 ft: 14667 corp: 21/569b lim: 40 exec/s: 51 rss: 69Mb L: 21/39 MS: 1 ShuffleBytes- 00:07:43.905 [2024-11-30 00:06:09.220313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.220339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.220409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b0a7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.220430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.905 #52 NEW cov: 11799 ft: 14684 corp: 22/591b lim: 40 exec/s: 52 rss: 69Mb L: 22/39 MS: 1 CrossOver- 00:07:43.905 [2024-11-30 00:06:09.260531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b0a cdw11:0a646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.260557] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.260633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.260652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.260720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:647b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.260737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.905 #53 NEW cov: 11799 ft: 14700 corp: 23/622b lim: 40 exec/s: 53 rss: 69Mb L: 31/39 MS: 1 EraseBytes- 00:07:43.905 [2024-11-30 00:06:09.300896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:6464647b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.300922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.301000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.301036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.301106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0a646464 cdw11:6464ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.301123] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.301190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.301206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.301275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ff646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.301291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.905 #54 NEW cov: 11799 ft: 14753 corp: 24/662b lim: 40 exec/s: 54 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:43.905 [2024-11-30 00:06:09.340678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b6b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.340704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.340776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:15007b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.340797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.905 #55 NEW cov: 11799 ft: 14761 corp: 25/683b lim: 40 exec/s: 55 rss: 69Mb L: 21/40 MS: 1 ChangeBit- 00:07:43.905 [2024-11-30 00:06:09.381045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b 
cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.381073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.381143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.381164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.381232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.381249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.381315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:217b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.381333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.905 #56 NEW cov: 11799 ft: 14778 corp: 26/721b lim: 40 exec/s: 56 rss: 70Mb L: 38/40 MS: 1 ChangeByte- 00:07:43.905 [2024-11-30 00:06:09.421128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.421158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.421228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 
cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.421248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.421316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:646464ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.421332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.905 [2024-11-30 00:06:09.421401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffff32 cdw11:ffff6464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.905 [2024-11-30 00:06:09.421416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.905 #57 NEW cov: 11799 ft: 14810 corp: 27/758b lim: 40 exec/s: 57 rss: 70Mb L: 37/40 MS: 1 ChangeByte- 00:07:44.165 [2024-11-30 00:06:09.461115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.165 [2024-11-30 00:06:09.461143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.165 [2024-11-30 00:06:09.461214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0a7bc1c1 cdw11:c1c1c1c1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.165 [2024-11-30 00:06:09.461235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.165 [2024-11-30 00:06:09.461314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:c1c17b7b 
cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.165 [2024-11-30 00:06:09.461330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.165 #58 NEW cov: 11799 ft: 14829 corp: 28/788b lim: 40 exec/s: 58 rss: 70Mb L: 30/40 MS: 1 InsertRepeatedBytes- 00:07:44.165 [2024-11-30 00:06:09.501342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.165 [2024-11-30 00:06:09.501369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.165 [2024-11-30 00:06:09.501438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64640a64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.165 [2024-11-30 00:06:09.501459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.165 [2024-11-30 00:06:09.501525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.165 [2024-11-30 00:06:09.501541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.165 [2024-11-30 00:06:09.501611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.165 [2024-11-30 00:06:09.501627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.165 #59 NEW cov: 11799 ft: 14849 corp: 29/827b lim: 40 exec/s: 59 rss: 70Mb L: 39/40 MS: 1 InsertByte- 00:07:44.166 [2024-11-30 
00:06:09.541456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.541486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.541559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.541580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.541658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.541677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.541748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b217b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.541767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.166 #60 NEW cov: 11799 ft: 14851 corp: 30/863b lim: 40 exec/s: 60 rss: 70Mb L: 36/40 MS: 1 EraseBytes- 00:07:44.166 [2024-11-30 00:06:09.581442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.581469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 
00:06:09.581540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.581560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.581649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.581669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.166 #61 NEW cov: 11799 ft: 14888 corp: 31/893b lim: 40 exec/s: 61 rss: 70Mb L: 30/40 MS: 1 CrossOver- 00:07:44.166 [2024-11-30 00:06:09.621711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.621737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.621808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:6464b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.621828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.621896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.621912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 
00:06:09.621980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b6b6b664 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.621995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.166 #62 NEW cov: 11799 ft: 14899 corp: 32/932b lim: 40 exec/s: 62 rss: 70Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:07:44.166 [2024-11-30 00:06:09.661699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.661729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.661799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.661820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.661891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.661910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.166 #63 NEW cov: 11799 ft: 14915 corp: 33/963b lim: 40 exec/s: 63 rss: 70Mb L: 31/40 MS: 1 CrossOver- 00:07:44.166 [2024-11-30 00:06:09.701665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a6464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.701692] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.166 [2024-11-30 00:06:09.701764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.166 [2024-11-30 00:06:09.701783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.425 #64 NEW cov: 11799 ft: 14921 corp: 34/981b lim: 40 exec/s: 64 rss: 70Mb L: 18/40 MS: 1 ShuffleBytes- 00:07:44.425 [2024-11-30 00:06:09.742024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.742052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.742124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.742146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.742214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:646464ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.742230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.742295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff6464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.742311] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.425 #65 NEW cov: 11799 ft: 14928 corp: 35/1018b lim: 40 exec/s: 65 rss: 70Mb L: 37/40 MS: 1 ShuffleBytes- 00:07:44.425 [2024-11-30 00:06:09.782125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.782151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.782221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:85848485 cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.782240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.782309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b0a7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.782326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.782392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.782408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.425 #66 NEW cov: 11799 ft: 14956 corp: 36/1057b lim: 40 exec/s: 66 rss: 70Mb L: 39/40 MS: 1 ChangeBinInt- 00:07:44.425 [2024-11-30 00:06:09.822220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 
cdw10:0a642464 cdw11:64646464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.822247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.822317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:64646464 cdw11:64640a64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.822339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.822406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.822422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.822488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff64 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.822506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.425 #67 NEW cov: 11806 ft: 14998 corp: 37/1095b lim: 40 exec/s: 67 rss: 70Mb L: 38/40 MS: 1 ChangeBit- 00:07:44.425 [2024-11-30 00:06:09.862099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.862126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.862197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 
cdw10:7b7b7b7b cdw11:7b7b7b7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.862217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.425 #68 NEW cov: 11806 ft: 15011 corp: 38/1117b lim: 40 exec/s: 68 rss: 70Mb L: 22/40 MS: 1 InsertByte- 00:07:44.425 [2024-11-30 00:06:09.902378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.902405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.902475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b81 cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.902496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.902565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.902581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.425 #69 NEW cov: 11806 ft: 15014 corp: 39/1147b lim: 40 exec/s: 69 rss: 70Mb L: 30/40 MS: 1 ChangeBinInt- 00:07:44.425 [2024-11-30 00:06:09.942603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.942631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 
00:06:09.942702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:85848485 cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.942724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.942791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b0a7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.942807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.425 [2024-11-30 00:06:09.942874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.425 [2024-11-30 00:06:09.942890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.425 #70 NEW cov: 11806 ft: 15026 corp: 40/1186b lim: 40 exec/s: 35 rss: 70Mb L: 39/40 MS: 1 CopyPart- 00:07:44.425 #70 DONE cov: 11806 ft: 15026 corp: 40/1186b lim: 40 exec/s: 35 rss: 70Mb 00:07:44.425 ###### Recommended dictionary. ###### 00:07:44.425 "\000\000\000\000" # Uses: 0 00:07:44.425 ###### End of recommended dictionary. 
######
00:07:44.425 Done 70 runs in 2 second(s)
00:07:44.684 00:06:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf
00:07:44.684 00:06:10 -- ../common.sh@72 -- # (( i++ ))
00:07:44.684 00:06:10 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:44.684 00:06:10 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1
00:07:44.684 00:06:10 -- nvmf/run.sh@23 -- # local fuzzer_type=14
00:07:44.684 00:06:10 -- nvmf/run.sh@24 -- # local timen=1
00:07:44.685 00:06:10 -- nvmf/run.sh@25 -- # local core=0x1
00:07:44.685 00:06:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:44.685 00:06:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf
00:07:44.685 00:06:10 -- nvmf/run.sh@29 -- # printf %02d 14
00:07:44.685 00:06:10 -- nvmf/run.sh@29 -- # port=4414
00:07:44.685 00:06:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:44.685 00:06:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414'
00:07:44.685 00:06:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:44.685 00:06:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock
00:07:44.685 [2024-11-30 00:06:10.138807] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:44.685 [2024-11-30 00:06:10.138877] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2727477 ]
00:07:44.944 EAL: No free 2048 kB hugepages reported on node 1
00:07:44.944 [2024-11-30 00:06:10.394481] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:44.944 [2024-11-30 00:06:10.480765] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:44.944 [2024-11-30 00:06:10.480910] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:45.202 [2024-11-30 00:06:10.539621] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:45.202 [2024-11-30 00:06:10.555983] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 ***
00:07:45.202 INFO: Running with entropic power schedule (0xFF, 100).
00:07:45.202 INFO: Seed: 2914358986
00:07:45.202 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:45.202 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:45.202 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:45.202 INFO: A corpus is not provided, starting from an empty corpus
00:07:45.202 #2 INITED exec/s: 0 rss: 60Mb
00:07:45.202 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:45.202 This may also happen if the target rejected all inputs we tried so far 00:07:45.202 [2024-11-30 00:06:10.633908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.202 [2024-11-30 00:06:10.633948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.202 [2024-11-30 00:06:10.634029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.202 [2024-11-30 00:06:10.634045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.462 NEW_FUNC[1/673]: 0x44e908 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:45.462 NEW_FUNC[2/673]: 0x46fd38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:45.462 #29 NEW cov: 11605 ft: 11606 corp: 2/23b lim: 35 exec/s: 0 rss: 68Mb L: 22/22 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:45.462 [2024-11-30 00:06:10.953465] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.462 [2024-11-30 00:06:10.953503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.462 NEW_FUNC[1/2]: 0x46a548 in feat_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:07:45.462 NEW_FUNC[2/2]: 0x112d178 in nvmf_ctrlr_set_features_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1503 00:07:45.462 #30 NEW cov: 11768 ft: 12556 corp: 3/45b lim: 35 exec/s: 0 rss: 68Mb L: 
22/22 MS: 1 ChangeBit- 00:07:45.462 [2024-11-30 00:06:11.003758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.462 [2024-11-30 00:06:11.003794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.462 [2024-11-30 00:06:11.003920] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000094 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.462 [2024-11-30 00:06:11.003940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.462 [2024-11-30 00:06:11.004080] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.462 [2024-11-30 00:06:11.004098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.462 [2024-11-30 00:06:11.004229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.462 [2024-11-30 00:06:11.004252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.721 #31 NEW cov: 11781 ft: 13138 corp: 4/75b lim: 35 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 CMP- DE: "\315x\341\223\000G\224\000"- 00:07:45.721 [2024-11-30 00:06:11.043666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.721 [2024-11-30 00:06:11.043694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.721 [2024-11-30 00:06:11.043820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.721 [2024-11-30 00:06:11.043839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.721 #32 NEW cov: 11866 ft: 13369 corp: 5/97b lim: 35 exec/s: 0 rss: 68Mb L: 22/30 MS: 1 ChangeByte- 00:07:45.722 [2024-11-30 00:06:11.083782] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.722 [2024-11-30 00:06:11.083813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.722 [2024-11-30 00:06:11.083943] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.722 [2024-11-30 00:06:11.083960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.722 #33 NEW cov: 11866 ft: 13478 corp: 6/119b lim: 35 exec/s: 0 rss: 68Mb L: 22/30 MS: 1 ChangeBit- 00:07:45.722 #34 NEW cov: 11866 ft: 14170 corp: 7/127b lim: 35 exec/s: 0 rss: 68Mb L: 8/30 MS: 1 CrossOver- 00:07:45.722 [2024-11-30 00:06:11.183722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.722 [2024-11-30 00:06:11.183751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.722 [2024-11-30 00:06:11.183882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.722 [2024-11-30 00:06:11.183900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.722 #38 NEW cov: 11866 ft: 
14266 corp: 8/147b lim: 35 exec/s: 0 rss: 68Mb L: 20/30 MS: 4 ChangeByte-ChangeBit-CopyPart-CrossOver- 00:07:45.722 [2024-11-30 00:06:11.223940] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.722 [2024-11-30 00:06:11.223968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.722 [2024-11-30 00:06:11.224093] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.722 [2024-11-30 00:06:11.224113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.722 #39 NEW cov: 11866 ft: 14320 corp: 9/167b lim: 35 exec/s: 0 rss: 68Mb L: 20/30 MS: 1 ShuffleBytes- 00:07:45.722 [2024-11-30 00:06:11.274114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.722 [2024-11-30 00:06:11.274144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.722 [2024-11-30 00:06:11.274269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.722 [2024-11-30 00:06:11.274286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.982 #40 NEW cov: 11866 ft: 14344 corp: 10/187b lim: 35 exec/s: 0 rss: 68Mb L: 20/30 MS: 1 ChangeASCIIInt- 00:07:45.982 [2024-11-30 00:06:11.314642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.314672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.314784] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.314804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.314927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.314945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.315077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.315096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.982 #41 NEW cov: 11866 ft: 14411 corp: 11/218b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 CopyPart- 00:07:45.982 [2024-11-30 00:06:11.364717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.364746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.364866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.364883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.982 #42 NEW cov: 11866 ft: 14430 corp: 12/240b lim: 35 exec/s: 0 rss: 69Mb L: 22/31 MS: 1 
ChangeBinInt- 00:07:45.982 [2024-11-30 00:06:11.414520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.414550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.414682] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.414699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.982 #46 NEW cov: 11866 ft: 14460 corp: 13/260b lim: 35 exec/s: 0 rss: 69Mb L: 20/31 MS: 4 CopyPart-ChangeBit-ChangeBit-CrossOver- 00:07:45.982 [2024-11-30 00:06:11.454684] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.454712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.982 #47 NEW cov: 11866 ft: 14500 corp: 14/278b lim: 35 exec/s: 0 rss: 69Mb L: 18/31 MS: 1 EraseBytes- 00:07:45.982 [2024-11-30 00:06:11.494671] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.494698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.494833] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.494848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:45.982 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:45.982 #48 NEW cov: 11889 ft: 14582 corp: 15/298b lim: 35 exec/s: 0 rss: 69Mb L: 20/31 MS: 1 ChangeASCIIInt- 00:07:45.982 [2024-11-30 00:06:11.535412] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.535444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.535579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.535595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.535717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.535735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.982 [2024-11-30 00:06:11.535869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.982 [2024-11-30 00:06:11.535885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.242 #49 NEW cov: 11889 ft: 14606 corp: 16/332b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:46.242 [2024-11-30 00:06:11.574717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:4 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.574749] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.242 NEW_FUNC[1/1]: 0x4691c8 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:07:46.242 #50 NEW cov: 11923 ft: 14691 corp: 17/340b lim: 35 exec/s: 50 rss: 69Mb L: 8/34 MS: 1 CMP- DE: "\001\224G\006.\025\376\304"- 00:07:46.242 [2024-11-30 00:06:11.624757] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.624785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.242 #55 NEW cov: 11923 ft: 14719 corp: 18/348b lim: 35 exec/s: 55 rss: 69Mb L: 8/34 MS: 5 EraseBytes-EraseBytes-ChangeBinInt-ChangeByte-CopyPart- 00:07:46.242 [2024-11-30 00:06:11.665125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NON OPERATIONAL POWER STATE CONFIG cid:4 cdw10:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.665154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.242 [2024-11-30 00:06:11.665287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NON OPERATIONAL POWER STATE CONFIG cid:5 cdw10:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.665306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.242 #57 NEW cov: 11923 ft: 14729 corp: 19/365b lim: 35 exec/s: 57 rss: 69Mb L: 17/34 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:46.242 #58 NEW cov: 11923 ft: 14744 corp: 20/376b lim: 35 exec/s: 58 rss: 69Mb L: 11/34 MS: 1 EraseBytes- 00:07:46.242 [2024-11-30 00:06:11.746000] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.746030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.242 [2024-11-30 00:06:11.746164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.746183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.242 [2024-11-30 00:06:11.746296] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.746315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.242 [2024-11-30 00:06:11.746463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.746490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.242 #59 NEW cov: 11923 ft: 14757 corp: 21/409b lim: 35 exec/s: 59 rss: 69Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:07:46.242 [2024-11-30 00:06:11.785797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.785827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.242 [2024-11-30 00:06:11.785965] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.785983] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.242 [2024-11-30 00:06:11.786104] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.242 [2024-11-30 00:06:11.786123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.511 #60 NEW cov: 11923 ft: 14874 corp: 22/430b lim: 35 exec/s: 60 rss: 69Mb L: 21/34 MS: 1 InsertByte- 00:07:46.511 [2024-11-30 00:06:11.835631] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.835661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:11.835787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.835805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.511 #61 NEW cov: 11923 ft: 14960 corp: 23/448b lim: 35 exec/s: 61 rss: 69Mb L: 18/34 MS: 1 EraseBytes- 00:07:46.511 [2024-11-30 00:06:11.875826] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NON OPERATIONAL POWER STATE CONFIG cid:4 cdw10:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.875854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:11.875975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NON OPERATIONAL POWER STATE CONFIG cid:5 cdw10:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 
00:06:11.875996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.511 #62 NEW cov: 11923 ft: 14975 corp: 24/465b lim: 35 exec/s: 62 rss: 69Mb L: 17/34 MS: 1 ChangeBinInt- 00:07:46.511 [2024-11-30 00:06:11.916155] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.916183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:11.916316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.916334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:11.916467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.916482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.511 #63 NEW cov: 11923 ft: 14978 corp: 25/486b lim: 35 exec/s: 63 rss: 69Mb L: 21/34 MS: 1 InsertByte- 00:07:46.511 [2024-11-30 00:06:11.956573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.956613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:11.956744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000094 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.956762] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:11.956895] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.956913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:11.957036] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:11.957052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.511 #64 NEW cov: 11923 ft: 14988 corp: 26/519b lim: 35 exec/s: 64 rss: 69Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:07:46.511 [2024-11-30 00:06:12.006831] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:12.006858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:12.007001] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:12.007019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.511 [2024-11-30 00:06:12.007144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:12.007162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.511 #65 NEW cov: 11923 ft: 15001 
corp: 27/548b lim: 35 exec/s: 65 rss: 69Mb L: 29/34 MS: 1 InsertRepeatedBytes- 00:07:46.511 [2024-11-30 00:06:12.046098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.511 [2024-11-30 00:06:12.046132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.773 #68 NEW cov: 11923 ft: 15021 corp: 28/557b lim: 35 exec/s: 68 rss: 69Mb L: 9/34 MS: 3 CopyPart-ChangeBit-CMP- DE: "\343\307 \002\000\000\000\000"- 00:07:46.773 [2024-11-30 00:06:12.086420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.773 [2024-11-30 00:06:12.086447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.773 [2024-11-30 00:06:12.086571] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.773 [2024-11-30 00:06:12.086587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.773 #69 NEW cov: 11923 ft: 15042 corp: 29/577b lim: 35 exec/s: 69 rss: 69Mb L: 20/34 MS: 1 ShuffleBytes- 00:07:46.773 #70 NEW cov: 11923 ft: 15069 corp: 30/585b lim: 35 exec/s: 70 rss: 69Mb L: 8/34 MS: 1 ShuffleBytes- 00:07:46.773 [2024-11-30 00:06:12.167421] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.773 [2024-11-30 00:06:12.167450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.773 [2024-11-30 00:06:12.167581] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 
cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.773 [2024-11-30 00:06:12.167601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.773 [2024-11-30 00:06:12.167723] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.773 [2024-11-30 00:06:12.167742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.774 #71 NEW cov: 11923 ft: 15082 corp: 31/615b lim: 35 exec/s: 71 rss: 69Mb L: 30/34 MS: 1 CopyPart- 00:07:46.774 [2024-11-30 00:06:12.207105] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000cd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.207135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.774 [2024-11-30 00:06:12.207263] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.207280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.774 [2024-11-30 00:06:12.207407] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.207423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.774 #72 NEW cov: 11923 ft: 15098 corp: 32/637b lim: 35 exec/s: 72 rss: 70Mb L: 22/34 MS: 1 PersAutoDict- DE: "\315x\341\223\000G\224\000"- 00:07:46.774 [2024-11-30 00:06:12.247837] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000058 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.247865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.774 [2024-11-30 00:06:12.247995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.248015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.774 [2024-11-30 00:06:12.248145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.248163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.774 [2024-11-30 00:06:12.248292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.248309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.774 #73 NEW cov: 11923 ft: 15259 corp: 33/672b lim: 35 exec/s: 73 rss: 70Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:46.774 [2024-11-30 00:06:12.297805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.297833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.774 [2024-11-30 00:06:12.297964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.297982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.774 [2024-11-30 00:06:12.298075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.774 [2024-11-30 00:06:12.298097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.774 #74 NEW cov: 11923 ft: 15276 corp: 34/701b lim: 35 exec/s: 74 rss: 70Mb L: 29/35 MS: 1 CMP- DE: "\001\000"- 00:07:47.033 [2024-11-30 00:06:12.347525] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.033 [2024-11-30 00:06:12.347552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.033 [2024-11-30 00:06:12.347690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.033 [2024-11-30 00:06:12.347712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.033 [2024-11-30 00:06:12.347836] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.033 [2024-11-30 00:06:12.347854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.033 #75 NEW cov: 11923 ft: 15309 corp: 35/722b lim: 35 exec/s: 75 rss: 70Mb L: 21/35 MS: 1 ChangeBinInt- 00:07:47.033 #76 NEW cov: 11923 ft: 15323 corp: 36/735b lim: 35 exec/s: 76 rss: 70Mb L: 13/35 MS: 1 EraseBytes- 00:07:47.033 [2024-11-30 00:06:12.428348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000a7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.033 [2024-11-30 
00:06:12.428380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.034 [2024-11-30 00:06:12.428502] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.428520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.034 [2024-11-30 00:06:12.428647] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.428666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.034 [2024-11-30 00:06:12.428792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.428810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.034 #77 NEW cov: 11923 ft: 15340 corp: 37/770b lim: 35 exec/s: 77 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:47.034 [2024-11-30 00:06:12.477958] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NON OPERATIONAL POWER STATE CONFIG cid:4 cdw10:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.477987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.034 [2024-11-30 00:06:12.478112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NON OPERATIONAL POWER STATE CONFIG cid:5 cdw10:00000011 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.478134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.034 [2024-11-30 00:06:12.478263] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000094 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.478281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.034 #78 NEW cov: 11923 ft: 15349 corp: 38/795b lim: 35 exec/s: 78 rss: 70Mb L: 25/35 MS: 1 PersAutoDict- DE: "\001\224G\006.\025\376\304"- 00:07:47.034 [2024-11-30 00:06:12.517499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:4 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.517537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.034 #79 NEW cov: 11923 ft: 15363 corp: 39/803b lim: 35 exec/s: 79 rss: 70Mb L: 8/35 MS: 1 ChangeBit- 00:07:47.034 [2024-11-30 00:06:12.568371] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.568400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.034 [2024-11-30 00:06:12.568520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000020 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.034 [2024-11-30 00:06:12.568548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.034 #80 NEW cov: 11923 ft: 15383 corp: 40/825b lim: 35 exec/s: 80 rss: 70Mb L: 22/35 MS: 1 CMP- DE: " \000"- 00:07:47.292 [2024-11-30 00:06:12.608185] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:47.292 [2024-11-30 00:06:12.608214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.292 #81 NEW cov: 11923 ft: 15387 corp: 41/840b lim: 35 exec/s: 40 rss: 70Mb L: 15/35 MS: 1 PersAutoDict- DE: " \000"- 00:07:47.292 #81 DONE cov: 11923 ft: 15387 corp: 41/840b lim: 35 exec/s: 40 rss: 70Mb 00:07:47.292 ###### Recommended dictionary. ###### 00:07:47.292 "\315x\341\223\000G\224\000" # Uses: 1 00:07:47.292 "\001\224G\006.\025\376\304" # Uses: 1 00:07:47.292 "\343\307 \002\000\000\000\000" # Uses: 0 00:07:47.292 "\001\000" # Uses: 0 00:07:47.292 " \000" # Uses: 1 00:07:47.292 ###### End of recommended dictionary. ###### 00:07:47.292 Done 81 runs in 2 second(s) 00:07:47.292 00:06:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:07:47.292 00:06:12 -- ../common.sh@72 -- # (( i++ )) 00:07:47.292 00:06:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.292 00:06:12 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:47.292 00:06:12 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:47.292 00:06:12 -- nvmf/run.sh@24 -- # local timen=1 00:07:47.292 00:06:12 -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.292 00:06:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:47.292 00:06:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:47.292 00:06:12 -- nvmf/run.sh@29 -- # printf %02d 15 00:07:47.292 00:06:12 -- nvmf/run.sh@29 -- # port=4415 00:07:47.292 00:06:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:47.292 00:06:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:47.292 00:06:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.292 
00:06:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:07:47.292 [2024-11-30 00:06:12.798998] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:47.292 [2024-11-30 00:06:12.799065] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2727905 ] 00:07:47.292 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.551 [2024-11-30 00:06:12.985199] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.551 [2024-11-30 00:06:13.052049] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.551 [2024-11-30 00:06:13.052185] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.812 [2024-11-30 00:06:13.110261] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.812 [2024-11-30 00:06:13.126592] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:47.812 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:47.812 INFO: Seed: 1189386211 00:07:47.812 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:47.812 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:47.812 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:47.812 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.812 #2 INITED exec/s: 0 rss: 60Mb 00:07:47.812 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:47.812 This may also happen if the target rejected all inputs we tried so far 00:07:47.812 [2024-11-30 00:06:13.171293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.812 [2024-11-30 00:06:13.171327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.812 [2024-11-30 00:06:13.171360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.812 [2024-11-30 00:06:13.171375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.812 [2024-11-30 00:06:13.171404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.812 [2024-11-30 00:06:13.171418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.070 NEW_FUNC[1/670]: 0x44fe48 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:48.070 NEW_FUNC[2/670]: 0x476e88 in TestOneInput 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.070 #12 NEW cov: 11559 ft: 11556 corp: 2/28b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 5 InsertByte-ChangeBit-CopyPart-CMP-InsertRepeatedBytes- DE: ">\000"- 00:07:48.071 [2024-11-30 00:06:13.492082] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.492120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.071 [2024-11-30 00:06:13.492153] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.492168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.071 [2024-11-30 00:06:13.492197] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.492211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.071 [2024-11-30 00:06:13.492239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.492253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.071 #13 NEW cov: 11672 ft: 12378 corp: 3/56b lim: 35 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 InsertByte- 00:07:48.071 [2024-11-30 00:06:13.562193] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.562224] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.071 [2024-11-30 00:06:13.562256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.562275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.071 [2024-11-30 00:06:13.562304] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.562318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.071 [2024-11-30 00:06:13.562346] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.562361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.071 #14 NEW cov: 11678 ft: 12772 corp: 4/85b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 CrossOver- 00:07:48.071 [2024-11-30 00:06:13.622334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.622365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.071 [2024-11-30 00:06:13.622400] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.622417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.071 [2024-11-30 
00:06:13.622447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.071 [2024-11-30 00:06:13.622463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.330 #15 NEW cov: 11763 ft: 13038 corp: 5/112b lim: 35 exec/s: 0 rss: 68Mb L: 27/29 MS: 1 CopyPart- 00:07:48.330 [2024-11-30 00:06:13.682394] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.682424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.682456] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.682471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.682499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.682514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.330 #16 NEW cov: 11763 ft: 13124 corp: 6/139b lim: 35 exec/s: 0 rss: 68Mb L: 27/29 MS: 1 ChangeByte- 00:07:48.330 [2024-11-30 00:06:13.742643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.742673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.742705] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.742736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.742766] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.742781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.742810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.742829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.330 #17 NEW cov: 11763 ft: 13234 corp: 7/167b lim: 35 exec/s: 0 rss: 68Mb L: 28/29 MS: 1 PersAutoDict- DE: ">\000"- 00:07:48.330 [2024-11-30 00:06:13.792789] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.792819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.792851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.792867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.792895] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:48.330 [2024-11-30 00:06:13.792910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.792955] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.792970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.330 #18 NEW cov: 11763 ft: 13391 corp: 8/195b lim: 35 exec/s: 0 rss: 68Mb L: 28/29 MS: 1 InsertByte- 00:07:48.330 [2024-11-30 00:06:13.852937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.852970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.853004] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.853021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.853051] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.853066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.330 [2024-11-30 00:06:13.853096] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.330 [2024-11-30 00:06:13.853111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:07:48.590 #19 NEW cov: 11763 ft: 13446 corp: 9/224b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 CrossOver- 00:07:48.590 [2024-11-30 00:06:13.903059] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.590 [2024-11-30 00:06:13.903088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.590 [2024-11-30 00:06:13.903121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.590 [2024-11-30 00:06:13.903136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.590 [2024-11-30 00:06:13.903164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.590 [2024-11-30 00:06:13.903178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.590 [2024-11-30 00:06:13.903206] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.590 [2024-11-30 00:06:13.903224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.590 #20 NEW cov: 11763 ft: 13562 corp: 10/252b lim: 35 exec/s: 0 rss: 68Mb L: 28/29 MS: 1 ChangeBinInt- 00:07:48.590 [2024-11-30 00:06:13.973275] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:13.973305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 
00:06:13.973337] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:13.973352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 00:06:13.973380] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:13.973394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 00:06:13.973423] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:13.973437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.591 #21 NEW cov: 11763 ft: 13601 corp: 11/281b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 PersAutoDict- DE: ">\000"- 00:07:48.591 [2024-11-30 00:06:14.043428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:14.043460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 00:06:14.043492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:14.043522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 00:06:14.043552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:14.043567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 00:06:14.043604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:14.043619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.591 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:48.591 #22 NEW cov: 11780 ft: 13677 corp: 12/309b lim: 35 exec/s: 0 rss: 69Mb L: 28/29 MS: 1 ChangeBinInt- 00:07:48.591 [2024-11-30 00:06:14.113656] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:14.113698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 00:06:14.113730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:14.113745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 00:06:14.113773] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.591 [2024-11-30 00:06:14.113791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.591 [2024-11-30 00:06:14.113819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:48.591 [2024-11-30 00:06:14.113833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.851 #23 NEW cov: 11780 ft: 13709 corp: 13/338b lim: 35 exec/s: 23 rss: 69Mb L: 29/29 MS: 1 ShuffleBytes- 00:07:48.851 [2024-11-30 00:06:14.183830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.183861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.851 [2024-11-30 00:06:14.183895] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.183911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.851 [2024-11-30 00:06:14.183941] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.183956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.851 [2024-11-30 00:06:14.183985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.184000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.851 #24 NEW cov: 11780 ft: 13752 corp: 14/367b lim: 35 exec/s: 24 rss: 69Mb L: 29/29 MS: 1 CopyPart- 00:07:48.851 [2024-11-30 00:06:14.253932] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 
00:06:14.253963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.851 [2024-11-30 00:06:14.253995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.254009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.851 [2024-11-30 00:06:14.254038] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.254052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.851 #25 NEW cov: 11780 ft: 13818 corp: 15/392b lim: 35 exec/s: 25 rss: 69Mb L: 25/29 MS: 1 EraseBytes- 00:07:48.851 [2024-11-30 00:06:14.304103] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.304134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.851 [2024-11-30 00:06:14.304166] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.304196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.851 [2024-11-30 00:06:14.304226] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.304241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.851 
[2024-11-30 00:06:14.304270] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.304289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.851 #26 NEW cov: 11780 ft: 13878 corp: 16/422b lim: 35 exec/s: 26 rss: 69Mb L: 30/30 MS: 1 InsertByte- 00:07:48.851 [2024-11-30 00:06:14.374172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.851 [2024-11-30 00:06:14.374203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.111 #27 NEW cov: 11780 ft: 14316 corp: 17/435b lim: 35 exec/s: 27 rss: 69Mb L: 13/30 MS: 1 EraseBytes- 00:07:49.111 [2024-11-30 00:06:14.444493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.111 [2024-11-30 00:06:14.444524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.111 [2024-11-30 00:06:14.444556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.111 [2024-11-30 00:06:14.444571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.111 [2024-11-30 00:06:14.444607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.111 [2024-11-30 00:06:14.444621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.111 [2024-11-30 00:06:14.444650] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.444664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.112 #28 NEW cov: 11780 ft: 14335 corp: 18/464b lim: 35 exec/s: 28 rss: 69Mb L: 29/30 MS: 1 ShuffleBytes- 00:07:49.112 [2024-11-30 00:06:14.494559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.494590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.494630] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.494646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.494674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.494689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.494717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000372 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.494731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.112 #29 NEW cov: 11780 ft: 14377 corp: 19/495b lim: 35 exec/s: 29 rss: 69Mb L: 31/31 MS: 1 CopyPart- 00:07:49.112 [2024-11-30 00:06:14.564788] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.564818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.564851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.564882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.564916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.564931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.564961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000006b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.564976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.112 #30 NEW cov: 11780 ft: 14450 corp: 20/523b lim: 35 exec/s: 30 rss: 69Mb L: 28/31 MS: 1 ChangeByte- 00:07:49.112 [2024-11-30 00:06:14.635000] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.635031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.635065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 
00:06:14.635081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.635110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.635125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.112 [2024-11-30 00:06:14.635155] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.112 [2024-11-30 00:06:14.635170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.372 #31 NEW cov: 11780 ft: 14460 corp: 21/557b lim: 35 exec/s: 31 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:49.372 [2024-11-30 00:06:14.684967] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.372 [2024-11-30 00:06:14.684996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.372 [2024-11-30 00:06:14.685028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.372 [2024-11-30 00:06:14.685043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.372 #32 NEW cov: 11780 ft: 14684 corp: 22/575b lim: 35 exec/s: 32 rss: 69Mb L: 18/34 MS: 1 EraseBytes- 00:07:49.372 [2024-11-30 00:06:14.745216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.372 [2024-11-30 00:06:14.745246] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.372 [2024-11-30 00:06:14.745278] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.372 [2024-11-30 00:06:14.745293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.372 [2024-11-30 00:06:14.745321] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.372 [2024-11-30 00:06:14.745335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.372 [2024-11-30 00:06:14.745363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.372 [2024-11-30 00:06:14.745383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.372 #33 NEW cov: 11780 ft: 14697 corp: 23/606b lim: 35 exec/s: 33 rss: 69Mb L: 31/34 MS: 1 CopyPart- 00:07:49.372 [2024-11-30 00:06:14.786373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.372 [2024-11-30 00:06:14.786400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.372 [2024-11-30 00:06:14.786475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.372 [2024-11-30 00:06:14.786496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.786567] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.786583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.786663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.786684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.786755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.786771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.373 #34 NEW cov: 11780 ft: 14868 corp: 24/641b lim: 35 exec/s: 34 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:49.373 [2024-11-30 00:06:14.826363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.826390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.826465] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.826484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.826555] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:49.373 [2024-11-30 00:06:14.826572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.826666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.826683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.373 #35 NEW cov: 11780 ft: 15006 corp: 25/670b lim: 35 exec/s: 35 rss: 69Mb L: 29/35 MS: 1 ChangeByte- 00:07:49.373 [2024-11-30 00:06:14.876691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.876718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.876793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.876815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.876890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.876913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.876986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.877004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.877078] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.877096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.373 #36 NEW cov: 11780 ft: 15020 corp: 26/705b lim: 35 exec/s: 36 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:49.373 [2024-11-30 00:06:14.926707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.926734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.926811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.926832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.926906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.926922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.373 [2024-11-30 00:06:14.926992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000169 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.373 [2024-11-30 00:06:14.927009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.631 #37 NEW cov: 11780 ft: 15027 corp: 27/739b lim: 35 exec/s: 37 rss: 69Mb L: 34/35 MS: 1 ShuffleBytes- 00:07:49.631 [2024-11-30 
00:06:14.966676] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.631 [2024-11-30 00:06:14.966702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.631 [2024-11-30 00:06:14.966776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.631 [2024-11-30 00:06:14.966796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.631 [2024-11-30 00:06:14.966870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.631 [2024-11-30 00:06:14.966886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.631 #39 NEW cov: 11780 ft: 15036 corp: 28/763b lim: 35 exec/s: 39 rss: 69Mb L: 24/35 MS: 2 ShuffleBytes-CrossOver- 00:07:49.631 [2024-11-30 00:06:15.006888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.631 [2024-11-30 00:06:15.006914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.631 [2024-11-30 00:06:15.006988] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.631 [2024-11-30 00:06:15.007010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.631 [2024-11-30 00:06:15.007079] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:49.631 [2024-11-30 00:06:15.007099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.631 [2024-11-30 00:06:15.007172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.631 [2024-11-30 00:06:15.007189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.631 #40 NEW cov: 11780 ft: 15047 corp: 29/793b lim: 35 exec/s: 40 rss: 69Mb L: 30/35 MS: 1 InsertByte- 00:07:49.631 [2024-11-30 00:06:15.046929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.631 [2024-11-30 00:06:15.046955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.047029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.047050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.047121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.047137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.632 #41 NEW cov: 11787 ft: 15081 corp: 30/820b lim: 35 exec/s: 41 rss: 69Mb L: 27/35 MS: 1 PersAutoDict- DE: ">\000"- 00:07:49.632 [2024-11-30 00:06:15.087301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:49.632 [2024-11-30 00:06:15.087327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.087403] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.087423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.087495] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.087511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.087582] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.087602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.087693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.087709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.632 #42 NEW cov: 11787 ft: 15089 corp: 31/855b lim: 35 exec/s: 42 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:07:49.632 [2024-11-30 00:06:15.127321] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.127348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.127423] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.127443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.127516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.127533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.127603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.127620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.632 #43 NEW cov: 11787 ft: 15093 corp: 32/889b lim: 35 exec/s: 43 rss: 69Mb L: 34/35 MS: 1 CrossOver- 00:07:49.632 [2024-11-30 00:06:15.167126] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.167152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.632 [2024-11-30 00:06:15.167226] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.632 [2024-11-30 00:06:15.167245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.890 #44 NEW cov: 11787 ft: 15112 corp: 33/908b lim: 35 exec/s: 22 rss: 69Mb L: 19/35 MS: 1 InsertByte- 00:07:49.890 #44 DONE cov: 
11787 ft: 15112 corp: 33/908b lim: 35 exec/s: 22 rss: 69Mb 00:07:49.890 ###### Recommended dictionary. ###### 00:07:49.890 ">\000" # Uses: 3 00:07:49.890 ###### End of recommended dictionary. ###### 00:07:49.890 Done 44 runs in 2 second(s) 00:07:49.890 00:06:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:07:49.890 00:06:15 -- ../common.sh@72 -- # (( i++ )) 00:07:49.890 00:06:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.890 00:06:15 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:07:49.890 00:06:15 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:07:49.890 00:06:15 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.890 00:06:15 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.890 00:06:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:49.890 00:06:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:07:49.890 00:06:15 -- nvmf/run.sh@29 -- # printf %02d 16 00:07:49.890 00:06:15 -- nvmf/run.sh@29 -- # port=4416 00:07:49.890 00:06:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:49.890 00:06:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:07:49.890 00:06:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.890 00:06:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:07:49.890 [2024-11-30 00:06:15.351649] Starting SPDK 
v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:49.890 [2024-11-30 00:06:15.351720] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2728310 ] 00:07:49.890 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.149 [2024-11-30 00:06:15.528994] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.149 [2024-11-30 00:06:15.592740] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.149 [2024-11-30 00:06:15.592871] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.149 [2024-11-30 00:06:15.651748] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.149 [2024-11-30 00:06:15.668126] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:07:50.149 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.149 INFO: Seed: 3730410759 00:07:50.408 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:50.408 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:50.408 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:50.408 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.408 #2 INITED exec/s: 0 rss: 61Mb 00:07:50.408 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:50.408 This may also happen if the target rejected all inputs we tried so far 00:07:50.408 [2024-11-30 00:06:15.744405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.408 [2024-11-30 00:06:15.744443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.408 [2024-11-30 00:06:15.744567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.408 [2024-11-30 00:06:15.744592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.408 [2024-11-30 00:06:15.744727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.408 [2024-11-30 00:06:15.744751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.408 [2024-11-30 00:06:15.744873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.408 [2024-11-30 00:06:15.744892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.667 NEW_FUNC[1/671]: 0x451308 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:50.667 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.667 #19 NEW cov: 11663 ft: 11664 corp: 2/93b lim: 105 exec/s: 0 rss: 68Mb L: 92/92 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:07:50.667 [2024-11-30 00:06:16.075302] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.667 [2024-11-30 00:06:16.075344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.667 [2024-11-30 00:06:16.075470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.667 [2024-11-30 00:06:16.075494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.667 [2024-11-30 00:06:16.075624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.667 [2024-11-30 00:06:16.075649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.667 [2024-11-30 00:06:16.075768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.075792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.668 #20 NEW cov: 11776 ft: 12255 corp: 3/186b lim: 105 exec/s: 0 rss: 68Mb L: 93/93 MS: 1 InsertByte- 00:07:50.668 [2024-11-30 00:06:16.124710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9223372599495491584 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.124741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.668 #23 NEW cov: 11782 ft: 13202 corp: 4/216b lim: 105 exec/s: 0 rss: 68Mb L: 30/93 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 
00:07:50.668 [2024-11-30 00:06:16.165433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.165465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.668 [2024-11-30 00:06:16.165548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.165568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.668 [2024-11-30 00:06:16.165691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.165710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.668 [2024-11-30 00:06:16.165829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.165848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.668 #24 NEW cov: 11867 ft: 13572 corp: 5/308b lim: 105 exec/s: 0 rss: 68Mb L: 92/93 MS: 1 ChangeBinInt- 00:07:50.668 [2024-11-30 00:06:16.205589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.205636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.668 [2024-11-30 00:06:16.205728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 
lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.205751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.668 [2024-11-30 00:06:16.205869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.205891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.668 [2024-11-30 00:06:16.206009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.668 [2024-11-30 00:06:16.206031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.927 #25 NEW cov: 11867 ft: 13647 corp: 6/409b lim: 105 exec/s: 0 rss: 68Mb L: 101/101 MS: 1 CopyPart- 00:07:50.927 [2024-11-30 00:06:16.245669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.245700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.245793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.245816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.245935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.245959] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.246077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.246100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.927 #26 NEW cov: 11867 ft: 13726 corp: 7/513b lim: 105 exec/s: 0 rss: 69Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:07:50.927 [2024-11-30 00:06:16.285637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8897841257187605371 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.285666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.285761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8897841259083430779 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.285783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.285901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:8897841259083430779 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.285922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.927 #32 NEW cov: 11867 ft: 14122 corp: 8/592b lim: 105 exec/s: 0 rss: 69Mb L: 79/104 MS: 1 InsertRepeatedBytes- 00:07:50.927 [2024-11-30 00:06:16.325917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.325947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.326035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.326056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.326173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.326195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.326315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4294901760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.326334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.927 #33 NEW cov: 11867 ft: 14220 corp: 9/690b lim: 105 exec/s: 0 rss: 69Mb L: 98/104 MS: 1 InsertRepeatedBytes- 00:07:50.927 [2024-11-30 00:06:16.366068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.366100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.366167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.366189] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.366311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.366331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.366447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.366467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.927 #34 NEW cov: 11867 ft: 14265 corp: 10/782b lim: 105 exec/s: 0 rss: 69Mb L: 92/104 MS: 1 ShuffleBytes- 00:07:50.927 [2024-11-30 00:06:16.406152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.406183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.406265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.406286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.927 [2024-11-30 00:06:16.406401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.927 [2024-11-30 00:06:16.406422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.928 [2024-11-30 
00:06:16.406537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:13792273858822144 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.928 [2024-11-30 00:06:16.406556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.928 #35 NEW cov: 11867 ft: 14285 corp: 11/875b lim: 105 exec/s: 0 rss: 69Mb L: 93/104 MS: 1 InsertByte- 00:07:50.928 [2024-11-30 00:06:16.445912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:101155069755392 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.928 [2024-11-30 00:06:16.445943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.928 [2024-11-30 00:06:16.446058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.928 [2024-11-30 00:06:16.446080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.928 #36 NEW cov: 11867 ft: 14581 corp: 12/936b lim: 105 exec/s: 0 rss: 69Mb L: 61/104 MS: 1 EraseBytes- 00:07:51.187 [2024-11-30 00:06:16.486335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.187 [2024-11-30 00:06:16.486368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.187 [2024-11-30 00:06:16.486475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.187 [2024-11-30 00:06:16.486497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:07:51.187 [2024-11-30 00:06:16.486619] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:8897841259083430779 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.187 [2024-11-30 00:06:16.486641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.187 #37 NEW cov: 11867 ft: 14596 corp: 13/1015b lim: 105 exec/s: 0 rss: 69Mb L: 79/104 MS: 1 CrossOver- 00:07:51.187 [2024-11-30 00:06:16.526624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.187 [2024-11-30 00:06:16.526656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.187 [2024-11-30 00:06:16.526741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.187 [2024-11-30 00:06:16.526762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.187 [2024-11-30 00:06:16.526872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.187 [2024-11-30 00:06:16.526894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.187 [2024-11-30 00:06:16.527017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.527036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.188 #38 NEW cov: 11867 ft: 14630 corp: 14/1112b lim: 105 exec/s: 0 rss: 69Mb L: 97/104 MS: 1 
CrossOver- 00:07:51.188 [2024-11-30 00:06:16.566625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.566655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.566748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.566767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.566883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.566908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.567027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.567049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.188 #39 NEW cov: 11867 ft: 14691 corp: 15/1216b lim: 105 exec/s: 0 rss: 69Mb L: 104/104 MS: 1 ChangeBinInt- 00:07:51.188 [2024-11-30 00:06:16.606698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8897841257187605371 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.606727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.606830] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8897841257808362363 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.606851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.606970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:8897841259083430779 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.606992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.188 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.188 #40 NEW cov: 11890 ft: 14769 corp: 16/1295b lim: 105 exec/s: 0 rss: 69Mb L: 79/104 MS: 1 ChangeByte- 00:07:51.188 [2024-11-30 00:06:16.646744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.646777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.646898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.646918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.647042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.647062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.188 #41 NEW cov: 11890 ft: 14782 
corp: 17/1368b lim: 105 exec/s: 0 rss: 69Mb L: 73/104 MS: 1 EraseBytes- 00:07:51.188 [2024-11-30 00:06:16.686997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.687031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.687146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.687168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.687284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:208 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.687303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.687418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.687441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.188 #42 NEW cov: 11890 ft: 14797 corp: 18/1472b lim: 105 exec/s: 0 rss: 69Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:07:51.188 [2024-11-30 00:06:16.727163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.727197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.727271] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.727291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.727409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:208 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.727431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.188 [2024-11-30 00:06:16.727555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.188 [2024-11-30 00:06:16.727582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.448 #43 NEW cov: 11890 ft: 14807 corp: 19/1576b lim: 105 exec/s: 43 rss: 69Mb L: 104/104 MS: 1 ChangeBit- 00:07:51.448 [2024-11-30 00:06:16.767489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.448 [2024-11-30 00:06:16.767524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.448 [2024-11-30 00:06:16.767606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.448 [2024-11-30 00:06:16.767627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.448 [2024-11-30 00:06:16.767724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:31489 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:51.448 [2024-11-30 00:06:16.767748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.448 [2024-11-30 00:06:16.767854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.448 [2024-11-30 00:06:16.767876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.448 [2024-11-30 00:06:16.767980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.448 [2024-11-30 00:06:16.767999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:51.448 #44 NEW cov: 11890 ft: 14842 corp: 20/1681b lim: 105 exec/s: 44 rss: 69Mb L: 105/105 MS: 1 CrossOver- 00:07:51.448 [2024-11-30 00:06:16.817439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.448 [2024-11-30 00:06:16.817468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.448 [2024-11-30 00:06:16.817563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.448 [2024-11-30 00:06:16.817582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.448 [2024-11-30 00:06:16.817690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:208 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.817710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.817822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.817844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.449 #45 NEW cov: 11890 ft: 14856 corp: 21/1765b lim: 105 exec/s: 45 rss: 69Mb L: 84/105 MS: 1 CrossOver- 00:07:51.449 [2024-11-30 00:06:16.857550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.857580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.857666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.857687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.857793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:64 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.857815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.857929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4294901760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.857949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.449 #46 NEW cov: 11890 ft: 14881 
corp: 22/1863b lim: 105 exec/s: 46 rss: 69Mb L: 98/105 MS: 1 ChangeBit- 00:07:51.449 [2024-11-30 00:06:16.907614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.907646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.907726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.907747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.907848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:64 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.907869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.907978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4294901760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.907999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.449 #47 NEW cov: 11890 ft: 14889 corp: 23/1961b lim: 105 exec/s: 47 rss: 70Mb L: 98/105 MS: 1 ShuffleBytes- 00:07:51.449 [2024-11-30 00:06:16.957654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.957684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 
00:06:16.957771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.957788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.957904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.957922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.958032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.958054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.958172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.958195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:51.449 #48 NEW cov: 11890 ft: 14915 corp: 24/2066b lim: 105 exec/s: 48 rss: 70Mb L: 105/105 MS: 1 CopyPart- 00:07:51.449 [2024-11-30 00:06:16.997868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.997896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.997973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.997993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.998028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.998052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.449 [2024-11-30 00:06:16.998171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4294901760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.449 [2024-11-30 00:06:16.998190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.708 #49 NEW cov: 11890 ft: 14928 corp: 25/2164b lim: 105 exec/s: 49 rss: 70Mb L: 98/105 MS: 1 ChangeByte- 00:07:51.708 [2024-11-30 00:06:17.038074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.038103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.038196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.038217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.038319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.038339] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.038461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.038481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.708 #50 NEW cov: 11890 ft: 14936 corp: 26/2256b lim: 105 exec/s: 50 rss: 70Mb L: 92/105 MS: 1 EraseBytes- 00:07:51.708 [2024-11-30 00:06:17.087401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:101155069755392 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.087432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.087509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.087528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.708 #51 NEW cov: 11890 ft: 14945 corp: 27/2317b lim: 105 exec/s: 51 rss: 70Mb L: 61/105 MS: 1 ShuffleBytes- 00:07:51.708 [2024-11-30 00:06:17.138149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.138180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.138256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.138271] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.138373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.138391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.138501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2071689984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.138521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.708 #52 NEW cov: 11890 ft: 14951 corp: 28/2413b lim: 105 exec/s: 52 rss: 70Mb L: 96/105 MS: 1 CrossOver- 00:07:51.708 [2024-11-30 00:06:17.178375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.178404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.178492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.178508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.178623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4611686018427387904 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.178645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.178758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.178778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.708 #53 NEW cov: 11890 ft: 14983 corp: 29/2512b lim: 105 exec/s: 53 rss: 70Mb L: 99/105 MS: 1 CopyPart- 00:07:51.708 [2024-11-30 00:06:17.228604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.228635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.228727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:558345748480 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.228748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.228844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.228865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.708 [2024-11-30 00:06:17.228983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.708 [2024-11-30 00:06:17.229005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.708 #54 NEW cov: 11890 ft: 15018 corp: 30/2605b lim: 105 exec/s: 54 rss: 
70Mb L: 93/105 MS: 1 CrossOver- 00:07:51.969 [2024-11-30 00:06:17.269115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.269145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.269233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.269248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.269355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.269373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.269488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.269508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.269617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:5308416 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.269639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:51.969 #55 NEW cov: 11890 ft: 15030 corp: 31/2710b lim: 105 exec/s: 55 rss: 70Mb L: 105/105 MS: 1 InsertByte- 00:07:51.969 [2024-11-30 00:06:17.318092] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:101155069755392 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.318122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.318179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.318199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.969 #56 NEW cov: 11890 ft: 15041 corp: 32/2771b lim: 105 exec/s: 56 rss: 70Mb L: 61/105 MS: 1 ChangeBinInt- 00:07:51.969 [2024-11-30 00:06:17.359125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.359155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.359230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:558345748480 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.359250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.359302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.359321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.359425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.359445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.359553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.359571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:51.969 #57 NEW cov: 11890 ft: 15049 corp: 33/2876b lim: 105 exec/s: 57 rss: 70Mb L: 105/105 MS: 1 CrossOver- 00:07:51.969 [2024-11-30 00:06:17.398768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.398797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.969 [2024-11-30 00:06:17.398879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.969 [2024-11-30 00:06:17.398893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.970 [2024-11-30 00:06:17.398988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4092851187 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.399008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.970 [2024-11-30 00:06:17.399115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.399133] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.970 #58 NEW cov: 11890 ft: 15071 corp: 34/2961b lim: 105 exec/s: 58 rss: 70Mb L: 85/105 MS: 1 InsertRepeatedBytes- 00:07:51.970 [2024-11-30 00:06:17.438926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:101155069755392 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.438955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.970 [2024-11-30 00:06:17.439011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.439031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.970 #59 NEW cov: 11890 ft: 15081 corp: 35/3022b lim: 105 exec/s: 59 rss: 70Mb L: 61/105 MS: 1 ChangeBinInt- 00:07:51.970 [2024-11-30 00:06:17.478991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.479020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.970 [2024-11-30 00:06:17.479086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.479104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.970 [2024-11-30 00:06:17.479214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.479235] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.970 #60 NEW cov: 11890 ft: 15108 corp: 36/3095b lim: 105 exec/s: 60 rss: 70Mb L: 73/105 MS: 1 CopyPart- 00:07:51.970 [2024-11-30 00:06:17.519400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.519428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.970 [2024-11-30 00:06:17.519523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.519543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.970 [2024-11-30 00:06:17.519650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:208 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.519672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.970 [2024-11-30 00:06:17.519782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:16777216 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.970 [2024-11-30 00:06:17.519802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.240 #61 NEW cov: 11890 ft: 15141 corp: 37/3179b lim: 105 exec/s: 61 rss: 70Mb L: 84/105 MS: 1 ChangeBit- 00:07:52.240 [2024-11-30 00:06:17.559551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.559579] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.559664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.559678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.559788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:144115188109541890 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.559810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.559928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744069414584575 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.559951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.240 #67 NEW cov: 11890 ft: 15174 corp: 38/3283b lim: 105 exec/s: 67 rss: 70Mb L: 104/105 MS: 1 InsertRepeatedBytes- 00:07:52.240 [2024-11-30 00:06:17.599649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.599678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.599755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.599775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.599838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.599858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.599967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.599986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.240 #68 NEW cov: 11890 ft: 15176 corp: 39/3376b lim: 105 exec/s: 68 rss: 70Mb L: 93/105 MS: 1 ChangeBinInt- 00:07:52.240 [2024-11-30 00:06:17.639525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8897841257187605371 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.639554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.639607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8897841257808362363 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.639628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.639737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:8897841259083430779 len:31612 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.639759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.240 #69 
NEW cov: 11890 ft: 15189 corp: 40/3455b lim: 105 exec/s: 69 rss: 70Mb L: 79/105 MS: 1 ChangeBinInt- 00:07:52.240 [2024-11-30 00:06:17.679502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.679531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.679569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.679584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.240 #70 NEW cov: 11890 ft: 15195 corp: 41/3513b lim: 105 exec/s: 70 rss: 70Mb L: 58/105 MS: 1 EraseBytes- 00:07:52.240 [2024-11-30 00:06:17.720169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.720202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.720257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.720279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.720382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.720404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 
00:06:17.720515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.720536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.240 [2024-11-30 00:06:17.720650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.240 [2024-11-30 00:06:17.720671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:52.240 #71 NEW cov: 11890 ft: 15198 corp: 42/3618b lim: 105 exec/s: 35 rss: 70Mb L: 105/105 MS: 1 ChangeByte- 00:07:52.240 #71 DONE cov: 11890 ft: 15198 corp: 42/3618b lim: 105 exec/s: 35 rss: 70Mb 00:07:52.240 Done 71 runs in 2 second(s) 00:07:52.507 00:06:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:07:52.507 00:06:17 -- ../common.sh@72 -- # (( i++ )) 00:07:52.507 00:06:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.507 00:06:17 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:07:52.507 00:06:17 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:07:52.507 00:06:17 -- nvmf/run.sh@24 -- # local timen=1 00:07:52.507 00:06:17 -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.507 00:06:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:52.507 00:06:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:07:52.507 00:06:17 -- nvmf/run.sh@29 -- # printf %02d 17 00:07:52.507 00:06:17 -- nvmf/run.sh@29 -- # port=4417 00:07:52.507 00:06:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:52.507 00:06:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:07:52.507 
00:06:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.507 00:06:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:07:52.507 [2024-11-30 00:06:17.906961] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:52.507 [2024-11-30 00:06:17.907050] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2728847 ] 00:07:52.507 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.766 [2024-11-30 00:06:18.084629] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.766 [2024-11-30 00:06:18.147958] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.766 [2024-11-30 00:06:18.148108] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.766 [2024-11-30 00:06:18.206092] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.766 [2024-11-30 00:06:18.222437] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:07:52.766 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:52.766 INFO: Seed: 1990407540 00:07:52.766 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:52.766 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:52.766 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:52.766 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.766 #2 INITED exec/s: 0 rss: 60Mb 00:07:52.766 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:52.766 This may also happen if the target rejected all inputs we tried so far 00:07:52.766 [2024-11-30 00:06:18.288710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.766 [2024-11-30 00:06:18.288745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.766 [2024-11-30 00:06:18.288842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.767 [2024-11-30 00:06:18.288862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.767 [2024-11-30 00:06:18.288967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.767 [2024-11-30 00:06:18.288987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.031 NEW_FUNC[1/671]: 0x4545f8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:53.031 NEW_FUNC[2/671]: 0x476e88 in TestOneInput 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.290 #8 NEW cov: 11661 ft: 11685 corp: 2/87b lim: 120 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:07:53.291 [2024-11-30 00:06:18.619511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.619560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.619639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.619661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.619761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.619782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.291 NEW_FUNC[1/1]: 0x1947fa8 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:07:53.291 #9 NEW cov: 11797 ft: 12233 corp: 3/174b lim: 120 exec/s: 0 rss: 68Mb L: 87/87 MS: 1 InsertByte- 00:07:53.291 [2024-11-30 00:06:18.669556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.669588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.669674] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.669696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.669802] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.669827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.291 #10 NEW cov: 11803 ft: 12505 corp: 4/261b lim: 120 exec/s: 0 rss: 68Mb L: 87/87 MS: 1 CrossOver- 00:07:53.291 [2024-11-30 00:06:18.709756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.709786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.709818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.709834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.709934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.709956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.291 #11 NEW cov: 11888 ft: 12764 corp: 5/352b lim: 120 exec/s: 0 rss: 68Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:07:53.291 [2024-11-30 00:06:18.750027] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.750057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.750124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.750143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.750248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.750271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.750374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16927600444109941482 len:60139 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.750394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.291 #12 NEW cov: 11888 ft: 13216 corp: 6/469b lim: 120 exec/s: 0 rss: 68Mb L: 117/117 MS: 1 InsertRepeatedBytes- 00:07:53.291 [2024-11-30 00:06:18.800190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.800221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.800310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 
nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.800330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.800441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.800462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.800573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16927600444109941482 len:60139 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.800593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.291 #13 NEW cov: 11888 ft: 13333 corp: 7/586b lim: 120 exec/s: 0 rss: 68Mb L: 117/117 MS: 1 ChangeByte- 00:07:53.291 [2024-11-30 00:06:18.840091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.840121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.840169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 00:06:18.840188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.291 [2024-11-30 00:06:18.840295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.291 [2024-11-30 
00:06:18.840313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.551 #14 NEW cov: 11888 ft: 13392 corp: 8/664b lim: 120 exec/s: 0 rss: 68Mb L: 78/117 MS: 1 EraseBytes- 00:07:53.551 [2024-11-30 00:06:18.880119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10696049282777088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.551 [2024-11-30 00:06:18.880150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.551 [2024-11-30 00:06:18.880202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.551 [2024-11-30 00:06:18.880221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.551 [2024-11-30 00:06:18.880319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.551 [2024-11-30 00:06:18.880341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.551 #20 NEW cov: 11888 ft: 13438 corp: 9/752b lim: 120 exec/s: 0 rss: 68Mb L: 88/117 MS: 1 CrossOver- 00:07:53.551 [2024-11-30 00:06:18.920252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.551 [2024-11-30 00:06:18.920283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.551 [2024-11-30 00:06:18.920321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 
00:06:18.920339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.552 [2024-11-30 00:06:18.920432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:18.920451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.552 #21 NEW cov: 11888 ft: 13467 corp: 10/838b lim: 120 exec/s: 0 rss: 69Mb L: 86/117 MS: 1 ChangeByte- 00:07:53.552 [2024-11-30 00:06:18.959938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743253555347455 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:18.959970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.552 #24 NEW cov: 11888 ft: 14361 corp: 11/868b lim: 120 exec/s: 0 rss: 69Mb L: 30/117 MS: 3 InsertRepeatedBytes-InsertByte-CopyPart- 00:07:53.552 [2024-11-30 00:06:19.000475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.000509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.552 [2024-11-30 00:06:19.000585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.000605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.552 [2024-11-30 00:06:19.000713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:7 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.000733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.552 #25 NEW cov: 11888 ft: 14452 corp: 12/959b lim: 120 exec/s: 0 rss: 69Mb L: 91/117 MS: 1 ChangeBinInt- 00:07:53.552 [2024-11-30 00:06:19.050672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.050702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.552 [2024-11-30 00:06:19.050771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.050791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.552 [2024-11-30 00:06:19.050902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.050924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.552 #26 NEW cov: 11888 ft: 14472 corp: 13/1037b lim: 120 exec/s: 0 rss: 69Mb L: 78/117 MS: 1 ChangeBit- 00:07:53.552 [2024-11-30 00:06:19.100819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.100850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.552 [2024-11-30 00:06:19.100895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.100915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.552 [2024-11-30 00:06:19.101028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.552 [2024-11-30 00:06:19.101048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.812 #27 NEW cov: 11888 ft: 14509 corp: 14/1124b lim: 120 exec/s: 0 rss: 69Mb L: 87/117 MS: 1 ChangeBit- 00:07:53.812 [2024-11-30 00:06:19.140890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10696049282777088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.140922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.812 [2024-11-30 00:06:19.141013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.141037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.812 [2024-11-30 00:06:19.141149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.141171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.812 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.812 #28 NEW cov: 11911 ft: 14566 corp: 15/1212b lim: 120 exec/s: 0 rss: 69Mb L: 88/117 MS: 1 
ChangeBinInt- 00:07:53.812 [2024-11-30 00:06:19.191094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.191123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.812 [2024-11-30 00:06:19.191201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.191221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.812 [2024-11-30 00:06:19.191338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.191359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.812 #29 NEW cov: 11911 ft: 14642 corp: 16/1307b lim: 120 exec/s: 0 rss: 69Mb L: 95/117 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:07:53.812 [2024-11-30 00:06:19.230935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:11140386615177419418 len:39579 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.230965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.812 [2024-11-30 00:06:19.231028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:11140386617063807642 len:39579 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.231049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.812 #33 
NEW cov: 11911 ft: 14960 corp: 17/1369b lim: 120 exec/s: 0 rss: 69Mb L: 62/117 MS: 4 ChangeBit-CopyPart-CopyPart-InsertRepeatedBytes- 00:07:53.812 [2024-11-30 00:06:19.271230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10696049282777088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.271261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.812 [2024-11-30 00:06:19.271331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.812 [2024-11-30 00:06:19.271353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.812 [2024-11-30 00:06:19.271464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.813 [2024-11-30 00:06:19.271486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.813 #34 NEW cov: 11911 ft: 14976 corp: 18/1457b lim: 120 exec/s: 34 rss: 69Mb L: 88/117 MS: 1 CrossOver- 00:07:53.813 [2024-11-30 00:06:19.311687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.813 [2024-11-30 00:06:19.311717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.813 [2024-11-30 00:06:19.311769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.813 [2024-11-30 00:06:19.311788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.813 [2024-11-30 00:06:19.311897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.813 [2024-11-30 00:06:19.311920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.813 [2024-11-30 00:06:19.312030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.813 [2024-11-30 00:06:19.312052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.813 #35 NEW cov: 11911 ft: 14999 corp: 19/1565b lim: 120 exec/s: 35 rss: 69Mb L: 108/117 MS: 1 CrossOver- 00:07:53.813 [2024-11-30 00:06:19.351618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.813 [2024-11-30 00:06:19.351648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.813 [2024-11-30 00:06:19.351689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.813 [2024-11-30 00:06:19.351707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.813 [2024-11-30 00:06:19.351814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.813 [2024-11-30 00:06:19.351832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.072 #36 NEW cov: 11911 ft: 15040 corp: 20/1651b lim: 120 exec/s: 36 rss: 69Mb L: 
86/117 MS: 1 ChangeBit- 00:07:54.072 [2024-11-30 00:06:19.391103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167903232 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.391129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.072 #37 NEW cov: 11911 ft: 15100 corp: 21/1689b lim: 120 exec/s: 37 rss: 69Mb L: 38/117 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:07:54.072 [2024-11-30 00:06:19.431621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:11140386615177419418 len:39579 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.431649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.072 [2024-11-30 00:06:19.431740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:11140386479624854170 len:39579 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.431760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.072 #38 NEW cov: 11911 ft: 15111 corp: 22/1751b lim: 120 exec/s: 38 rss: 69Mb L: 62/117 MS: 1 ChangeByte- 00:07:54.072 [2024-11-30 00:06:19.471951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.471981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.072 [2024-11-30 00:06:19.472016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.472031] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.072 [2024-11-30 00:06:19.472135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.472157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.072 #39 NEW cov: 11911 ft: 15135 corp: 23/1846b lim: 120 exec/s: 39 rss: 69Mb L: 95/117 MS: 1 CMP- DE: "\002\000\000\000"- 00:07:54.072 [2024-11-30 00:06:19.511459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743253555347455 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.511485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.072 #40 NEW cov: 11911 ft: 15199 corp: 24/1876b lim: 120 exec/s: 40 rss: 69Mb L: 30/117 MS: 1 ShuffleBytes- 00:07:54.072 [2024-11-30 00:06:19.552200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.552229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.072 [2024-11-30 00:06:19.552310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.552333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.072 [2024-11-30 00:06:19.552438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:54.072 [2024-11-30 00:06:19.552461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.072 #41 NEW cov: 11911 ft: 15207 corp: 25/1971b lim: 120 exec/s: 41 rss: 69Mb L: 95/117 MS: 1 CMP- DE: "\202k\345\211\005G\224\000"- 00:07:54.072 [2024-11-30 00:06:19.592322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10696049282777088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.592353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.072 [2024-11-30 00:06:19.592443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:3026418949592973312 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.072 [2024-11-30 00:06:19.592465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.073 [2024-11-30 00:06:19.592584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-30 00:06:19.592606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.073 #42 NEW cov: 11911 ft: 15213 corp: 26/2060b lim: 120 exec/s: 42 rss: 70Mb L: 89/117 MS: 1 InsertByte- 00:07:54.332 [2024-11-30 00:06:19.632399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-30 00:06:19.632431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.332 [2024-11-30 00:06:19.632467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 
cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-30 00:06:19.632487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.332 [2024-11-30 00:06:19.632595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-30 00:06:19.632620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.332 #45 NEW cov: 11911 ft: 15226 corp: 27/2134b lim: 120 exec/s: 45 rss: 70Mb L: 74/117 MS: 3 ChangeByte-CrossOver-CrossOver- 00:07:54.332 [2024-11-30 00:06:19.672789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-30 00:06:19.672817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.332 [2024-11-30 00:06:19.672886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-30 00:06:19.672903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.332 [2024-11-30 00:06:19.672998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-30 00:06:19.673020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.332 [2024-11-30 00:06:19.673140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16927600444109941482 len:60139 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:54.332 [2024-11-30 00:06:19.673161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.332 #46 NEW cov: 11911 ft: 15236 corp: 28/2252b lim: 120 exec/s: 46 rss: 70Mb L: 118/118 MS: 1 InsertByte- 00:07:54.332 [2024-11-30 00:06:19.712371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3590979584 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.712398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.333 [2024-11-30 00:06:19.712483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.712504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.333 [2024-11-30 00:06:19.752532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3590979584 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.752559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.333 [2024-11-30 00:06:19.752629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:137438953472 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.752646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.333 #50 NEW cov: 11911 ft: 15246 corp: 29/2306b lim: 120 exec/s: 50 rss: 70Mb L: 54/118 MS: 4 CopyPart-ChangeByte-CrossOver-ChangeBit- 00:07:54.333 [2024-11-30 00:06:19.792727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 
lba:10696049282777088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.792765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.333 [2024-11-30 00:06:19.792844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.792866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.333 [2024-11-30 00:06:19.792986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.793008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.333 #51 NEW cov: 11911 ft: 15256 corp: 30/2395b lim: 120 exec/s: 51 rss: 70Mb L: 89/118 MS: 1 InsertByte- 00:07:54.333 [2024-11-30 00:06:19.832774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:184486402 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.832801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.333 [2024-11-30 00:06:19.832847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073697099775 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.832871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.333 #52 NEW cov: 11911 ft: 15262 corp: 31/2450b lim: 120 exec/s: 52 rss: 70Mb L: 55/118 MS: 1 CrossOver- 00:07:54.333 [2024-11-30 00:06:19.872926] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:11140386615177419418 len:39579 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.872953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.333 [2024-11-30 00:06:19.873014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9873376461652650981 len:39579 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-30 00:06:19.873034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.637 #53 NEW cov: 11911 ft: 15298 corp: 32/2512b lim: 120 exec/s: 53 rss: 70Mb L: 62/118 MS: 1 PersAutoDict- DE: "\202k\345\211\005G\224\000"- 00:07:54.637 [2024-11-30 00:06:19.923548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:19.923578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:19.923614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:19.923633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:19.923739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:19.923759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:19.923870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 
nsid:0 lba:16927600444109941482 len:60139 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:19.923891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.637 #59 NEW cov: 11911 ft: 15319 corp: 33/2630b lim: 120 exec/s: 59 rss: 70Mb L: 118/118 MS: 1 InsertByte- 00:07:54.637 [2024-11-30 00:06:19.963854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10696049282777088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:19.963884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:19.963919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:19.963936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:19.964035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:19.964058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:19.964166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:19.964186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:19.964300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:54.637 [2024-11-30 00:06:19.964323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:54.637 #60 NEW cov: 11911 ft: 15358 corp: 34/2750b lim: 120 exec/s: 60 rss: 70Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:07:54.637 [2024-11-30 00:06:20.003513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.003541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:20.003617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.003636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:20.003740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.003763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:20.003862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.003882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.637 #61 NEW cov: 11911 ft: 15368 corp: 35/2867b lim: 120 exec/s: 61 rss: 70Mb L: 117/120 MS: 1 InsertRepeatedBytes- 00:07:54.637 [2024-11-30 00:06:20.063713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10696049282777088 
len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.063745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:20.063805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.063824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:20.063924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12090332938240 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.063945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.637 #62 NEW cov: 11911 ft: 15406 corp: 36/2943b lim: 120 exec/s: 62 rss: 70Mb L: 76/120 MS: 1 CrossOver- 00:07:54.637 [2024-11-30 00:06:20.103765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10696049282777088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.103796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:20.103833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.103852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.637 [2024-11-30 00:06:20.103950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:13738793614438688338 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 
00:06:20.103970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.637 #63 NEW cov: 11911 ft: 15428 corp: 37/3027b lim: 120 exec/s: 63 rss: 70Mb L: 84/120 MS: 1 CMP- DE: "\001\224G\005\346R\276\252"- 00:07:54.637 [2024-11-30 00:06:20.154316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.637 [2024-11-30 00:06:20.154353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.638 [2024-11-30 00:06:20.154434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.638 [2024-11-30 00:06:20.154452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.638 [2024-11-30 00:06:20.154559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.638 [2024-11-30 00:06:20.154579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.638 [2024-11-30 00:06:20.154665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.638 [2024-11-30 00:06:20.154686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.638 #64 NEW cov: 11911 ft: 15451 corp: 38/3132b lim: 120 exec/s: 64 rss: 70Mb L: 105/120 MS: 1 InsertRepeatedBytes- 00:07:55.007 [2024-11-30 00:06:20.204082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.007 [2024-11-30 00:06:20.204113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.007 [2024-11-30 00:06:20.204149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.007 [2024-11-30 00:06:20.204169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.007 [2024-11-30 00:06:20.204260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.007 [2024-11-30 00:06:20.204281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.007 #65 NEW cov: 11911 ft: 15464 corp: 39/3221b lim: 120 exec/s: 65 rss: 70Mb L: 89/120 MS: 1 EraseBytes- 00:07:55.007 [2024-11-30 00:06:20.253950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2738188573441261568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.007 [2024-11-30 00:06:20.253982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.007 [2024-11-30 00:06:20.254033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.007 [2024-11-30 00:06:20.254056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.007 [2024-11-30 00:06:20.254163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.007 [2024-11-30 00:06:20.254185] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.007 #66 NEW cov: 11911 ft: 15475 corp: 40/3296b lim: 120 exec/s: 33 rss: 70Mb L: 75/120 MS: 1 CrossOver- 00:07:55.007 #66 DONE cov: 11911 ft: 15475 corp: 40/3296b lim: 120 exec/s: 33 rss: 70Mb 00:07:55.007 ###### Recommended dictionary. ###### 00:07:55.007 "\002\000\000\000\000\000\000\000" # Uses: 1 00:07:55.007 "\002\000\000\000" # Uses: 0 00:07:55.007 "\202k\345\211\005G\224\000" # Uses: 1 00:07:55.007 "\001\224G\005\346R\276\252" # Uses: 0 00:07:55.007 ###### End of recommended dictionary. ###### 00:07:55.007 Done 66 runs in 2 second(s) 00:07:55.007 00:06:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:07:55.007 00:06:20 -- ../common.sh@72 -- # (( i++ )) 00:07:55.007 00:06:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.007 00:06:20 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:55.007 00:06:20 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:55.007 00:06:20 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.007 00:06:20 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.007 00:06:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:55.008 00:06:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:55.008 00:06:20 -- nvmf/run.sh@29 -- # printf %02d 18 00:07:55.008 00:06:20 -- nvmf/run.sh@29 -- # port=4418 00:07:55.008 00:06:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:55.008 00:06:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:55.008 00:06:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.008 00:06:20 -- nvmf/run.sh@36 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:07:55.008 [2024-11-30 00:06:20.438416] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:55.008 [2024-11-30 00:06:20.438483] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2729332 ] 00:07:55.008 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.266 [2024-11-30 00:06:20.618610] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.266 [2024-11-30 00:06:20.682317] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.266 [2024-11-30 00:06:20.682468] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.266 [2024-11-30 00:06:20.740437] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.266 [2024-11-30 00:06:20.756801] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:55.266 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:55.266 INFO: Seed: 229456140 00:07:55.266 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:55.266 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:55.266 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:55.267 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.267 #2 INITED exec/s: 0 rss: 60Mb 00:07:55.267 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.267 This may also happen if the target rejected all inputs we tried so far 00:07:55.267 [2024-11-30 00:06:20.801850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.267 [2024-11-30 00:06:20.801878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.267 [2024-11-30 00:06:20.801917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.267 [2024-11-30 00:06:20.801930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.783 NEW_FUNC[1/670]: 0x457e58 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:55.783 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.783 #3 NEW cov: 11628 ft: 11628 corp: 2/59b lim: 100 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 InsertRepeatedBytes- 00:07:55.783 [2024-11-30 00:06:21.102577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.783 [2024-11-30 00:06:21.102613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:07:55.783 [2024-11-30 00:06:21.102651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.783 [2024-11-30 00:06:21.102664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.783 #4 NEW cov: 11741 ft: 12186 corp: 3/117b lim: 100 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 ChangeBit- 00:07:55.783 [2024-11-30 00:06:21.142641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.784 [2024-11-30 00:06:21.142667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.784 [2024-11-30 00:06:21.142702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.784 [2024-11-30 00:06:21.142715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.784 #5 NEW cov: 11747 ft: 12382 corp: 4/175b lim: 100 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 ShuffleBytes- 00:07:55.784 [2024-11-30 00:06:21.182719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.784 [2024-11-30 00:06:21.182744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.784 [2024-11-30 00:06:21.182772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.784 [2024-11-30 00:06:21.182785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.784 #6 NEW cov: 11832 ft: 12641 corp: 5/233b lim: 100 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 ChangeByte- 00:07:55.784 [2024-11-30 00:06:21.222832] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.784 [2024-11-30 00:06:21.222856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.784 [2024-11-30 00:06:21.222898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.784 [2024-11-30 00:06:21.222912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.784 #7 NEW cov: 11832 ft: 12817 corp: 6/292b lim: 100 exec/s: 0 rss: 68Mb L: 59/59 MS: 1 InsertByte- 00:07:55.784 [2024-11-30 00:06:21.262937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.784 [2024-11-30 00:06:21.262962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.784 [2024-11-30 00:06:21.262989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.784 [2024-11-30 00:06:21.263001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.784 #8 NEW cov: 11832 ft: 12864 corp: 7/351b lim: 100 exec/s: 0 rss: 68Mb L: 59/59 MS: 1 ChangeByte- 00:07:55.784 [2024-11-30 00:06:21.303259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.784 [2024-11-30 00:06:21.303283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.784 [2024-11-30 00:06:21.303333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.784 [2024-11-30 00:06:21.303345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.784 [2024-11-30 00:06:21.303390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.784 [2024-11-30 00:06:21.303402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.784 [2024-11-30 00:06:21.303449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:55.784 [2024-11-30 00:06:21.303465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.784 #9 NEW cov: 11832 ft: 13254 corp: 8/431b lim: 100 exec/s: 0 rss: 68Mb L: 80/80 MS: 1 CrossOver- 00:07:56.042 [2024-11-30 00:06:21.343286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.042 [2024-11-30 00:06:21.343311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.042 [2024-11-30 00:06:21.343362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.042 [2024-11-30 00:06:21.343374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.042 [2024-11-30 00:06:21.343433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.042 [2024-11-30 00:06:21.343447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.042 #10 NEW cov: 11832 ft: 13520 corp: 9/510b lim: 100 exec/s: 0 rss: 68Mb L: 79/80 MS: 1 InsertRepeatedBytes- 00:07:56.042 [2024-11-30 00:06:21.383413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.042 
[2024-11-30 00:06:21.383438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.042 [2024-11-30 00:06:21.383487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.042 [2024-11-30 00:06:21.383500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.042 [2024-11-30 00:06:21.383546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.042 [2024-11-30 00:06:21.383559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.042 #11 NEW cov: 11832 ft: 13569 corp: 10/589b lim: 100 exec/s: 0 rss: 68Mb L: 79/80 MS: 1 ShuffleBytes- 00:07:56.042 [2024-11-30 00:06:21.423423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.042 [2024-11-30 00:06:21.423448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.042 [2024-11-30 00:06:21.423475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.042 [2024-11-30 00:06:21.423489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.042 #12 NEW cov: 11832 ft: 13664 corp: 11/647b lim: 100 exec/s: 0 rss: 68Mb L: 58/80 MS: 1 ChangeBinInt- 00:07:56.042 [2024-11-30 00:06:21.463545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.042 [2024-11-30 00:06:21.463570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.042 [2024-11-30 
00:06:21.463602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.042 [2024-11-30 00:06:21.463615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.042 #13 NEW cov: 11832 ft: 13680 corp: 12/698b lim: 100 exec/s: 0 rss: 69Mb L: 51/80 MS: 1 EraseBytes- 00:07:56.042 [2024-11-30 00:06:21.503654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.042 [2024-11-30 00:06:21.503679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.043 [2024-11-30 00:06:21.503715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.043 [2024-11-30 00:06:21.503729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.043 #14 NEW cov: 11832 ft: 13716 corp: 13/757b lim: 100 exec/s: 0 rss: 69Mb L: 59/80 MS: 1 CrossOver- 00:07:56.043 [2024-11-30 00:06:21.543898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.043 [2024-11-30 00:06:21.543923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.043 [2024-11-30 00:06:21.543973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.043 [2024-11-30 00:06:21.543985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.043 [2024-11-30 00:06:21.544031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.043 [2024-11-30 00:06:21.544043] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.043 #15 NEW cov: 11832 ft: 13765 corp: 14/817b lim: 100 exec/s: 0 rss: 69Mb L: 60/80 MS: 1 InsertByte- 00:07:56.043 [2024-11-30 00:06:21.583864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.043 [2024-11-30 00:06:21.583891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.043 [2024-11-30 00:06:21.583926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.043 [2024-11-30 00:06:21.583939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.302 #16 NEW cov: 11832 ft: 13779 corp: 15/876b lim: 100 exec/s: 0 rss: 69Mb L: 59/80 MS: 1 ChangeBinInt- 00:07:56.302 [2024-11-30 00:06:21.623956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.302 [2024-11-30 00:06:21.623981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.624009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.302 [2024-11-30 00:06:21.624023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.302 #17 NEW cov: 11832 ft: 13910 corp: 16/935b lim: 100 exec/s: 0 rss: 69Mb L: 59/80 MS: 1 InsertByte- 00:07:56.302 [2024-11-30 00:06:21.664011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.302 [2024-11-30 00:06:21.664036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:07:56.302 #19 NEW cov: 11832 ft: 14244 corp: 17/964b lim: 100 exec/s: 0 rss: 69Mb L: 29/80 MS: 2 ShuffleBytes-CrossOver- 00:07:56.302 [2024-11-30 00:06:21.704236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.302 [2024-11-30 00:06:21.704261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.704289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.302 [2024-11-30 00:06:21.704302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.302 #20 NEW cov: 11832 ft: 14304 corp: 18/1022b lim: 100 exec/s: 0 rss: 69Mb L: 58/80 MS: 1 ChangeByte- 00:07:56.302 [2024-11-30 00:06:21.734534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.302 [2024-11-30 00:06:21.734559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.734608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.302 [2024-11-30 00:06:21.734619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.734667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.302 [2024-11-30 00:06:21.734680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.734726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:56.302 [2024-11-30 00:06:21.734739] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.302 #21 NEW cov: 11832 ft: 14313 corp: 19/1113b lim: 100 exec/s: 0 rss: 69Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:07:56.302 [2024-11-30 00:06:21.774633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.302 [2024-11-30 00:06:21.774658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.774706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.302 [2024-11-30 00:06:21.774719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.774762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.302 [2024-11-30 00:06:21.774775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.774819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:56.302 [2024-11-30 00:06:21.774832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.302 #22 NEW cov: 11832 ft: 14338 corp: 20/1193b lim: 100 exec/s: 22 rss: 69Mb L: 80/91 MS: 1 CMP- DE: "\377\002"- 00:07:56.302 [2024-11-30 00:06:21.814758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.302 [2024-11-30 00:06:21.814783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.814832] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.302 [2024-11-30 00:06:21.814844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.814889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.302 [2024-11-30 00:06:21.814902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.814949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:56.302 [2024-11-30 00:06:21.814961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.302 #23 NEW cov: 11832 ft: 14368 corp: 21/1280b lim: 100 exec/s: 23 rss: 69Mb L: 87/91 MS: 1 CrossOver- 00:07:56.302 [2024-11-30 00:06:21.854819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.302 [2024-11-30 00:06:21.854845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.854893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.302 [2024-11-30 00:06:21.854906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.302 [2024-11-30 00:06:21.854952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.302 [2024-11-30 00:06:21.854966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.560 #24 NEW cov: 11832 ft: 14372 corp: 22/1359b 
lim: 100 exec/s: 24 rss: 69Mb L: 79/91 MS: 1 ChangeBinInt- 00:07:56.560 [2024-11-30 00:06:21.894773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.560 [2024-11-30 00:06:21.894797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:21.894838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.560 [2024-11-30 00:06:21.894851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.560 #25 NEW cov: 11832 ft: 14404 corp: 23/1417b lim: 100 exec/s: 25 rss: 69Mb L: 58/91 MS: 1 PersAutoDict- DE: "\377\002"- 00:07:56.560 [2024-11-30 00:06:21.925108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.560 [2024-11-30 00:06:21.925133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:21.925181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.560 [2024-11-30 00:06:21.925193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:21.925236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.560 [2024-11-30 00:06:21.925250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:21.925294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:56.560 [2024-11-30 00:06:21.925307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.560 #26 NEW cov: 11832 ft: 14496 corp: 24/1500b lim: 100 exec/s: 26 rss: 69Mb L: 83/91 MS: 1 InsertRepeatedBytes- 00:07:56.560 [2024-11-30 00:06:21.965117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.560 [2024-11-30 00:06:21.965141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:21.965181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.560 [2024-11-30 00:06:21.965194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:21.965240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.560 [2024-11-30 00:06:21.965253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.560 #27 NEW cov: 11832 ft: 14508 corp: 25/1560b lim: 100 exec/s: 27 rss: 69Mb L: 60/91 MS: 1 CMP- DE: "\377\223G\006\361\027g\322"- 00:07:56.560 [2024-11-30 00:06:22.005105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.560 [2024-11-30 00:06:22.005130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:22.005155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.560 [2024-11-30 00:06:22.005169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.560 #28 NEW cov: 11832 ft: 14530 corp: 26/1619b lim: 100 exec/s: 28 rss: 69Mb 
L: 59/91 MS: 1 ChangeBit- 00:07:56.560 [2024-11-30 00:06:22.045415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.560 [2024-11-30 00:06:22.045440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:22.045495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.560 [2024-11-30 00:06:22.045506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:22.045552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.560 [2024-11-30 00:06:22.045566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:22.045628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:56.560 [2024-11-30 00:06:22.045642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.560 #29 NEW cov: 11832 ft: 14550 corp: 27/1702b lim: 100 exec/s: 29 rss: 69Mb L: 83/91 MS: 1 CMP- DE: "\001\000"- 00:07:56.560 [2024-11-30 00:06:22.085324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.560 [2024-11-30 00:06:22.085349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.560 [2024-11-30 00:06:22.085386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.560 [2024-11-30 00:06:22.085398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.560 #30 NEW cov: 11832 ft: 14562 corp: 28/1761b lim: 100 exec/s: 30 rss: 69Mb L: 59/91 MS: 1 ShuffleBytes- 00:07:56.818 [2024-11-30 00:06:22.125629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.818 [2024-11-30 00:06:22.125654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.818 [2024-11-30 00:06:22.125701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.818 [2024-11-30 00:06:22.125714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.818 [2024-11-30 00:06:22.125760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.818 [2024-11-30 00:06:22.125774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.818 [2024-11-30 00:06:22.125823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:56.818 [2024-11-30 00:06:22.125836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.818 #31 NEW cov: 11832 ft: 14583 corp: 29/1845b lim: 100 exec/s: 31 rss: 69Mb L: 84/91 MS: 1 InsertByte- 00:07:56.818 [2024-11-30 00:06:22.165787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.818 [2024-11-30 00:06:22.165812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.818 [2024-11-30 00:06:22.165863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.818 [2024-11-30 
00:06:22.165874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.818 [2024-11-30 00:06:22.165919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.819 [2024-11-30 00:06:22.165932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.819 [2024-11-30 00:06:22.165977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:56.819 [2024-11-30 00:06:22.165990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.819 #32 NEW cov: 11832 ft: 14591 corp: 30/1932b lim: 100 exec/s: 32 rss: 69Mb L: 87/91 MS: 1 CopyPart- 00:07:56.819 [2024-11-30 00:06:22.205741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.819 [2024-11-30 00:06:22.205765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.819 [2024-11-30 00:06:22.205813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.819 [2024-11-30 00:06:22.205826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.819 #33 NEW cov: 11832 ft: 14633 corp: 31/1991b lim: 100 exec/s: 33 rss: 69Mb L: 59/91 MS: 1 ChangeBit- 00:07:56.819 [2024-11-30 00:06:22.246065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.819 [2024-11-30 00:06:22.246091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.819 [2024-11-30 00:06:22.246143] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.819 [2024-11-30 00:06:22.246156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.819 [2024-11-30 00:06:22.246202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:56.819 [2024-11-30 00:06:22.246215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.819 [2024-11-30 00:06:22.246261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:56.819 [2024-11-30 00:06:22.246274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.819 #34 NEW cov: 11832 ft: 14644 corp: 32/2078b lim: 100 exec/s: 34 rss: 69Mb L: 87/91 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:56.819 [2024-11-30 00:06:22.285871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.819 [2024-11-30 00:06:22.285896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.819 #35 NEW cov: 11832 ft: 14663 corp: 33/2113b lim: 100 exec/s: 35 rss: 70Mb L: 35/91 MS: 1 EraseBytes- 00:07:56.819 [2024-11-30 00:06:22.326063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.819 [2024-11-30 00:06:22.326087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.819 [2024-11-30 00:06:22.326135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.819 [2024-11-30 00:06:22.326148] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.819 #36 NEW cov: 11832 ft: 14666 corp: 34/2171b lim: 100 exec/s: 36 rss: 70Mb L: 58/91 MS: 1 ShuffleBytes- 00:07:56.819 [2024-11-30 00:06:22.356107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:56.819 [2024-11-30 00:06:22.356131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.819 [2024-11-30 00:06:22.356179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:56.819 [2024-11-30 00:06:22.356191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.077 #37 NEW cov: 11832 ft: 14681 corp: 35/2230b lim: 100 exec/s: 37 rss: 70Mb L: 59/91 MS: 1 ChangeByte- 00:07:57.077 [2024-11-30 00:06:22.396456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.077 [2024-11-30 00:06:22.396481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.396535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.077 [2024-11-30 00:06:22.396547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.396593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:57.077 [2024-11-30 00:06:22.396610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.396658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE 
ZEROES (08) sqid:1 cid:3 nsid:0 00:07:57.077 [2024-11-30 00:06:22.396670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.077 #38 NEW cov: 11832 ft: 14696 corp: 36/2313b lim: 100 exec/s: 38 rss: 70Mb L: 83/91 MS: 1 ChangeByte- 00:07:57.077 [2024-11-30 00:06:22.436483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.077 [2024-11-30 00:06:22.436508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.436550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.077 [2024-11-30 00:06:22.436563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.436612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:57.077 [2024-11-30 00:06:22.436625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.077 #39 NEW cov: 11832 ft: 14729 corp: 37/2392b lim: 100 exec/s: 39 rss: 70Mb L: 79/91 MS: 1 CrossOver- 00:07:57.077 [2024-11-30 00:06:22.476722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.077 [2024-11-30 00:06:22.476747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.476796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.077 [2024-11-30 00:06:22.476808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:07:57.077 [2024-11-30 00:06:22.476854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:57.077 [2024-11-30 00:06:22.476867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.476929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:57.077 [2024-11-30 00:06:22.476942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.077 #40 NEW cov: 11832 ft: 14747 corp: 38/2475b lim: 100 exec/s: 40 rss: 70Mb L: 83/91 MS: 1 ChangeBit- 00:07:57.077 [2024-11-30 00:06:22.516618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.077 [2024-11-30 00:06:22.516643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.516694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.077 [2024-11-30 00:06:22.516706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.077 #41 NEW cov: 11832 ft: 14814 corp: 39/2534b lim: 100 exec/s: 41 rss: 70Mb L: 59/91 MS: 1 CMP- DE: "\007\000"- 00:07:57.077 [2024-11-30 00:06:22.556963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.077 [2024-11-30 00:06:22.556988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.557043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.077 [2024-11-30 00:06:22.557055] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.557101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:57.077 [2024-11-30 00:06:22.557115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.557160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:57.077 [2024-11-30 00:06:22.557174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.077 #42 NEW cov: 11832 ft: 14815 corp: 40/2625b lim: 100 exec/s: 42 rss: 70Mb L: 91/91 MS: 1 PersAutoDict- DE: "\377\223G\006\361\027g\322"- 00:07:57.077 [2024-11-30 00:06:22.596841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.077 [2024-11-30 00:06:22.596865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.077 [2024-11-30 00:06:22.596914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.077 [2024-11-30 00:06:22.596927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.077 #43 NEW cov: 11832 ft: 14845 corp: 41/2683b lim: 100 exec/s: 43 rss: 70Mb L: 58/91 MS: 1 CopyPart- 00:07:57.336 [2024-11-30 00:06:22.637172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.336 [2024-11-30 00:06:22.637197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 
00:06:22.637247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.336 [2024-11-30 00:06:22.637259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.637304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:57.336 [2024-11-30 00:06:22.637317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.637366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:57.336 [2024-11-30 00:06:22.637379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.336 #44 NEW cov: 11832 ft: 14865 corp: 42/2774b lim: 100 exec/s: 44 rss: 70Mb L: 91/91 MS: 1 CopyPart- 00:07:57.336 [2024-11-30 00:06:22.677031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.336 [2024-11-30 00:06:22.677055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.677100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.336 [2024-11-30 00:06:22.677114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.336 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.336 #45 NEW cov: 11855 ft: 14899 corp: 43/2833b lim: 100 exec/s: 45 rss: 70Mb L: 59/91 MS: 1 PersAutoDict- DE: "\377\223G\006\361\027g\322"- 00:07:57.336 [2024-11-30 00:06:22.717152] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.336 [2024-11-30 00:06:22.717177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.717212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.336 [2024-11-30 00:06:22.717224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.336 #46 NEW cov: 11855 ft: 14928 corp: 44/2892b lim: 100 exec/s: 46 rss: 70Mb L: 59/91 MS: 1 ChangeBinInt- 00:07:57.336 [2024-11-30 00:06:22.757510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.336 [2024-11-30 00:06:22.757534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.757584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.336 [2024-11-30 00:06:22.757595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.757643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:57.336 [2024-11-30 00:06:22.757657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.757718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:57.336 [2024-11-30 00:06:22.757731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.797607] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.336 [2024-11-30 00:06:22.797632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.797682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.336 [2024-11-30 00:06:22.797693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.797738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:57.336 [2024-11-30 00:06:22.797751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.336 [2024-11-30 00:06:22.797796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:57.336 [2024-11-30 00:06:22.797809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.336 #48 NEW cov: 11855 ft: 14937 corp: 45/2982b lim: 100 exec/s: 24 rss: 70Mb L: 90/91 MS: 2 InsertRepeatedBytes-ChangeByte- 00:07:57.336 #48 DONE cov: 11855 ft: 14937 corp: 45/2982b lim: 100 exec/s: 24 rss: 70Mb 00:07:57.336 ###### Recommended dictionary. ###### 00:07:57.336 "\377\002" # Uses: 1 00:07:57.336 "\377\223G\006\361\027g\322" # Uses: 2 00:07:57.336 "\001\000" # Uses: 1 00:07:57.336 "\007\000" # Uses: 0 00:07:57.336 ###### End of recommended dictionary. 
###### 00:07:57.336 Done 48 runs in 2 second(s)
00:07:57.595 00:06:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf
00:07:57.595 00:06:22 -- ../common.sh@72 -- # (( i++ ))
00:07:57.595 00:06:22 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:57.595 00:06:22 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
00:07:57.595 00:06:22 -- nvmf/run.sh@23 -- # local fuzzer_type=19
00:07:57.595 00:06:22 -- nvmf/run.sh@24 -- # local timen=1
00:07:57.595 00:06:22 -- nvmf/run.sh@25 -- # local core=0x1
00:07:57.595 00:06:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:07:57.595 00:06:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
00:07:57.595 00:06:22 -- nvmf/run.sh@29 -- # printf %02d 19
00:07:57.595 00:06:22 -- nvmf/run.sh@29 -- # port=4419
00:07:57.595 00:06:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:07:57.595 00:06:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
00:07:57.595 00:06:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:57.595 00:06:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock
00:07:57.595 [2024-11-30 00:06:22.976002] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:57.595 [2024-11-30 00:06:22.976073] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2729688 ]
00:07:57.853 EAL: No free 2048 kB hugepages reported on node 1
00:07:57.853 [2024-11-30 00:06:23.158553] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:57.853 [2024-11-30 00:06:23.223741] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:57.853 [2024-11-30 00:06:23.223897] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:57.853 [2024-11-30 00:06:23.282005] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:57.853 [2024-11-30 00:06:23.298375] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 ***
00:07:57.853 INFO: Running with entropic power schedule (0xFF, 100).
00:07:57.853 INFO: Seed: 2771443944
00:07:57.853 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:57.853 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:57.853 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:07:57.853 INFO: A corpus is not provided, starting from an empty corpus
00:07:57.853 #2 INITED exec/s: 0 rss: 60Mb
00:07:57.853 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:57.853 This may also happen if the target rejected all inputs we tried so far 00:07:57.853 [2024-11-30 00:06:23.364106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533480228126542 len:20047 00:07:57.853 [2024-11-30 00:06:23.364147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.420 NEW_FUNC[1/670]: 0x45ae18 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:07:58.420 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.420 #5 NEW cov: 11589 ft: 11606 corp: 2/19b lim: 50 exec/s: 0 rss: 68Mb L: 18/18 MS: 3 InsertByte-InsertRepeatedBytes-InsertRepeatedBytes- 00:07:58.420 [2024-11-30 00:06:23.704911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533481113144910 len:19979 00:07:58.420 [2024-11-30 00:06:23.704954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.420 #13 NEW cov: 11719 ft: 12010 corp: 3/31b lim: 50 exec/s: 0 rss: 68Mb L: 12/18 MS: 3 CopyPart-InsertByte-CrossOver- 00:07:58.420 [2024-11-30 00:06:23.745288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6872316418191220575 len:24416 00:07:58.420 [2024-11-30 00:06:23.745320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.420 [2024-11-30 00:06:23.745358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6872316419617283935 len:24416 00:07:58.420 [2024-11-30 00:06:23.745378] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.420 [2024-11-30 00:06:23.745478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6872316419617283935 len:24416 00:07:58.420 [2024-11-30 00:06:23.745502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.420 #14 NEW cov: 11725 ft: 12614 corp: 4/67b lim: 50 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:58.420 [2024-11-30 00:06:23.785162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533956969496398 len:48574 00:07:58.420 [2024-11-30 00:06:23.785192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.420 [2024-11-30 00:06:23.785265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13640926421449555389 len:20047 00:07:58.420 [2024-11-30 00:06:23.785288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.420 #15 NEW cov: 11810 ft: 13133 corp: 5/93b lim: 50 exec/s: 0 rss: 68Mb L: 26/36 MS: 1 InsertRepeatedBytes- 00:07:58.420 [2024-11-30 00:06:23.835252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533197645303374 len:11 00:07:58.420 [2024-11-30 00:06:23.835280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.420 #16 NEW cov: 11810 ft: 13327 corp: 6/105b lim: 50 exec/s: 0 rss: 68Mb L: 12/36 MS: 1 ChangeBinInt- 00:07:58.420 [2024-11-30 00:06:23.875671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 
nsid:0 lba:0 len:1 00:07:58.420 [2024-11-30 00:06:23.875702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.420 [2024-11-30 00:06:23.875731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:58.420 [2024-11-30 00:06:23.875747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.420 [2024-11-30 00:06:23.875846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:58.420 [2024-11-30 00:06:23.875866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.420 #17 NEW cov: 11810 ft: 13498 corp: 7/139b lim: 50 exec/s: 0 rss: 68Mb L: 34/36 MS: 1 InsertRepeatedBytes- 00:07:58.420 [2024-11-30 00:06:23.915473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533481113144910 len:1 00:07:58.420 [2024-11-30 00:06:23.915500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.420 #18 NEW cov: 11810 ft: 13596 corp: 8/151b lim: 50 exec/s: 0 rss: 68Mb L: 12/36 MS: 1 ChangeBinInt- 00:07:58.420 [2024-11-30 00:06:23.956051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642458993495330382 len:15164 00:07:58.420 [2024-11-30 00:06:23.956081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.420 [2024-11-30 00:06:23.956163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 00:07:58.420 [2024-11-30 00:06:23.956184] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.420 [2024-11-30 00:06:23.956294] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4268070197446523707 len:15164 00:07:58.421 [2024-11-30 00:06:23.956314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.421 [2024-11-30 00:06:23.956411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4268070197446523707 len:15164 00:07:58.421 [2024-11-30 00:06:23.956434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.681 #20 NEW cov: 11810 ft: 13841 corp: 9/200b lim: 50 exec/s: 0 rss: 68Mb L: 49/49 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:58.681 [2024-11-30 00:06:23.995932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4702111234474983745 len:16706 00:07:58.681 [2024-11-30 00:06:23.995961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.681 [2024-11-30 00:06:23.995999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4702111234474983745 len:16706 00:07:58.681 [2024-11-30 00:06:23.996017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.681 [2024-11-30 00:06:23.996105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4702111234474983745 len:16706 00:07:58.681 [2024-11-30 00:06:23.996123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:58.681 #22 NEW cov: 11810 ft: 13894 corp: 10/235b lim: 50 exec/s: 0 rss: 68Mb L: 35/49 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:58.681 [2024-11-30 00:06:24.036205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642458993495330382 len:15164 00:07:58.681 [2024-11-30 00:06:24.036234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.681 [2024-11-30 00:06:24.036312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 00:07:58.681 [2024-11-30 00:06:24.036330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.681 [2024-11-30 00:06:24.036424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4268070197446523707 len:15164 00:07:58.681 [2024-11-30 00:06:24.036442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.681 [2024-11-30 00:06:24.036551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4268070197446523707 len:15164 00:07:58.681 [2024-11-30 00:06:24.036573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.681 #23 NEW cov: 11810 ft: 13977 corp: 11/284b lim: 50 exec/s: 0 rss: 68Mb L: 49/49 MS: 1 ChangeBit- 00:07:58.681 [2024-11-30 00:06:24.076050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:58.681 [2024-11-30 00:06:24.076080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:07:58.681 [2024-11-30 00:06:24.076134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:58.681 [2024-11-30 00:06:24.076157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.681 #24 NEW cov: 11810 ft: 13995 corp: 12/310b lim: 50 exec/s: 0 rss: 68Mb L: 26/49 MS: 1 EraseBytes- 00:07:58.681 [2024-11-30 00:06:24.116483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:58.681 [2024-11-30 00:06:24.116510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.681 [2024-11-30 00:06:24.116589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:58.681 [2024-11-30 00:06:24.116611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.681 [2024-11-30 00:06:24.116690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:58.681 [2024-11-30 00:06:24.116710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.681 [2024-11-30 00:06:24.116813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:58.681 [2024-11-30 00:06:24.116836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.681 #25 NEW cov: 11810 ft: 14080 corp: 13/355b lim: 50 exec/s: 0 rss: 68Mb L: 45/49 MS: 1 CopyPart- 00:07:58.681 [2024-11-30 00:06:24.156188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:5642533481113144910 len:65281 00:07:58.681 [2024-11-30 00:06:24.156215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.681 #26 NEW cov: 11810 ft: 14119 corp: 14/368b lim: 50 exec/s: 0 rss: 68Mb L: 13/49 MS: 1 InsertByte- 00:07:58.681 [2024-11-30 00:06:24.206368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:58.681 [2024-11-30 00:06:24.206394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.681 #27 NEW cov: 11810 ft: 14128 corp: 15/380b lim: 50 exec/s: 0 rss: 68Mb L: 12/49 MS: 1 CrossOver- 00:07:58.940 [2024-11-30 00:06:24.246417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533484339547710 len:19979 00:07:58.940 [2024-11-30 00:06:24.246445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.940 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.940 #28 NEW cov: 11833 ft: 14167 corp: 16/392b lim: 50 exec/s: 0 rss: 68Mb L: 12/49 MS: 1 ShuffleBytes- 00:07:58.940 [2024-11-30 00:06:24.286941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:58.940 [2024-11-30 00:06:24.286969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.940 [2024-11-30 00:06:24.287030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:58.940 [2024-11-30 00:06:24.287048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:07:58.940 [2024-11-30 00:06:24.287147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:35465847065542656 len:1 00:07:58.940 [2024-11-30 00:06:24.287168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.940 [2024-11-30 00:06:24.287282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:58.940 [2024-11-30 00:06:24.287304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.940 #29 NEW cov: 11833 ft: 14178 corp: 17/437b lim: 50 exec/s: 0 rss: 69Mb L: 45/49 MS: 1 ChangeByte- 00:07:58.940 [2024-11-30 00:06:24.326944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533949264580174 len:48060 00:07:58.940 [2024-11-30 00:06:24.326972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.940 [2024-11-30 00:06:24.327010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13527612320720337851 len:48060 00:07:58.940 [2024-11-30 00:06:24.327028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.940 [2024-11-30 00:06:24.327133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13527611852568902587 len:19979 00:07:58.940 [2024-11-30 00:06:24.327154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.940 #30 NEW cov: 11833 ft: 14213 corp: 18/469b lim: 50 exec/s: 30 rss: 69Mb L: 32/49 MS: 1 InsertRepeatedBytes- 00:07:58.940 [2024-11-30 
00:06:24.367058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6872316418191220575 len:24416 00:07:58.940 [2024-11-30 00:06:24.367086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.940 [2024-11-30 00:06:24.367140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6872316419617283935 len:24416 00:07:58.940 [2024-11-30 00:06:24.367160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.940 [2024-11-30 00:06:24.367271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6872316419617283935 len:24416 00:07:58.940 [2024-11-30 00:06:24.367292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.940 #36 NEW cov: 11833 ft: 14295 corp: 19/505b lim: 50 exec/s: 36 rss: 69Mb L: 36/49 MS: 1 ShuffleBytes- 00:07:58.940 [2024-11-30 00:06:24.407026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:58.940 [2024-11-30 00:06:24.407054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.940 [2024-11-30 00:06:24.407109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1125899906842624 len:1 00:07:58.940 [2024-11-30 00:06:24.407126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.940 #37 NEW cov: 11833 ft: 14310 corp: 20/531b lim: 50 exec/s: 37 rss: 69Mb L: 26/49 MS: 1 ChangeBit- 00:07:58.940 [2024-11-30 00:06:24.447061] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533197645303630 len:11 00:07:58.940 [2024-11-30 00:06:24.447087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.940 #38 NEW cov: 11833 ft: 14317 corp: 21/543b lim: 50 exec/s: 38 rss: 69Mb L: 12/49 MS: 1 ChangeBit- 00:07:58.940 [2024-11-30 00:06:24.487149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533198366723918 len:11 00:07:58.940 [2024-11-30 00:06:24.487175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.201 #39 NEW cov: 11833 ft: 14352 corp: 22/555b lim: 50 exec/s: 39 rss: 69Mb L: 12/49 MS: 1 ChangeByte- 00:07:59.201 [2024-11-30 00:06:24.527780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533949264580174 len:48060 00:07:59.201 [2024-11-30 00:06:24.527809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.527869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13527612320720337851 len:48060 00:07:59.201 [2024-11-30 00:06:24.527890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.527998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13527612320720337851 len:48060 00:07:59.201 [2024-11-30 00:06:24.528023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.528131] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13527612320720337851 len:48060 00:07:59.201 [2024-11-30 00:06:24.528154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.201 #40 NEW cov: 11833 ft: 14363 corp: 23/601b lim: 50 exec/s: 40 rss: 69Mb L: 46/49 MS: 1 CopyPart- 00:07:59.201 [2024-11-30 00:06:24.567950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642458993495330382 len:15164 00:07:59.201 [2024-11-30 00:06:24.567980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.568061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 00:07:59.201 [2024-11-30 00:06:24.568082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.568180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4268070197446523707 len:15164 00:07:59.201 [2024-11-30 00:06:24.568202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.568307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4268070200298650427 len:15164 00:07:59.201 [2024-11-30 00:06:24.568330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.201 #41 NEW cov: 11833 ft: 14387 corp: 24/650b lim: 50 exec/s: 41 rss: 69Mb L: 49/49 MS: 1 ChangeByte- 00:07:59.201 [2024-11-30 00:06:24.607850] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:59.201 [2024-11-30 00:06:24.607881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.607912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:52776558133248 len:1 00:07:59.201 [2024-11-30 00:06:24.607931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.608023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:59.201 [2024-11-30 00:06:24.608046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.201 #42 NEW cov: 11833 ft: 14401 corp: 25/685b lim: 50 exec/s: 42 rss: 69Mb L: 35/49 MS: 1 InsertByte- 00:07:59.201 [2024-11-30 00:06:24.648121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642458993495330382 len:15164 00:07:59.201 [2024-11-30 00:06:24.648150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.648217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 00:07:59.201 [2024-11-30 00:06:24.648236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.648341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4268070197446523707 len:15164 00:07:59.201 [2024-11-30 00:06:24.648362] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.648474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:16672149204959744 len:15164 00:07:59.201 [2024-11-30 00:06:24.648495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.201 #43 NEW cov: 11833 ft: 14416 corp: 26/734b lim: 50 exec/s: 43 rss: 69Mb L: 49/49 MS: 1 CMP- DE: "\001\002\000\000"- 00:07:59.201 [2024-11-30 00:06:24.687829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533956969496398 len:48574 00:07:59.201 [2024-11-30 00:06:24.687860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.687898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13640856052705377725 len:20047 00:07:59.201 [2024-11-30 00:06:24.687912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.201 #44 NEW cov: 11833 ft: 14444 corp: 27/760b lim: 50 exec/s: 44 rss: 69Mb L: 26/49 MS: 1 ChangeBit- 00:07:59.201 [2024-11-30 00:06:24.728214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4702111234474983745 len:24898 00:07:59.201 [2024-11-30 00:06:24.728245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.728278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4702111234474983745 len:16706 00:07:59.201 [2024-11-30 00:06:24.728294] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.201 [2024-11-30 00:06:24.728392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4702111234474983745 len:16706 00:07:59.201 [2024-11-30 00:06:24.728410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.201 #45 NEW cov: 11833 ft: 14458 corp: 28/795b lim: 50 exec/s: 45 rss: 69Mb L: 35/49 MS: 1 ChangeBit- 00:07:59.462 [2024-11-30 00:06:24.768164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6872316418191220575 len:24416 00:07:59.462 [2024-11-30 00:06:24.768196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.768271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6872316419617283935 len:24416 00:07:59.462 [2024-11-30 00:06:24.768295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.462 #46 NEW cov: 11833 ft: 14478 corp: 29/824b lim: 50 exec/s: 46 rss: 69Mb L: 29/49 MS: 1 EraseBytes- 00:07:59.462 [2024-11-30 00:06:24.808375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4702111234474983745 len:24898 00:07:59.462 [2024-11-30 00:06:24.808407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.808498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4702111234474983745 len:16760 00:07:59.462 [2024-11-30 00:06:24.808518] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.462 #47 NEW cov: 11833 ft: 14491 corp: 30/844b lim: 50 exec/s: 47 rss: 70Mb L: 20/49 MS: 1 EraseBytes- 00:07:59.462 [2024-11-30 00:06:24.858360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533484351192862 len:1 00:07:59.462 [2024-11-30 00:06:24.858388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 #48 NEW cov: 11833 ft: 14506 corp: 31/856b lim: 50 exec/s: 48 rss: 70Mb L: 12/49 MS: 1 CMP- DE: "\377\377\377\036"- 00:07:59.462 [2024-11-30 00:06:24.898947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:59.462 [2024-11-30 00:06:24.898978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.899018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:59.462 [2024-11-30 00:06:24.899036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.899128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:59.462 [2024-11-30 00:06:24.899148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.899253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:59.462 [2024-11-30 00:06:24.899274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:07:59.462 #49 NEW cov: 11833 ft: 14513 corp: 32/901b lim: 50 exec/s: 49 rss: 70Mb L: 45/49 MS: 1 ChangeByte- 00:07:59.462 [2024-11-30 00:06:24.939176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6872316418191220575 len:24416 00:07:59.462 [2024-11-30 00:06:24.939209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.939292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 00:07:59.462 [2024-11-30 00:06:24.939308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.939404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6872316419010935611 len:24416 00:07:59.462 [2024-11-30 00:06:24.939427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.939532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:6872316419617283935 len:24416 00:07:59.462 [2024-11-30 00:06:24.939549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.939644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:6872316419617283935 len:24416 00:07:59.462 [2024-11-30 00:06:24.939665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:59.462 #50 NEW cov: 11833 ft: 14590 corp: 33/951b lim: 50 exec/s: 50 rss: 70Mb L: 50/50 MS: 1 CrossOver- 
00:07:59.462 [2024-11-30 00:06:24.979145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642458993495330382 len:15164 00:07:59.462 [2024-11-30 00:06:24.979176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.979254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 00:07:59.462 [2024-11-30 00:06:24.979274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.979372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4268070197446523707 len:15164 00:07:59.462 [2024-11-30 00:06:24.979392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.462 [2024-11-30 00:06:24.979496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:16672149208775550 len:15164 00:07:59.462 [2024-11-30 00:06:24.979519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.462 #51 NEW cov: 11833 ft: 14608 corp: 34/1000b lim: 50 exec/s: 51 rss: 70Mb L: 49/50 MS: 1 CMP- DE: "~\000"- 00:07:59.720 [2024-11-30 00:06:25.019130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4539433047384084046 len:20047 00:07:59.720 [2024-11-30 00:06:25.019162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.720 [2024-11-30 00:06:25.019193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 
nsid:0 lba:4268211483874114363 len:48060 00:07:59.720 [2024-11-30 00:06:25.019212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.720 [2024-11-30 00:06:25.019303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13527611852568902587 len:19979 00:07:59.720 [2024-11-30 00:06:25.019324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.720 #52 NEW cov: 11833 ft: 14661 corp: 35/1032b lim: 50 exec/s: 52 rss: 70Mb L: 32/50 MS: 1 CrossOver- 00:07:59.720 [2024-11-30 00:06:25.059049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533484350144286 len:1 00:07:59.720 [2024-11-30 00:06:25.059077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.720 #53 NEW cov: 11833 ft: 14669 corp: 36/1044b lim: 50 exec/s: 53 rss: 70Mb L: 12/50 MS: 1 ChangeBit- 00:07:59.720 [2024-11-30 00:06:25.099690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1094778880 len:1 00:07:59.720 [2024-11-30 00:06:25.099722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.720 [2024-11-30 00:06:25.099794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4702111233380188225 len:16706 00:07:59.720 [2024-11-30 00:06:25.099810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.099909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4702111234474983745 len:16706 
00:07:59.721 [2024-11-30 00:06:25.099930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.100033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4702111234474983745 len:16706 00:07:59.721 [2024-11-30 00:06:25.100056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.721 #54 NEW cov: 11833 ft: 14696 corp: 37/1090b lim: 50 exec/s: 54 rss: 70Mb L: 46/50 MS: 1 InsertRepeatedBytes- 00:07:59.721 [2024-11-30 00:06:25.139340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533956969496398 len:48573 00:07:59.721 [2024-11-30 00:06:25.139371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.139413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13640856052705377725 len:20047 00:07:59.721 [2024-11-30 00:06:25.139432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.721 #55 NEW cov: 11833 ft: 14701 corp: 38/1116b lim: 50 exec/s: 55 rss: 70Mb L: 26/50 MS: 1 ChangeBit- 00:07:59.721 [2024-11-30 00:06:25.179854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533949264580174 len:48060 00:07:59.721 [2024-11-30 00:06:25.179885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.179967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13527612320720337851 len:48060 00:07:59.721 
[2024-11-30 00:06:25.179990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.180092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13527612320720337851 len:48060 00:07:59.721 [2024-11-30 00:06:25.180113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.180212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13527611852568902587 len:19979 00:07:59.721 [2024-11-30 00:06:25.180230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.721 #56 NEW cov: 11833 ft: 14707 corp: 39/1158b lim: 50 exec/s: 56 rss: 70Mb L: 42/50 MS: 1 EraseBytes- 00:07:59.721 [2024-11-30 00:06:25.229995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4702110955302109505 len:1 00:07:59.721 [2024-11-30 00:06:25.230025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.230077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18402805286961152 len:16706 00:07:59.721 [2024-11-30 00:06:25.230095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.230180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4702111234474983745 len:16706 00:07:59.721 [2024-11-30 00:06:25.230204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.230309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4702111234474983745 len:16706 00:07:59.721 [2024-11-30 00:06:25.230328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.721 #57 NEW cov: 11833 ft: 14717 corp: 40/1201b lim: 50 exec/s: 57 rss: 70Mb L: 43/50 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:59.721 [2024-11-30 00:06:25.269951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642728094671257166 len:20047 00:07:59.721 [2024-11-30 00:06:25.269981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.270010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268211483874114363 len:48060 00:07:59.721 [2024-11-30 00:06:25.270041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.721 [2024-11-30 00:06:25.270126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13527611852568902587 len:19979 00:07:59.721 [2024-11-30 00:06:25.270143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.980 #58 NEW cov: 11833 ft: 14723 corp: 41/1233b lim: 50 exec/s: 58 rss: 70Mb L: 32/50 MS: 1 ShuffleBytes- 00:07:59.980 [2024-11-30 00:06:25.310254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642533949264580174 len:48060 00:07:59.980 [2024-11-30 00:06:25.310283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.980 [2024-11-30 00:06:25.310356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13527612320720337851 len:48060 00:07:59.980 [2024-11-30 00:06:25.310372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.980 [2024-11-30 00:06:25.310474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13474770088242166715 len:1 00:07:59.980 [2024-11-30 00:06:25.310495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.980 [2024-11-30 00:06:25.310592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13527612317570743227 len:48060 00:07:59.980 [2024-11-30 00:06:25.310616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.980 #59 NEW cov: 11833 ft: 14726 corp: 42/1282b lim: 50 exec/s: 59 rss: 70Mb L: 49/50 MS: 1 InsertRepeatedBytes- 00:07:59.980 [2024-11-30 00:06:25.350340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5642458993495330382 len:15164 00:07:59.980 [2024-11-30 00:06:25.350369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.980 [2024-11-30 00:06:25.350431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 00:07:59.980 [2024-11-30 00:06:25.350450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.980 [2024-11-30 00:06:25.350545] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4268070197446523707 len:15164 00:07:59.980 [2024-11-30 00:06:25.350565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.980 [2024-11-30 00:06:25.350660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:16672149208775550 len:15164 00:07:59.980 [2024-11-30 00:06:25.350678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.980 #60 NEW cov: 11833 ft: 14735 corp: 43/1331b lim: 50 exec/s: 30 rss: 70Mb L: 49/50 MS: 1 CopyPart- 00:07:59.980 #60 DONE cov: 11833 ft: 14735 corp: 43/1331b lim: 50 exec/s: 30 rss: 70Mb 00:07:59.980 ###### Recommended dictionary. ###### 00:07:59.980 "\001\002\000\000" # Uses: 0 00:07:59.980 "\377\377\377\036" # Uses: 0 00:07:59.980 "~\000" # Uses: 0 00:07:59.980 "\000\000\000\000\000\000\000\000" # Uses: 0 00:07:59.980 ###### End of recommended dictionary. 
###### 00:07:59.980 Done 60 runs in 2 second(s) 00:07:59.980 00:06:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:07:59.980 00:06:25 -- ../common.sh@72 -- # (( i++ )) 00:07:59.980 00:06:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.980 00:06:25 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:07:59.980 00:06:25 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:07:59.980 00:06:25 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.980 00:06:25 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.980 00:06:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:59.980 00:06:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:07:59.980 00:06:25 -- nvmf/run.sh@29 -- # printf %02d 20 00:07:59.980 00:06:25 -- nvmf/run.sh@29 -- # port=4420 00:07:59.980 00:06:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:59.980 00:06:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:07:59.980 00:06:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.980 00:06:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:00.240 [2024-11-30 00:06:25.537703] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:00.240 [2024-11-30 00:06:25.537779] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2730232 ] 00:08:00.240 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.240 [2024-11-30 00:06:25.715379] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.240 [2024-11-30 00:06:25.778193] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.240 [2024-11-30 00:06:25.778346] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.498 [2024-11-30 00:06:25.836745] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.498 [2024-11-30 00:06:25.853067] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:00.498 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.498 INFO: Seed: 1031471773 00:08:00.498 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:00.499 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:00.499 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:00.499 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.499 #2 INITED exec/s: 0 rss: 60Mb 00:08:00.499 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:00.499 This may also happen if the target rejected all inputs we tried so far 00:08:00.499 [2024-11-30 00:06:25.897835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.499 [2024-11-30 00:06:25.897870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.499 [2024-11-30 00:06:25.897906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.499 [2024-11-30 00:06:25.897924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.499 [2024-11-30 00:06:25.897955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:00.499 [2024-11-30 00:06:25.897972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.499 [2024-11-30 00:06:25.898000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:00.499 [2024-11-30 00:06:25.898017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.758 NEW_FUNC[1/672]: 0x45c9d8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:00.758 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:00.758 #12 NEW cov: 11664 ft: 11665 corp: 2/73b lim: 90 exec/s: 0 rss: 68Mb L: 72/72 MS: 5 InsertByte-InsertByte-ChangeByte-CrossOver-InsertRepeatedBytes- 00:08:00.758 [2024-11-30 00:06:26.218478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 
cid:0 nsid:0 00:08:00.758 [2024-11-30 00:06:26.218517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.758 [2024-11-30 00:06:26.218555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.758 [2024-11-30 00:06:26.218574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.758 #13 NEW cov: 11777 ft: 12627 corp: 3/112b lim: 90 exec/s: 0 rss: 68Mb L: 39/72 MS: 1 EraseBytes- 00:08:00.758 [2024-11-30 00:06:26.288649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.758 [2024-11-30 00:06:26.288680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.758 [2024-11-30 00:06:26.288717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.758 [2024-11-30 00:06:26.288734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.758 [2024-11-30 00:06:26.288762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:00.758 [2024-11-30 00:06:26.288777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.758 [2024-11-30 00:06:26.288804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:00.758 [2024-11-30 00:06:26.288819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.017 #14 NEW cov: 11783 ft: 12900 corp: 4/184b lim: 90 exec/s: 0 rss: 68Mb L: 72/72 
MS: 1 ChangeByte- 00:08:01.017 [2024-11-30 00:06:26.348788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.017 [2024-11-30 00:06:26.348818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.348851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.017 [2024-11-30 00:06:26.348868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.348897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.017 [2024-11-30 00:06:26.348913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.348940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:01.017 [2024-11-30 00:06:26.348955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.017 #15 NEW cov: 11868 ft: 13176 corp: 5/270b lim: 90 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 CopyPart- 00:08:01.017 [2024-11-30 00:06:26.418942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.017 [2024-11-30 00:06:26.418971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.419003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.017 [2024-11-30 00:06:26.419021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.419049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.017 [2024-11-30 00:06:26.419064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.017 #16 NEW cov: 11868 ft: 13533 corp: 6/332b lim: 90 exec/s: 0 rss: 68Mb L: 62/86 MS: 1 InsertRepeatedBytes- 00:08:01.017 [2024-11-30 00:06:26.468961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.017 [2024-11-30 00:06:26.468991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.469024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.017 [2024-11-30 00:06:26.469057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.017 #17 NEW cov: 11868 ft: 13657 corp: 7/373b lim: 90 exec/s: 0 rss: 68Mb L: 41/86 MS: 1 CMP- DE: "\013\000"- 00:08:01.017 [2024-11-30 00:06:26.539327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.017 [2024-11-30 00:06:26.539363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.539397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.017 [2024-11-30 00:06:26.539415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.539444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.017 [2024-11-30 00:06:26.539460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.017 [2024-11-30 00:06:26.539488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:01.017 [2024-11-30 00:06:26.539504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.276 #18 NEW cov: 11868 ft: 13764 corp: 8/459b lim: 90 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 ChangeBit- 00:08:01.276 [2024-11-30 00:06:26.609474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.276 [2024-11-30 00:06:26.609504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.609536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.276 [2024-11-30 00:06:26.609553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.609582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.276 [2024-11-30 00:06:26.609604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.609633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:01.276 [2024-11-30 00:06:26.609648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.276 #19 NEW cov: 11868 ft: 13809 corp: 9/545b lim: 90 
exec/s: 0 rss: 68Mb L: 86/86 MS: 1 ChangeBinInt- 00:08:01.276 [2024-11-30 00:06:26.669649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.276 [2024-11-30 00:06:26.669678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.669711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.276 [2024-11-30 00:06:26.669728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.669756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.276 [2024-11-30 00:06:26.669771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.669798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:01.276 [2024-11-30 00:06:26.669813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.276 #25 NEW cov: 11868 ft: 13865 corp: 10/618b lim: 90 exec/s: 0 rss: 68Mb L: 73/86 MS: 1 InsertByte- 00:08:01.276 [2024-11-30 00:06:26.719807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.276 [2024-11-30 00:06:26.719837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.719889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.276 [2024-11-30 00:06:26.719907] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.719936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.276 [2024-11-30 00:06:26.719952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.719980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:01.276 [2024-11-30 00:06:26.719996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.720025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:01.276 [2024-11-30 00:06:26.720041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:01.276 #26 NEW cov: 11868 ft: 13922 corp: 11/708b lim: 90 exec/s: 0 rss: 68Mb L: 90/90 MS: 1 CopyPart- 00:08:01.276 [2024-11-30 00:06:26.789973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.276 [2024-11-30 00:06:26.790002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.276 [2024-11-30 00:06:26.790034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.277 [2024-11-30 00:06:26.790051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.277 [2024-11-30 00:06:26.790079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.277 [2024-11-30 00:06:26.790094] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.277 [2024-11-30 00:06:26.790121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:01.277 [2024-11-30 00:06:26.790136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.277 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.277 #27 NEW cov: 11885 ft: 13965 corp: 12/782b lim: 90 exec/s: 0 rss: 69Mb L: 74/90 MS: 1 PersAutoDict- DE: "\013\000"- 00:08:01.535 [2024-11-30 00:06:26.839986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.535 [2024-11-30 00:06:26.840016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.535 [2024-11-30 00:06:26.840049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.535 [2024-11-30 00:06:26.840066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.535 #28 NEW cov: 11885 ft: 13978 corp: 13/831b lim: 90 exec/s: 0 rss: 69Mb L: 49/90 MS: 1 EraseBytes- 00:08:01.535 [2024-11-30 00:06:26.890188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.535 [2024-11-30 00:06:26.890218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.535 [2024-11-30 00:06:26.890250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.536 [2024-11-30 00:06:26.890267] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.536 [2024-11-30 00:06:26.890294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.536 [2024-11-30 00:06:26.890314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.536 [2024-11-30 00:06:26.890341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:01.536 [2024-11-30 00:06:26.890356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.536 #29 NEW cov: 11885 ft: 14015 corp: 14/919b lim: 90 exec/s: 29 rss: 69Mb L: 88/90 MS: 1 CMP- DE: "\377\007"- 00:08:01.536 [2024-11-30 00:06:26.940263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.536 [2024-11-30 00:06:26.940292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.536 [2024-11-30 00:06:26.940323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.536 [2024-11-30 00:06:26.940340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.536 [2024-11-30 00:06:26.940369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.536 [2024-11-30 00:06:26.940384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.536 #30 NEW cov: 11885 ft: 14043 corp: 15/981b lim: 90 exec/s: 30 rss: 69Mb L: 62/90 MS: 1 ChangeBit- 00:08:01.536 [2024-11-30 00:06:27.010471] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.536 [2024-11-30 00:06:27.010502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.536 [2024-11-30 00:06:27.010536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.536 [2024-11-30 00:06:27.010554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.536 #31 NEW cov: 11885 ft: 14074 corp: 16/1017b lim: 90 exec/s: 31 rss: 69Mb L: 36/90 MS: 1 EraseBytes- 00:08:01.536 [2024-11-30 00:06:27.060476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.536 [2024-11-30 00:06:27.060506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.795 #32 NEW cov: 11885 ft: 14935 corp: 17/1042b lim: 90 exec/s: 32 rss: 69Mb L: 25/90 MS: 1 CrossOver- 00:08:01.795 [2024-11-30 00:06:27.120704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.795 [2024-11-30 00:06:27.120733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.795 [2024-11-30 00:06:27.120768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.795 [2024-11-30 00:06:27.120785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.795 #33 NEW cov: 11885 ft: 15012 corp: 18/1086b lim: 90 exec/s: 33 rss: 69Mb L: 44/90 MS: 1 CMP- DE: "\000\224G\012T\215\323F"- 00:08:01.795 [2024-11-30 00:06:27.180963] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.795 [2024-11-30 00:06:27.180991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.795 [2024-11-30 00:06:27.181023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.795 [2024-11-30 00:06:27.181040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.795 [2024-11-30 00:06:27.181068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.795 [2024-11-30 00:06:27.181086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.795 #34 NEW cov: 11885 ft: 15050 corp: 19/1145b lim: 90 exec/s: 34 rss: 69Mb L: 59/90 MS: 1 EraseBytes- 00:08:01.795 [2024-11-30 00:06:27.230953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.795 [2024-11-30 00:06:27.230984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.795 #35 NEW cov: 11885 ft: 15128 corp: 20/1170b lim: 90 exec/s: 35 rss: 69Mb L: 25/90 MS: 1 ChangeBit- 00:08:01.795 [2024-11-30 00:06:27.301269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:01.795 [2024-11-30 00:06:27.301299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.795 [2024-11-30 00:06:27.301331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:01.795 [2024-11-30 00:06:27.301364] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.795 [2024-11-30 00:06:27.301394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:01.795 [2024-11-30 00:06:27.301410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.795 #36 NEW cov: 11885 ft: 15144 corp: 21/1230b lim: 90 exec/s: 36 rss: 69Mb L: 60/90 MS: 1 CopyPart- 00:08:02.054 [2024-11-30 00:06:27.361386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.054 [2024-11-30 00:06:27.361415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.054 [2024-11-30 00:06:27.361448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:02.054 [2024-11-30 00:06:27.361465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.054 [2024-11-30 00:06:27.361493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:02.054 [2024-11-30 00:06:27.361509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.054 #37 NEW cov: 11885 ft: 15161 corp: 22/1292b lim: 90 exec/s: 37 rss: 69Mb L: 62/90 MS: 1 ShuffleBytes- 00:08:02.054 [2024-11-30 00:06:27.421457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.055 [2024-11-30 00:06:27.421486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.055 #38 NEW cov: 11885 ft: 15177 corp: 
23/1317b lim: 90 exec/s: 38 rss: 69Mb L: 25/90 MS: 1 ChangeBinInt- 00:08:02.055 [2024-11-30 00:06:27.491759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.055 [2024-11-30 00:06:27.491788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.055 [2024-11-30 00:06:27.491820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:02.055 [2024-11-30 00:06:27.491837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.055 [2024-11-30 00:06:27.491865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:02.055 [2024-11-30 00:06:27.491880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.055 #39 NEW cov: 11885 ft: 15219 corp: 24/1376b lim: 90 exec/s: 39 rss: 69Mb L: 59/90 MS: 1 InsertRepeatedBytes- 00:08:02.055 [2024-11-30 00:06:27.551806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.055 [2024-11-30 00:06:27.551843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.055 #40 NEW cov: 11885 ft: 15233 corp: 25/1401b lim: 90 exec/s: 40 rss: 69Mb L: 25/90 MS: 1 ChangeBit- 00:08:02.055 [2024-11-30 00:06:27.601908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.055 [2024-11-30 00:06:27.601939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.314 #41 NEW cov: 11885 ft: 15251 corp: 26/1434b lim: 90 exec/s: 41 rss: 69Mb L: 
33/90 MS: 1 PersAutoDict- DE: "\000\224G\012T\215\323F"- 00:08:02.314 [2024-11-30 00:06:27.672280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.314 [2024-11-30 00:06:27.672310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.314 [2024-11-30 00:06:27.672342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:02.314 [2024-11-30 00:06:27.672359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.314 [2024-11-30 00:06:27.672387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:02.315 [2024-11-30 00:06:27.672402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.315 [2024-11-30 00:06:27.672429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:02.315 [2024-11-30 00:06:27.672444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.315 #42 NEW cov: 11885 ft: 15265 corp: 27/1509b lim: 90 exec/s: 42 rss: 69Mb L: 75/90 MS: 1 CrossOver- 00:08:02.315 [2024-11-30 00:06:27.732341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.315 [2024-11-30 00:06:27.732370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.315 [2024-11-30 00:06:27.732403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:02.315 [2024-11-30 00:06:27.732420] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.315 #43 NEW cov: 11885 ft: 15268 corp: 28/1550b lim: 90 exec/s: 43 rss: 70Mb L: 41/90 MS: 1 PersAutoDict- DE: "\000\224G\012T\215\323F"- 00:08:02.315 [2024-11-30 00:06:27.792471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.315 [2024-11-30 00:06:27.792503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.315 #44 NEW cov: 11892 ft: 15291 corp: 29/1575b lim: 90 exec/s: 44 rss: 70Mb L: 25/90 MS: 1 ShuffleBytes- 00:08:02.315 [2024-11-30 00:06:27.842641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.315 [2024-11-30 00:06:27.842672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.315 [2024-11-30 00:06:27.842705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:02.315 [2024-11-30 00:06:27.842738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.575 #45 NEW cov: 11892 ft: 15303 corp: 30/1619b lim: 90 exec/s: 22 rss: 70Mb L: 44/90 MS: 1 PersAutoDict- DE: "\000\224G\012T\215\323F"- 00:08:02.575 #45 DONE cov: 11892 ft: 15303 corp: 30/1619b lim: 90 exec/s: 22 rss: 70Mb 00:08:02.575 ###### Recommended dictionary. ###### 00:08:02.575 "\013\000" # Uses: 1 00:08:02.575 "\377\007" # Uses: 0 00:08:02.575 "\000\224G\012T\215\323F" # Uses: 3 00:08:02.575 ###### End of recommended dictionary. 
###### 00:08:02.575 Done 45 runs in 2 second(s) 00:08:02.575 00:06:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:02.575 00:06:28 -- ../common.sh@72 -- # (( i++ )) 00:08:02.575 00:06:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.575 00:06:28 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:02.575 00:06:28 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:02.575 00:06:28 -- nvmf/run.sh@24 -- # local timen=1 00:08:02.575 00:06:28 -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.575 00:06:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:02.575 00:06:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:02.575 00:06:28 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:02.575 00:06:28 -- nvmf/run.sh@29 -- # port=4421 00:08:02.575 00:06:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:02.575 00:06:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:02.575 00:06:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.575 00:06:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:02.575 [2024-11-30 00:06:28.060740] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:02.575 [2024-11-30 00:06:28.060835] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2730648 ] 00:08:02.575 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.843 [2024-11-30 00:06:28.241024] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.843 [2024-11-30 00:06:28.307085] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.843 [2024-11-30 00:06:28.307218] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.843 [2024-11-30 00:06:28.365235] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.843 [2024-11-30 00:06:28.381605] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:02.843 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.843 INFO: Seed: 3557470869 00:08:03.108 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:03.108 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:03.108 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:03.108 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.108 #2 INITED exec/s: 0 rss: 61Mb 00:08:03.108 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:03.108 This may also happen if the target rejected all inputs we tried so far 00:08:03.108 [2024-11-30 00:06:28.430057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.108 [2024-11-30 00:06:28.430089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.108 [2024-11-30 00:06:28.430157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.108 [2024-11-30 00:06:28.430178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.367 NEW_FUNC[1/672]: 0x45fc08 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:03.367 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.367 #18 NEW cov: 11637 ft: 11638 corp: 2/26b lim: 50 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:03.367 [2024-11-30 00:06:28.731068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.367 [2024-11-30 00:06:28.731101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.731167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.367 [2024-11-30 00:06:28.731188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.731254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.367 [2024-11-30 
00:06:28.731274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.731340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.367 [2024-11-30 00:06:28.731358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.367 #24 NEW cov: 11752 ft: 12419 corp: 3/70b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:08:03.367 [2024-11-30 00:06:28.781172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.367 [2024-11-30 00:06:28.781201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.781258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.367 [2024-11-30 00:06:28.781278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.781347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.367 [2024-11-30 00:06:28.781368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.781431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.367 [2024-11-30 00:06:28.781450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.367 #25 NEW cov: 11758 ft: 12726 corp: 4/114b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 ShuffleBytes- 00:08:03.367 
[2024-11-30 00:06:28.821261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.367 [2024-11-30 00:06:28.821290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.821348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.367 [2024-11-30 00:06:28.821368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.821434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.367 [2024-11-30 00:06:28.821453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.821518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.367 [2024-11-30 00:06:28.821536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.367 #26 NEW cov: 11843 ft: 13038 corp: 5/159b lim: 50 exec/s: 0 rss: 69Mb L: 45/45 MS: 1 InsertByte- 00:08:03.367 [2024-11-30 00:06:28.861080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.367 [2024-11-30 00:06:28.861108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.861181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.367 [2024-11-30 00:06:28.861204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:03.367 #27 NEW cov: 11843 ft: 13220 corp: 6/188b lim: 50 exec/s: 0 rss: 69Mb L: 29/45 MS: 1 EraseBytes- 00:08:03.367 [2024-11-30 00:06:28.901170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.367 [2024-11-30 00:06:28.901198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.367 [2024-11-30 00:06:28.901270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.367 [2024-11-30 00:06:28.901292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.367 #28 NEW cov: 11843 ft: 13328 corp: 7/216b lim: 50 exec/s: 0 rss: 69Mb L: 28/45 MS: 1 InsertRepeatedBytes- 00:08:03.627 [2024-11-30 00:06:28.941300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.627 [2024-11-30 00:06:28.941328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.627 [2024-11-30 00:06:28.941398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.627 [2024-11-30 00:06:28.941422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.627 #29 NEW cov: 11843 ft: 13404 corp: 8/242b lim: 50 exec/s: 0 rss: 69Mb L: 26/45 MS: 1 InsertByte- 00:08:03.627 [2024-11-30 00:06:28.981729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.627 [2024-11-30 00:06:28.981758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.627 [2024-11-30 
00:06:28.981819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.627 [2024-11-30 00:06:28.981841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.627 [2024-11-30 00:06:28.981906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.627 [2024-11-30 00:06:28.981930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.627 [2024-11-30 00:06:28.981996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.627 [2024-11-30 00:06:28.982014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.627 #30 NEW cov: 11843 ft: 13428 corp: 9/286b lim: 50 exec/s: 0 rss: 69Mb L: 44/45 MS: 1 ChangeBinInt- 00:08:03.628 [2024-11-30 00:06:29.021836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.628 [2024-11-30 00:06:29.021866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.021930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.628 [2024-11-30 00:06:29.021952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.022016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.628 [2024-11-30 00:06:29.022035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.022101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.628 [2024-11-30 00:06:29.022125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.628 #31 NEW cov: 11843 ft: 13500 corp: 10/330b lim: 50 exec/s: 0 rss: 69Mb L: 44/45 MS: 1 ShuffleBytes- 00:08:03.628 [2024-11-30 00:06:29.061965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.628 [2024-11-30 00:06:29.061994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.062060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.628 [2024-11-30 00:06:29.062081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.062150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.628 [2024-11-30 00:06:29.062171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.062240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.628 [2024-11-30 00:06:29.062261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.628 #32 NEW cov: 11843 ft: 13546 corp: 11/375b lim: 50 exec/s: 0 rss: 69Mb L: 45/45 MS: 1 ShuffleBytes- 00:08:03.628 [2024-11-30 00:06:29.102101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:03.628 [2024-11-30 00:06:29.102129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.102194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.628 [2024-11-30 00:06:29.102217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.102284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.628 [2024-11-30 00:06:29.102305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.102369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.628 [2024-11-30 00:06:29.102386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.628 #33 NEW cov: 11843 ft: 13561 corp: 12/420b lim: 50 exec/s: 0 rss: 69Mb L: 45/45 MS: 1 ChangeBit- 00:08:03.628 [2024-11-30 00:06:29.141892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.628 [2024-11-30 00:06:29.141920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.141988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.628 [2024-11-30 00:06:29.142011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.628 #34 NEW cov: 11843 ft: 13661 corp: 13/448b lim: 50 exec/s: 0 rss: 69Mb L: 28/45 MS: 1 
CrossOver- 00:08:03.628 [2024-11-30 00:06:29.182367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.628 [2024-11-30 00:06:29.182395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.182462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.628 [2024-11-30 00:06:29.182485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.182553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.628 [2024-11-30 00:06:29.182573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.628 [2024-11-30 00:06:29.182647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.628 [2024-11-30 00:06:29.182666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.887 #35 NEW cov: 11843 ft: 13686 corp: 14/496b lim: 50 exec/s: 0 rss: 69Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:03.887 [2024-11-30 00:06:29.222601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.887 [2024-11-30 00:06:29.222629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.222689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.887 [2024-11-30 00:06:29.222710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.222776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.887 [2024-11-30 00:06:29.222794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.222858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.887 [2024-11-30 00:06:29.222876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.222941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:03.887 [2024-11-30 00:06:29.222959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:03.887 #36 NEW cov: 11843 ft: 13745 corp: 15/546b lim: 50 exec/s: 0 rss: 69Mb L: 50/50 MS: 1 CopyPart- 00:08:03.887 [2024-11-30 00:06:29.262527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.887 [2024-11-30 00:06:29.262557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.262629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.887 [2024-11-30 00:06:29.262651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.262718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.887 [2024-11-30 00:06:29.262736] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.262801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.887 [2024-11-30 00:06:29.262820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.887 #37 NEW cov: 11843 ft: 13762 corp: 16/595b lim: 50 exec/s: 0 rss: 69Mb L: 49/50 MS: 1 CopyPart- 00:08:03.887 [2024-11-30 00:06:29.302673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.887 [2024-11-30 00:06:29.302703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.302769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.887 [2024-11-30 00:06:29.302796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.302862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.887 [2024-11-30 00:06:29.302885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.302954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.887 [2024-11-30 00:06:29.302972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.887 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:03.887 #38 NEW cov: 11866 ft: 
13808 corp: 17/638b lim: 50 exec/s: 0 rss: 69Mb L: 43/50 MS: 1 EraseBytes- 00:08:03.887 [2024-11-30 00:06:29.352858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.887 [2024-11-30 00:06:29.352888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.352954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.887 [2024-11-30 00:06:29.352975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.353040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.887 [2024-11-30 00:06:29.353060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.353124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.887 [2024-11-30 00:06:29.353143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.887 #39 NEW cov: 11866 ft: 13824 corp: 18/683b lim: 50 exec/s: 0 rss: 69Mb L: 45/50 MS: 1 InsertRepeatedBytes- 00:08:03.887 [2024-11-30 00:06:29.392914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.887 [2024-11-30 00:06:29.392943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.887 [2024-11-30 00:06:29.392996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.888 [2024-11-30 00:06:29.393017] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.888 [2024-11-30 00:06:29.393083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.888 [2024-11-30 00:06:29.393102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.888 [2024-11-30 00:06:29.393166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.888 [2024-11-30 00:06:29.393184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.888 #40 NEW cov: 11866 ft: 13848 corp: 19/728b lim: 50 exec/s: 0 rss: 69Mb L: 45/50 MS: 1 ChangeBit- 00:08:03.888 [2024-11-30 00:06:29.432746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.888 [2024-11-30 00:06:29.432774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.888 [2024-11-30 00:06:29.432844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.888 [2024-11-30 00:06:29.432867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.147 #41 NEW cov: 11866 ft: 13884 corp: 20/750b lim: 50 exec/s: 41 rss: 69Mb L: 22/50 MS: 1 EraseBytes- 00:08:04.147 [2024-11-30 00:06:29.473168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.147 [2024-11-30 00:06:29.473196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.473260] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.147 [2024-11-30 00:06:29.473281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.473346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.147 [2024-11-30 00:06:29.473365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.473430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.147 [2024-11-30 00:06:29.473449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.147 #42 NEW cov: 11866 ft: 13891 corp: 21/794b lim: 50 exec/s: 42 rss: 69Mb L: 44/50 MS: 1 CMP- DE: "\004\000\000\000"- 00:08:04.147 [2024-11-30 00:06:29.512982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.147 [2024-11-30 00:06:29.513010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.513081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.147 [2024-11-30 00:06:29.513103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.147 #43 NEW cov: 11866 ft: 13905 corp: 22/822b lim: 50 exec/s: 43 rss: 69Mb L: 28/50 MS: 1 ShuffleBytes- 00:08:04.147 [2024-11-30 00:06:29.553115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.147 [2024-11-30 00:06:29.553145] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.553216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.147 [2024-11-30 00:06:29.553238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.147 #44 NEW cov: 11866 ft: 13958 corp: 23/850b lim: 50 exec/s: 44 rss: 69Mb L: 28/50 MS: 1 ChangeBit- 00:08:04.147 [2024-11-30 00:06:29.593502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.147 [2024-11-30 00:06:29.593531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.593596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.147 [2024-11-30 00:06:29.593623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.593689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.147 [2024-11-30 00:06:29.593710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.593774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.147 [2024-11-30 00:06:29.593792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.147 #45 NEW cov: 11866 ft: 13963 corp: 24/891b lim: 50 exec/s: 45 rss: 69Mb L: 41/50 MS: 1 InsertRepeatedBytes- 00:08:04.147 [2024-11-30 
00:06:29.633617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.147 [2024-11-30 00:06:29.633645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.633706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.147 [2024-11-30 00:06:29.633728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.633792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.147 [2024-11-30 00:06:29.633810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.633873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.147 [2024-11-30 00:06:29.633892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.147 #46 NEW cov: 11866 ft: 13973 corp: 25/936b lim: 50 exec/s: 46 rss: 70Mb L: 45/50 MS: 1 ChangeBit- 00:08:04.147 [2024-11-30 00:06:29.673436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.147 [2024-11-30 00:06:29.673465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.147 [2024-11-30 00:06:29.673535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.147 [2024-11-30 00:06:29.673558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:04.147 #47 NEW cov: 11866 ft: 13988 corp: 26/963b lim: 50 exec/s: 47 rss: 70Mb L: 27/50 MS: 1 EraseBytes- 00:08:04.407 [2024-11-30 00:06:29.713852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.407 [2024-11-30 00:06:29.713881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.713948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.407 [2024-11-30 00:06:29.713970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.714035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.407 [2024-11-30 00:06:29.714054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.714119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.407 [2024-11-30 00:06:29.714137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.407 #48 NEW cov: 11866 ft: 14025 corp: 27/1007b lim: 50 exec/s: 48 rss: 70Mb L: 44/50 MS: 1 ShuffleBytes- 00:08:04.407 [2024-11-30 00:06:29.753978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.407 [2024-11-30 00:06:29.754007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.754073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 
00:08:04.407 [2024-11-30 00:06:29.754093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.754157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.407 [2024-11-30 00:06:29.754176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.754247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.407 [2024-11-30 00:06:29.754266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.407 #49 NEW cov: 11866 ft: 14027 corp: 28/1052b lim: 50 exec/s: 49 rss: 70Mb L: 45/50 MS: 1 ShuffleBytes- 00:08:04.407 [2024-11-30 00:06:29.793942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.407 [2024-11-30 00:06:29.793970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.794041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.407 [2024-11-30 00:06:29.794062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.794129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.407 [2024-11-30 00:06:29.794148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.407 #50 NEW cov: 11866 ft: 14284 corp: 29/1083b lim: 50 exec/s: 50 rss: 70Mb L: 31/50 MS: 1 
EraseBytes- 00:08:04.407 [2024-11-30 00:06:29.834217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.407 [2024-11-30 00:06:29.834244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.834307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.407 [2024-11-30 00:06:29.834329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.834395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.407 [2024-11-30 00:06:29.834416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.834480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.407 [2024-11-30 00:06:29.834500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.407 #51 NEW cov: 11866 ft: 14294 corp: 30/1130b lim: 50 exec/s: 51 rss: 70Mb L: 47/50 MS: 1 PersAutoDict- DE: "\004\000\000\000"- 00:08:04.407 [2024-11-30 00:06:29.874314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.407 [2024-11-30 00:06:29.874343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.874410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.407 [2024-11-30 00:06:29.874431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.874496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.407 [2024-11-30 00:06:29.874514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.874578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.407 [2024-11-30 00:06:29.874601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.407 #52 NEW cov: 11866 ft: 14308 corp: 31/1176b lim: 50 exec/s: 52 rss: 70Mb L: 46/50 MS: 1 InsertByte- 00:08:04.407 [2024-11-30 00:06:29.914135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.407 [2024-11-30 00:06:29.914167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.914237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.407 [2024-11-30 00:06:29.914258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.407 #53 NEW cov: 11866 ft: 14326 corp: 32/1202b lim: 50 exec/s: 53 rss: 70Mb L: 26/50 MS: 1 ShuffleBytes- 00:08:04.407 [2024-11-30 00:06:29.954270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.407 [2024-11-30 00:06:29.954299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.407 [2024-11-30 00:06:29.954368] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.407 [2024-11-30 00:06:29.954391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.667 #54 NEW cov: 11866 ft: 14342 corp: 33/1228b lim: 50 exec/s: 54 rss: 70Mb L: 26/50 MS: 1 ChangeBit- 00:08:04.667 [2024-11-30 00:06:29.994669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.667 [2024-11-30 00:06:29.994698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:29.994764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.667 [2024-11-30 00:06:29.994785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:29.994850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.667 [2024-11-30 00:06:29.994868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:29.994934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.667 [2024-11-30 00:06:29.994953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.667 #55 NEW cov: 11866 ft: 14351 corp: 34/1269b lim: 50 exec/s: 55 rss: 70Mb L: 41/50 MS: 1 EraseBytes- 00:08:04.667 [2024-11-30 00:06:30.044541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.667 [2024-11-30 00:06:30.044574] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.044646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.667 [2024-11-30 00:06:30.044668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.667 #56 NEW cov: 11866 ft: 14458 corp: 35/1297b lim: 50 exec/s: 56 rss: 70Mb L: 28/50 MS: 1 ChangeByte- 00:08:04.667 [2024-11-30 00:06:30.084818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.667 [2024-11-30 00:06:30.084852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.084918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.667 [2024-11-30 00:06:30.084940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.085006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.667 [2024-11-30 00:06:30.085029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.667 #57 NEW cov: 11866 ft: 14459 corp: 36/1329b lim: 50 exec/s: 57 rss: 70Mb L: 32/50 MS: 1 InsertByte- 00:08:04.667 [2024-11-30 00:06:30.125228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.667 [2024-11-30 00:06:30.125261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.125328] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.667 [2024-11-30 00:06:30.125349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.125426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.667 [2024-11-30 00:06:30.125445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.125510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.667 [2024-11-30 00:06:30.125528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.125601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:04.667 [2024-11-30 00:06:30.125621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.667 #58 NEW cov: 11866 ft: 14552 corp: 37/1379b lim: 50 exec/s: 58 rss: 70Mb L: 50/50 MS: 1 ChangeByte- 00:08:04.667 [2024-11-30 00:06:30.165157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.667 [2024-11-30 00:06:30.165187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.165255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.667 [2024-11-30 00:06:30.165277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:04.667 [2024-11-30 00:06:30.165344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.667 [2024-11-30 00:06:30.165363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.165428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.667 [2024-11-30 00:06:30.165447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.667 #59 NEW cov: 11866 ft: 14576 corp: 38/1427b lim: 50 exec/s: 59 rss: 70Mb L: 48/50 MS: 1 PersAutoDict- DE: "\004\000\000\000"- 00:08:04.667 [2024-11-30 00:06:30.205291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.667 [2024-11-30 00:06:30.205320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.205387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.667 [2024-11-30 00:06:30.205409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.205474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.667 [2024-11-30 00:06:30.205493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.667 [2024-11-30 00:06:30.205558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.667 [2024-11-30 00:06:30.205583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.927 #60 NEW cov: 11866 ft: 14587 corp: 39/1468b lim: 50 exec/s: 60 rss: 70Mb L: 41/50 MS: 1 ChangeBit- 00:08:04.927 [2024-11-30 00:06:30.245419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.927 [2024-11-30 00:06:30.245447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.927 [2024-11-30 00:06:30.245511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.927 [2024-11-30 00:06:30.245532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.927 [2024-11-30 00:06:30.245603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.927 [2024-11-30 00:06:30.245622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.927 [2024-11-30 00:06:30.245689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.927 [2024-11-30 00:06:30.245708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.927 #61 NEW cov: 11866 ft: 14597 corp: 40/1512b lim: 50 exec/s: 61 rss: 70Mb L: 44/50 MS: 1 ShuffleBytes- 00:08:04.927 [2024-11-30 00:06:30.285529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.927 [2024-11-30 00:06:30.285559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.927 [2024-11-30 00:06:30.285634] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.927 [2024-11-30 00:06:30.285659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.927 [2024-11-30 00:06:30.285725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.927 [2024-11-30 00:06:30.285755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.927 [2024-11-30 00:06:30.285818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.927 [2024-11-30 00:06:30.285852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.927 #62 NEW cov: 11866 ft: 14619 corp: 41/1554b lim: 50 exec/s: 62 rss: 70Mb L: 42/50 MS: 1 InsertByte- 00:08:04.927 [2024-11-30 00:06:30.325675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.927 [2024-11-30 00:06:30.325705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.927 [2024-11-30 00:06:30.325771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.927 [2024-11-30 00:06:30.325793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.927 [2024-11-30 00:06:30.325861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.927 [2024-11-30 00:06:30.325886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.928 [2024-11-30 00:06:30.325954] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.928 [2024-11-30 00:06:30.325974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.928 #63 NEW cov: 11866 ft: 14629 corp: 42/1597b lim: 50 exec/s: 63 rss: 70Mb L: 43/50 MS: 1 CopyPart- 00:08:04.928 [2024-11-30 00:06:30.365788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.928 [2024-11-30 00:06:30.365817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.928 [2024-11-30 00:06:30.365885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.928 [2024-11-30 00:06:30.365906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.928 [2024-11-30 00:06:30.365972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.928 [2024-11-30 00:06:30.365991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.928 [2024-11-30 00:06:30.366058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.928 [2024-11-30 00:06:30.366077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.928 #64 NEW cov: 11866 ft: 14644 corp: 43/1638b lim: 50 exec/s: 64 rss: 70Mb L: 41/50 MS: 1 ShuffleBytes- 00:08:04.928 [2024-11-30 00:06:30.405905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:04.928 [2024-11-30 00:06:30.405934] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.928 [2024-11-30 00:06:30.406000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:04.928 [2024-11-30 00:06:30.406023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.928 [2024-11-30 00:06:30.406090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:04.928 [2024-11-30 00:06:30.406108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.928 [2024-11-30 00:06:30.406174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:04.928 [2024-11-30 00:06:30.406193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.928 #65 NEW cov: 11866 ft: 14649 corp: 44/1679b lim: 50 exec/s: 32 rss: 70Mb L: 41/50 MS: 1 CMP- DE: "\001\224G\013\327o\370\364"- 00:08:04.928 #65 DONE cov: 11866 ft: 14649 corp: 44/1679b lim: 50 exec/s: 32 rss: 70Mb 00:08:04.928 ###### Recommended dictionary. ###### 00:08:04.928 "\004\000\000\000" # Uses: 2 00:08:04.928 "\001\224G\013\327o\370\364" # Uses: 0 00:08:04.928 ###### End of recommended dictionary. 
###### 00:08:04.928 Done 65 runs in 2 second(s) 00:08:05.188 00:06:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:05.188 00:06:30 -- ../common.sh@72 -- # (( i++ )) 00:08:05.188 00:06:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.188 00:06:30 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:05.188 00:06:30 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:05.188 00:06:30 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.188 00:06:30 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.188 00:06:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:05.188 00:06:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:05.189 00:06:30 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:05.189 00:06:30 -- nvmf/run.sh@29 -- # port=4422 00:08:05.189 00:06:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:05.189 00:06:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:05.189 00:06:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.189 00:06:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:05.189 [2024-11-30 00:06:30.591732] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:05.189 [2024-11-30 00:06:30.591801] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2731069 ] 00:08:05.189 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.448 [2024-11-30 00:06:30.782651] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.448 [2024-11-30 00:06:30.848753] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.448 [2024-11-30 00:06:30.848876] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.448 [2024-11-30 00:06:30.907587] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.448 [2024-11-30 00:06:30.924000] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:05.448 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.448 INFO: Seed: 1806521788 00:08:05.448 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:05.448 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:05.448 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:05.448 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.448 #2 INITED exec/s: 0 rss: 61Mb 00:08:05.449 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.449 This may also happen if the target rejected all inputs we tried so far 00:08:05.449 [2024-11-30 00:06:30.969016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.449 [2024-11-30 00:06:30.969048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.024 NEW_FUNC[1/672]: 0x461ed8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:06.024 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.024 #6 NEW cov: 11665 ft: 11666 corp: 2/24b lim: 85 exec/s: 0 rss: 68Mb L: 23/23 MS: 4 ShuffleBytes-ChangeBit-ChangeBinInt-InsertRepeatedBytes- 00:08:06.024 [2024-11-30 00:06:31.300607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.024 [2024-11-30 00:06:31.300661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.024 #7 NEW cov: 11778 ft: 12449 corp: 3/47b lim: 85 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 CopyPart- 00:08:06.024 [2024-11-30 00:06:31.350545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.024 [2024-11-30 00:06:31.350573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.024 #8 NEW cov: 11784 ft: 12665 corp: 4/70b lim: 85 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 ChangeBinInt- 00:08:06.024 [2024-11-30 00:06:31.400700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.024 [2024-11-30 00:06:31.400731] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.024 #14 NEW cov: 11869 ft: 13041 corp: 5/93b lim: 85 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 ShuffleBytes- 00:08:06.024 [2024-11-30 00:06:31.440756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.024 [2024-11-30 00:06:31.440786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.024 #15 NEW cov: 11869 ft: 13093 corp: 6/117b lim: 85 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 InsertByte- 00:08:06.024 [2024-11-30 00:06:31.480936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.024 [2024-11-30 00:06:31.480968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.024 #16 NEW cov: 11869 ft: 13164 corp: 7/141b lim: 85 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 InsertByte- 00:08:06.024 [2024-11-30 00:06:31.520964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.024 [2024-11-30 00:06:31.520992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.024 #22 NEW cov: 11869 ft: 13257 corp: 8/165b lim: 85 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ChangeByte- 00:08:06.024 [2024-11-30 00:06:31.561154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.024 [2024-11-30 00:06:31.561182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.283 #23 NEW cov: 11869 ft: 13345 corp: 9/184b lim: 85 exec/s: 0 rss: 68Mb L: 19/24 MS: 1 EraseBytes- 00:08:06.283 [2024-11-30 00:06:31.601378] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.283 [2024-11-30 00:06:31.601411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.283 #24 NEW cov: 11869 ft: 13392 corp: 10/208b lim: 85 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ChangeBit- 00:08:06.283 [2024-11-30 00:06:31.641412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.283 [2024-11-30 00:06:31.641445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.283 #25 NEW cov: 11869 ft: 13441 corp: 11/231b lim: 85 exec/s: 0 rss: 68Mb L: 23/24 MS: 1 ChangeBit- 00:08:06.283 [2024-11-30 00:06:31.681615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.283 [2024-11-30 00:06:31.681662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.283 #26 NEW cov: 11869 ft: 13467 corp: 12/254b lim: 85 exec/s: 0 rss: 68Mb L: 23/24 MS: 1 ShuffleBytes- 00:08:06.283 [2024-11-30 00:06:31.722333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.283 [2024-11-30 00:06:31.722365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.283 [2024-11-30 00:06:31.722461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:06.283 [2024-11-30 00:06:31.722485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.283 [2024-11-30 00:06:31.722604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:06.283 [2024-11-30 00:06:31.722626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.283 [2024-11-30 00:06:31.722738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:06.283 [2024-11-30 00:06:31.722759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.283 #27 NEW cov: 11869 ft: 14359 corp: 13/322b lim: 85 exec/s: 0 rss: 68Mb L: 68/68 MS: 1 InsertRepeatedBytes- 00:08:06.283 [2024-11-30 00:06:31.771844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.283 [2024-11-30 00:06:31.771870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.283 #28 NEW cov: 11869 ft: 14397 corp: 14/346b lim: 85 exec/s: 0 rss: 68Mb L: 24/68 MS: 1 ChangeByte- 00:08:06.283 [2024-11-30 00:06:31.811947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.283 [2024-11-30 00:06:31.811973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.283 #29 NEW cov: 11869 ft: 14441 corp: 15/369b lim: 85 exec/s: 0 rss: 68Mb L: 23/68 MS: 1 ShuffleBytes- 00:08:06.543 [2024-11-30 00:06:31.851980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.543 [2024-11-30 00:06:31.852012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.543 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.543 #30 NEW cov: 
11892 ft: 14477 corp: 16/392b lim: 85 exec/s: 0 rss: 69Mb L: 23/68 MS: 1 ShuffleBytes- 00:08:06.543 [2024-11-30 00:06:31.892590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.543 [2024-11-30 00:06:31.892637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.543 [2024-11-30 00:06:31.892748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:06.543 [2024-11-30 00:06:31.892767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.543 [2024-11-30 00:06:31.892890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:06.543 [2024-11-30 00:06:31.892907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.543 #31 NEW cov: 11892 ft: 14786 corp: 17/452b lim: 85 exec/s: 0 rss: 69Mb L: 60/68 MS: 1 InsertRepeatedBytes- 00:08:06.543 [2024-11-30 00:06:31.943146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.543 [2024-11-30 00:06:31.943176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.543 [2024-11-30 00:06:31.943288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:06.543 [2024-11-30 00:06:31.943309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.543 [2024-11-30 00:06:31.943422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:06.543 [2024-11-30 
00:06:31.943448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.543 [2024-11-30 00:06:31.943559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:06.543 [2024-11-30 00:06:31.943583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.543 #32 NEW cov: 11892 ft: 14831 corp: 18/520b lim: 85 exec/s: 32 rss: 69Mb L: 68/68 MS: 1 ShuffleBytes- 00:08:06.543 [2024-11-30 00:06:31.992404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.543 [2024-11-30 00:06:31.992433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.543 #33 NEW cov: 11892 ft: 14834 corp: 19/544b lim: 85 exec/s: 33 rss: 69Mb L: 24/68 MS: 1 ChangeBinInt- 00:08:06.543 [2024-11-30 00:06:32.032567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.543 [2024-11-30 00:06:32.032596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.543 #34 NEW cov: 11892 ft: 14842 corp: 20/577b lim: 85 exec/s: 34 rss: 69Mb L: 33/68 MS: 1 CrossOver- 00:08:06.543 [2024-11-30 00:06:32.072779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.543 [2024-11-30 00:06:32.072807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.801 #35 NEW cov: 11892 ft: 14857 corp: 21/600b lim: 85 exec/s: 35 rss: 69Mb L: 23/68 MS: 1 ShuffleBytes- 00:08:06.801 [2024-11-30 00:06:32.112841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.801 [2024-11-30 00:06:32.112866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.801 #36 NEW cov: 11892 ft: 14871 corp: 22/623b lim: 85 exec/s: 36 rss: 69Mb L: 23/68 MS: 1 ChangeBit- 00:08:06.801 [2024-11-30 00:06:32.152981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.801 [2024-11-30 00:06:32.153006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.801 #37 NEW cov: 11892 ft: 14907 corp: 23/646b lim: 85 exec/s: 37 rss: 69Mb L: 23/68 MS: 1 ChangeBinInt- 00:08:06.801 [2024-11-30 00:06:32.193121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.801 [2024-11-30 00:06:32.193146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.801 #38 NEW cov: 11892 ft: 14915 corp: 24/670b lim: 85 exec/s: 38 rss: 69Mb L: 24/68 MS: 1 CopyPart- 00:08:06.801 [2024-11-30 00:06:32.233104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.802 [2024-11-30 00:06:32.233136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.802 #39 NEW cov: 11892 ft: 14935 corp: 25/693b lim: 85 exec/s: 39 rss: 69Mb L: 23/68 MS: 1 ChangeByte- 00:08:06.802 [2024-11-30 00:06:32.273249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.802 [2024-11-30 00:06:32.273274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.802 #40 NEW cov: 11892 ft: 14963 
corp: 26/717b lim: 85 exec/s: 40 rss: 69Mb L: 24/68 MS: 1 CrossOver- 00:08:06.802 [2024-11-30 00:06:32.313453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.802 [2024-11-30 00:06:32.313478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.802 #46 NEW cov: 11892 ft: 14968 corp: 27/734b lim: 85 exec/s: 46 rss: 70Mb L: 17/68 MS: 1 EraseBytes- 00:08:06.802 [2024-11-30 00:06:32.353552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:06.802 [2024-11-30 00:06:32.353578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.060 #47 NEW cov: 11892 ft: 14981 corp: 28/757b lim: 85 exec/s: 47 rss: 70Mb L: 23/68 MS: 1 ChangeBinInt- 00:08:07.060 [2024-11-30 00:06:32.393672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.060 [2024-11-30 00:06:32.393706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.060 #48 NEW cov: 11892 ft: 14987 corp: 29/780b lim: 85 exec/s: 48 rss: 70Mb L: 23/68 MS: 1 ChangeBit- 00:08:07.060 [2024-11-30 00:06:32.434055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.060 [2024-11-30 00:06:32.434087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.060 [2024-11-30 00:06:32.434205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:07.060 [2024-11-30 00:06:32.434229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:07.061 #49 NEW cov: 11892 ft: 15273 corp: 30/823b lim: 85 exec/s: 49 rss: 70Mb L: 43/68 MS: 1 CrossOver- 00:08:07.061 [2024-11-30 00:06:32.474620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.061 [2024-11-30 00:06:32.474651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.061 [2024-11-30 00:06:32.474735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:07.061 [2024-11-30 00:06:32.474758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.061 [2024-11-30 00:06:32.474869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:07.061 [2024-11-30 00:06:32.474893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.061 [2024-11-30 00:06:32.475016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:07.061 [2024-11-30 00:06:32.475038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.061 #53 NEW cov: 11892 ft: 15293 corp: 31/895b lim: 85 exec/s: 53 rss: 70Mb L: 72/72 MS: 4 EraseBytes-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:07.061 [2024-11-30 00:06:32.514034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.061 [2024-11-30 00:06:32.514060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.061 #54 NEW cov: 11892 ft: 15302 corp: 32/918b lim: 85 exec/s: 54 rss: 70Mb L: 23/72 MS: 1 
ChangeBinInt- 00:08:07.061 [2024-11-30 00:06:32.554191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.061 [2024-11-30 00:06:32.554217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.061 #55 NEW cov: 11892 ft: 15305 corp: 33/941b lim: 85 exec/s: 55 rss: 70Mb L: 23/72 MS: 1 ChangeBit- 00:08:07.061 [2024-11-30 00:06:32.594228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.061 [2024-11-30 00:06:32.594256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.061 #56 NEW cov: 11892 ft: 15314 corp: 34/970b lim: 85 exec/s: 56 rss: 70Mb L: 29/72 MS: 1 InsertRepeatedBytes- 00:08:07.320 [2024-11-30 00:06:32.634648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.320 [2024-11-30 00:06:32.634685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.320 [2024-11-30 00:06:32.634810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:07.320 [2024-11-30 00:06:32.634832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.320 #58 NEW cov: 11892 ft: 15390 corp: 35/1010b lim: 85 exec/s: 58 rss: 70Mb L: 40/72 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:07.320 [2024-11-30 00:06:32.674549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.320 [2024-11-30 00:06:32.674578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:07.320 #59 NEW cov: 11892 ft: 15418 corp: 36/1034b lim: 85 exec/s: 59 rss: 70Mb L: 24/72 MS: 1 ShuffleBytes- 00:08:07.320 [2024-11-30 00:06:32.714707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.320 [2024-11-30 00:06:32.714734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.320 #60 NEW cov: 11892 ft: 15434 corp: 37/1053b lim: 85 exec/s: 60 rss: 70Mb L: 19/72 MS: 1 ShuffleBytes- 00:08:07.320 [2024-11-30 00:06:32.755538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.320 [2024-11-30 00:06:32.755569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.320 [2024-11-30 00:06:32.755639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:07.320 [2024-11-30 00:06:32.755664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.320 [2024-11-30 00:06:32.755775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:07.320 [2024-11-30 00:06:32.755798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.320 [2024-11-30 00:06:32.755919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:07.320 [2024-11-30 00:06:32.755943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.320 #61 NEW cov: 11892 ft: 15436 corp: 38/1133b lim: 85 exec/s: 61 rss: 70Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:08:07.320 [2024-11-30 
00:06:32.794950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.320 [2024-11-30 00:06:32.794980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.320 #62 NEW cov: 11892 ft: 15463 corp: 39/1150b lim: 85 exec/s: 62 rss: 70Mb L: 17/80 MS: 1 ChangeBinInt- 00:08:07.320 [2024-11-30 00:06:32.835336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.320 [2024-11-30 00:06:32.835363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.320 [2024-11-30 00:06:32.835479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:07.320 [2024-11-30 00:06:32.835502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.320 #63 NEW cov: 11892 ft: 15471 corp: 40/1190b lim: 85 exec/s: 63 rss: 70Mb L: 40/80 MS: 1 ChangeByte- 00:08:07.580 [2024-11-30 00:06:32.885199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.580 [2024-11-30 00:06:32.885227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.580 #64 NEW cov: 11892 ft: 15475 corp: 41/1209b lim: 85 exec/s: 64 rss: 70Mb L: 19/80 MS: 1 ChangeByte- 00:08:07.580 [2024-11-30 00:06:32.926007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.580 [2024-11-30 00:06:32.926041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.580 [2024-11-30 00:06:32.926161] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:07.580 [2024-11-30 00:06:32.926188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.580 [2024-11-30 00:06:32.926299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:07.580 [2024-11-30 00:06:32.926320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.580 [2024-11-30 00:06:32.926433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:07.580 [2024-11-30 00:06:32.926453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.580 [2024-11-30 00:06:32.966104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.580 [2024-11-30 00:06:32.966136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.580 [2024-11-30 00:06:32.966235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:07.580 [2024-11-30 00:06:32.966258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.580 [2024-11-30 00:06:32.966391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:07.580 [2024-11-30 00:06:32.966411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.580 [2024-11-30 00:06:32.966528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:3 nsid:0 00:08:07.580 [2024-11-30 00:06:32.966553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.580 #66 NEW cov: 11892 ft: 15498 corp: 42/1293b lim: 85 exec/s: 33 rss: 70Mb L: 84/84 MS: 2 InsertRepeatedBytes-ShuffleBytes- 00:08:07.580 #66 DONE cov: 11892 ft: 15498 corp: 42/1293b lim: 85 exec/s: 33 rss: 70Mb 00:08:07.580 Done 66 runs in 2 second(s) 00:08:07.580 00:06:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:07.580 00:06:33 -- ../common.sh@72 -- # (( i++ )) 00:08:07.580 00:06:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.580 00:06:33 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:07.580 00:06:33 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:07.580 00:06:33 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.580 00:06:33 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.580 00:06:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:07.580 00:06:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:07.580 00:06:33 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:07.580 00:06:33 -- nvmf/run.sh@29 -- # port=4423 00:08:07.580 00:06:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:07.580 00:06:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:07.580 00:06:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.580 00:06:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 
traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:07.839 [2024-11-30 00:06:33.152301] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:07.839 [2024-11-30 00:06:33.152368] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2731606 ] 00:08:07.839 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.839 [2024-11-30 00:06:33.328028] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.839 [2024-11-30 00:06:33.390526] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:07.839 [2024-11-30 00:06:33.390674] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.097 [2024-11-30 00:06:33.448762] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.097 [2024-11-30 00:06:33.465127] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:08.097 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.097 INFO: Seed: 51548380 00:08:08.097 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:08.097 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:08.097 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:08.097 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.097 #2 INITED exec/s: 0 rss: 60Mb 00:08:08.097 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.097 This may also happen if the target rejected all inputs we tried so far 00:08:08.097 [2024-11-30 00:06:33.513842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.097 [2024-11-30 00:06:33.513874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.356 NEW_FUNC[1/669]: 0x465118 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:08.356 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.356 #5 NEW cov: 11564 ft: 11597 corp: 2/10b lim: 25 exec/s: 0 rss: 68Mb L: 9/9 MS: 3 CrossOver-ChangeByte-CMP- DE: "\377\223G\015\245\273\314\342"- 00:08:08.356 [2024-11-30 00:06:33.814886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.356 [2024-11-30 00:06:33.814923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.356 [2024-11-30 00:06:33.814991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.356 [2024-11-30 00:06:33.815014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.356 [2024-11-30 00:06:33.815079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:08.356 [2024-11-30 00:06:33.815114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.356 [2024-11-30 00:06:33.815182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 
nsid:0 00:08:08.356 [2024-11-30 00:06:33.815205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.356 NEW_FUNC[1/2]: 0xebaa48 in spdk_ring_dequeue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:415 00:08:08.356 NEW_FUNC[2/2]: 0x194d438 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:08:08.356 #8 NEW cov: 11711 ft: 12651 corp: 3/30b lim: 25 exec/s: 0 rss: 68Mb L: 20/20 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:08.356 [2024-11-30 00:06:33.854689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.356 [2024-11-30 00:06:33.854719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.356 [2024-11-30 00:06:33.854787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.356 [2024-11-30 00:06:33.854810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.356 #12 NEW cov: 11717 ft: 13050 corp: 4/40b lim: 25 exec/s: 0 rss: 68Mb L: 10/20 MS: 4 ChangeBit-ChangeByte-InsertByte-PersAutoDict- DE: "\377\223G\015\245\273\314\342"- 00:08:08.356 [2024-11-30 00:06:33.894820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.356 [2024-11-30 00:06:33.894849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.356 [2024-11-30 00:06:33.894918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.356 [2024-11-30 00:06:33.894941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.614 #13 NEW cov: 11802 ft: 13278 corp: 5/54b lim: 25 exec/s: 0 rss: 68Mb L: 14/20 MS: 1 CrossOver- 00:08:08.614 [2024-11-30 00:06:33.935191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.614 [2024-11-30 00:06:33.935220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.614 [2024-11-30 00:06:33.935280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.614 [2024-11-30 00:06:33.935302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.614 [2024-11-30 00:06:33.935368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:08.614 [2024-11-30 00:06:33.935389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.614 [2024-11-30 00:06:33.935454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:08.614 [2024-11-30 00:06:33.935475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.614 #14 NEW cov: 11802 ft: 13371 corp: 6/77b lim: 25 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:08.614 [2024-11-30 00:06:33.975278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.614 [2024-11-30 00:06:33.975307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.614 [2024-11-30 00:06:33.975366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 
cid:1 nsid:0 00:08:08.614 [2024-11-30 00:06:33.975387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.614 [2024-11-30 00:06:33.975455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:08.614 [2024-11-30 00:06:33.975475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.614 [2024-11-30 00:06:33.975542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:08.614 [2024-11-30 00:06:33.975562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.614 #15 NEW cov: 11802 ft: 13449 corp: 7/100b lim: 25 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 ShuffleBytes- 00:08:08.614 [2024-11-30 00:06:34.015047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.614 [2024-11-30 00:06:34.015077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.614 #16 NEW cov: 11802 ft: 13574 corp: 8/108b lim: 25 exec/s: 0 rss: 68Mb L: 8/23 MS: 1 InsertRepeatedBytes- 00:08:08.614 [2024-11-30 00:06:34.055500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.614 [2024-11-30 00:06:34.055529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.615 [2024-11-30 00:06:34.055592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.615 [2024-11-30 00:06:34.055619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:08.615 [2024-11-30 00:06:34.055683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:08.615 [2024-11-30 00:06:34.055701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.615 [2024-11-30 00:06:34.055766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:08.615 [2024-11-30 00:06:34.055785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.615 #17 NEW cov: 11802 ft: 13626 corp: 9/130b lim: 25 exec/s: 0 rss: 68Mb L: 22/23 MS: 1 PersAutoDict- DE: "\377\223G\015\245\273\314\342"- 00:08:08.615 [2024-11-30 00:06:34.095539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.615 [2024-11-30 00:06:34.095567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.615 [2024-11-30 00:06:34.095637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.615 [2024-11-30 00:06:34.095660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.615 [2024-11-30 00:06:34.095726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:08.615 [2024-11-30 00:06:34.095748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.615 #18 NEW cov: 11802 ft: 13867 corp: 10/145b lim: 25 exec/s: 0 rss: 68Mb L: 15/23 MS: 1 InsertByte- 00:08:08.615 [2024-11-30 00:06:34.135749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.615 [2024-11-30 00:06:34.135777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.615 [2024-11-30 00:06:34.135837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.615 [2024-11-30 00:06:34.135859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.615 [2024-11-30 00:06:34.135923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:08.615 [2024-11-30 00:06:34.135944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.615 [2024-11-30 00:06:34.136008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:08.615 [2024-11-30 00:06:34.136027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.615 #19 NEW cov: 11802 ft: 13889 corp: 11/165b lim: 25 exec/s: 0 rss: 68Mb L: 20/23 MS: 1 EraseBytes- 00:08:08.873 [2024-11-30 00:06:34.175683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.873 [2024-11-30 00:06:34.175712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.873 [2024-11-30 00:06:34.175781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.873 [2024-11-30 00:06:34.175801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.873 #20 NEW cov: 11802 ft: 13918 corp: 12/179b lim: 25 exec/s: 0 rss: 
69Mb L: 14/23 MS: 1 CopyPart- 00:08:08.873 [2024-11-30 00:06:34.215787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.873 [2024-11-30 00:06:34.215815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.873 [2024-11-30 00:06:34.215884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.873 [2024-11-30 00:06:34.215907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.873 #21 NEW cov: 11802 ft: 14041 corp: 13/190b lim: 25 exec/s: 0 rss: 69Mb L: 11/23 MS: 1 InsertRepeatedBytes- 00:08:08.873 [2024-11-30 00:06:34.255771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.873 [2024-11-30 00:06:34.255800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.873 #22 NEW cov: 11802 ft: 14165 corp: 14/196b lim: 25 exec/s: 0 rss: 69Mb L: 6/23 MS: 1 EraseBytes- 00:08:08.873 [2024-11-30 00:06:34.295888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.873 [2024-11-30 00:06:34.295917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.873 #23 NEW cov: 11802 ft: 14336 corp: 15/202b lim: 25 exec/s: 0 rss: 69Mb L: 6/23 MS: 1 ChangeBinInt- 00:08:08.873 [2024-11-30 00:06:34.336042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.873 [2024-11-30 00:06:34.336071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.873 #24 NEW cov: 
11802 ft: 14444 corp: 16/208b lim: 25 exec/s: 0 rss: 69Mb L: 6/23 MS: 1 CMP- DE: "\000\000"- 00:08:08.873 [2024-11-30 00:06:34.376271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.873 [2024-11-30 00:06:34.376300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.873 [2024-11-30 00:06:34.376386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.873 [2024-11-30 00:06:34.376407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.873 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.873 #25 NEW cov: 11825 ft: 14473 corp: 17/219b lim: 25 exec/s: 0 rss: 69Mb L: 11/23 MS: 1 ShuffleBytes- 00:08:08.873 [2024-11-30 00:06:34.416499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.873 [2024-11-30 00:06:34.416528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.873 [2024-11-30 00:06:34.416593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.873 [2024-11-30 00:06:34.416619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.873 [2024-11-30 00:06:34.416686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:08.873 [2024-11-30 00:06:34.416707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.131 #26 NEW cov: 11825 ft: 14505 corp: 18/235b lim: 25 
exec/s: 0 rss: 69Mb L: 16/23 MS: 1 CrossOver- 00:08:09.131 [2024-11-30 00:06:34.456389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.132 [2024-11-30 00:06:34.456419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.132 #27 NEW cov: 11825 ft: 14543 corp: 19/243b lim: 25 exec/s: 0 rss: 69Mb L: 8/23 MS: 1 ShuffleBytes- 00:08:09.132 [2024-11-30 00:06:34.496861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.132 [2024-11-30 00:06:34.496890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.496945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.132 [2024-11-30 00:06:34.496966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.497029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.132 [2024-11-30 00:06:34.497047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.497112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.132 [2024-11-30 00:06:34.497134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.132 #28 NEW cov: 11825 ft: 14565 corp: 20/267b lim: 25 exec/s: 28 rss: 69Mb L: 24/24 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:09.132 [2024-11-30 00:06:34.536944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.132 [2024-11-30 00:06:34.536973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.537036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.132 [2024-11-30 00:06:34.537057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.537121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.132 [2024-11-30 00:06:34.537139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.537203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.132 [2024-11-30 00:06:34.537221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.132 #29 NEW cov: 11825 ft: 14583 corp: 21/287b lim: 25 exec/s: 29 rss: 69Mb L: 20/24 MS: 1 ChangeBit- 00:08:09.132 [2024-11-30 00:06:34.576706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.132 [2024-11-30 00:06:34.576733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.132 #30 NEW cov: 11825 ft: 14599 corp: 22/295b lim: 25 exec/s: 30 rss: 69Mb L: 8/24 MS: 1 ChangeBinInt- 00:08:09.132 [2024-11-30 00:06:34.616956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.132 [2024-11-30 00:06:34.616984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.617051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.132 [2024-11-30 00:06:34.617072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.132 #32 NEW cov: 11825 ft: 14625 corp: 23/305b lim: 25 exec/s: 32 rss: 69Mb L: 10/24 MS: 2 InsertByte-CMP- DE: "\377\377\377\377\377\377\002\325"- 00:08:09.132 [2024-11-30 00:06:34.647214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.132 [2024-11-30 00:06:34.647242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.647302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.132 [2024-11-30 00:06:34.647324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.647390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.132 [2024-11-30 00:06:34.647412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.132 [2024-11-30 00:06:34.647477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.132 [2024-11-30 00:06:34.647496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.132 #33 NEW cov: 11825 ft: 14665 corp: 24/328b lim: 25 exec/s: 33 rss: 69Mb L: 23/24 MS: 1 ChangeBinInt- 00:08:09.132 [2024-11-30 00:06:34.687031] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.132 [2024-11-30 00:06:34.687062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.391 #34 NEW cov: 11825 ft: 14670 corp: 25/336b lim: 25 exec/s: 34 rss: 70Mb L: 8/24 MS: 1 ChangeBit- 00:08:09.391 [2024-11-30 00:06:34.717444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.391 [2024-11-30 00:06:34.717472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.391 [2024-11-30 00:06:34.717534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.391 [2024-11-30 00:06:34.717555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.391 [2024-11-30 00:06:34.717625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.391 [2024-11-30 00:06:34.717645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.391 [2024-11-30 00:06:34.717711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.391 [2024-11-30 00:06:34.717729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.391 #35 NEW cov: 11825 ft: 14686 corp: 26/356b lim: 25 exec/s: 35 rss: 70Mb L: 20/24 MS: 1 CrossOver- 00:08:09.391 [2024-11-30 00:06:34.757430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.391 [2024-11-30 00:06:34.757458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.391 [2024-11-30 00:06:34.757523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.391 [2024-11-30 00:06:34.757545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.391 [2024-11-30 00:06:34.757617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.391 [2024-11-30 00:06:34.757637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.391 #36 NEW cov: 11825 ft: 14704 corp: 27/373b lim: 25 exec/s: 36 rss: 70Mb L: 17/24 MS: 1 CrossOver- 00:08:09.391 [2024-11-30 00:06:34.797330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.391 [2024-11-30 00:06:34.797358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.391 #37 NEW cov: 11825 ft: 14733 corp: 28/378b lim: 25 exec/s: 37 rss: 70Mb L: 5/24 MS: 1 EraseBytes- 00:08:09.391 [2024-11-30 00:06:34.837463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.391 [2024-11-30 00:06:34.837491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.391 #38 NEW cov: 11825 ft: 14735 corp: 29/387b lim: 25 exec/s: 38 rss: 70Mb L: 9/24 MS: 1 ChangeBit- 00:08:09.391 [2024-11-30 00:06:34.877656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.391 [2024-11-30 00:06:34.877683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:09.391 [2024-11-30 00:06:34.877748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.391 [2024-11-30 00:06:34.877772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.391 #42 NEW cov: 11825 ft: 14813 corp: 30/400b lim: 25 exec/s: 42 rss: 70Mb L: 13/24 MS: 4 EraseBytes-PersAutoDict-ChangeBinInt-InsertRepeatedBytes- DE: "\001\000\000\000"- 00:08:09.391 [2024-11-30 00:06:34.917887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.391 [2024-11-30 00:06:34.917916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.391 [2024-11-30 00:06:34.917980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.391 [2024-11-30 00:06:34.918001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.391 [2024-11-30 00:06:34.918065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.391 [2024-11-30 00:06:34.918088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.391 #43 NEW cov: 11825 ft: 14836 corp: 31/417b lim: 25 exec/s: 43 rss: 70Mb L: 17/24 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:09.661 [2024-11-30 00:06:34.958166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.661 [2024-11-30 00:06:34.958195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:34.958254] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.661 [2024-11-30 00:06:34.958275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:34.958340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.661 [2024-11-30 00:06:34.958359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:34.958424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.661 [2024-11-30 00:06:34.958442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.661 #44 NEW cov: 11825 ft: 14854 corp: 32/441b lim: 25 exec/s: 44 rss: 70Mb L: 24/24 MS: 1 InsertByte- 00:08:09.661 [2024-11-30 00:06:34.998271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.661 [2024-11-30 00:06:34.998300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:34.998357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.661 [2024-11-30 00:06:34.998378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:34.998442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.661 [2024-11-30 00:06:34.998461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.661 
[2024-11-30 00:06:34.998526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.661 [2024-11-30 00:06:34.998544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.661 #45 NEW cov: 11825 ft: 14913 corp: 33/465b lim: 25 exec/s: 45 rss: 70Mb L: 24/24 MS: 1 ShuffleBytes- 00:08:09.661 [2024-11-30 00:06:35.038054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.661 [2024-11-30 00:06:35.038082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.661 #46 NEW cov: 11825 ft: 14917 corp: 34/471b lim: 25 exec/s: 46 rss: 70Mb L: 6/24 MS: 1 ShuffleBytes- 00:08:09.661 [2024-11-30 00:06:35.078365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.661 [2024-11-30 00:06:35.078395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:35.078460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.661 [2024-11-30 00:06:35.078480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:35.078547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.661 [2024-11-30 00:06:35.078566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.661 #47 NEW cov: 11825 ft: 14925 corp: 35/490b lim: 25 exec/s: 47 rss: 70Mb L: 19/24 MS: 1 InsertRepeatedBytes- 00:08:09.661 [2024-11-30 00:06:35.118602] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.661 [2024-11-30 00:06:35.118630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:35.118682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.661 [2024-11-30 00:06:35.118704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:35.118769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.661 [2024-11-30 00:06:35.118788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:35.118852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.661 [2024-11-30 00:06:35.118871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.661 #48 NEW cov: 11825 ft: 14932 corp: 36/513b lim: 25 exec/s: 48 rss: 70Mb L: 23/24 MS: 1 CrossOver- 00:08:09.661 [2024-11-30 00:06:35.158329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.661 [2024-11-30 00:06:35.158357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.661 #49 NEW cov: 11825 ft: 14980 corp: 37/519b lim: 25 exec/s: 49 rss: 70Mb L: 6/24 MS: 1 ShuffleBytes- 00:08:09.661 [2024-11-30 00:06:35.198552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.661 [2024-11-30 00:06:35.198581] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.661 [2024-11-30 00:06:35.198657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.661 [2024-11-30 00:06:35.198680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.921 #50 NEW cov: 11825 ft: 14985 corp: 38/529b lim: 25 exec/s: 50 rss: 70Mb L: 10/24 MS: 1 CMP- DE: "\213\000\000\000"- 00:08:09.921 [2024-11-30 00:06:35.238908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.921 [2024-11-30 00:06:35.238936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.238996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.921 [2024-11-30 00:06:35.239017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.239081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.921 [2024-11-30 00:06:35.239100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.239163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.921 [2024-11-30 00:06:35.239181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.921 #51 NEW cov: 11825 ft: 15001 corp: 39/551b lim: 25 exec/s: 51 rss: 70Mb L: 22/24 MS: 1 ChangeBit- 00:08:09.921 [2024-11-30 00:06:35.278652] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.921 [2024-11-30 00:06:35.278680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.921 #52 NEW cov: 11825 ft: 15007 corp: 40/557b lim: 25 exec/s: 52 rss: 70Mb L: 6/24 MS: 1 ChangeByte- 00:08:09.921 [2024-11-30 00:06:35.309068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.921 [2024-11-30 00:06:35.309096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.309156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.921 [2024-11-30 00:06:35.309176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.309241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.921 [2024-11-30 00:06:35.309260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.309324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.921 [2024-11-30 00:06:35.309342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.921 #53 NEW cov: 11825 ft: 15012 corp: 41/580b lim: 25 exec/s: 53 rss: 70Mb L: 23/24 MS: 1 ShuffleBytes- 00:08:09.921 [2024-11-30 00:06:35.348850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.921 [2024-11-30 00:06:35.348878] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.921 #54 NEW cov: 11825 ft: 15041 corp: 42/586b lim: 25 exec/s: 54 rss: 70Mb L: 6/24 MS: 1 ChangeByte- 00:08:09.921 [2024-11-30 00:06:35.389378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.921 [2024-11-30 00:06:35.389406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.389469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.921 [2024-11-30 00:06:35.389490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.389554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:09.921 [2024-11-30 00:06:35.389573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.389643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:09.921 [2024-11-30 00:06:35.389663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.921 #55 NEW cov: 11825 ft: 15081 corp: 43/609b lim: 25 exec/s: 55 rss: 70Mb L: 23/24 MS: 1 ChangeBinInt- 00:08:09.921 [2024-11-30 00:06:35.429132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.921 [2024-11-30 00:06:35.429160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.921 #56 NEW cov: 11825 ft: 15094 corp: 44/616b lim: 25 exec/s: 56 rss: 70Mb 
L: 7/24 MS: 1 InsertByte- 00:08:09.921 [2024-11-30 00:06:35.469328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:09.921 [2024-11-30 00:06:35.469356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.921 [2024-11-30 00:06:35.469426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:09.921 [2024-11-30 00:06:35.469449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.180 #57 NEW cov: 11825 ft: 15105 corp: 45/627b lim: 25 exec/s: 57 rss: 70Mb L: 11/24 MS: 1 CrossOver- 00:08:10.180 [2024-11-30 00:06:35.509722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.180 [2024-11-30 00:06:35.509751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.180 [2024-11-30 00:06:35.509813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:10.180 [2024-11-30 00:06:35.509834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.180 [2024-11-30 00:06:35.509899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:10.180 [2024-11-30 00:06:35.509918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.180 [2024-11-30 00:06:35.509981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:10.180 [2024-11-30 00:06:35.510000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.180 #58 NEW cov: 11825 ft: 15116 corp: 46/648b lim: 25 exec/s: 29 rss: 70Mb L: 21/24 MS: 1 InsertRepeatedBytes- 00:08:10.180 #58 DONE cov: 11825 ft: 15116 corp: 46/648b lim: 25 exec/s: 29 rss: 70Mb 00:08:10.180 ###### Recommended dictionary. ###### 00:08:10.180 "\377\223G\015\245\273\314\342" # Uses: 2 00:08:10.180 "\000\000" # Uses: 0 00:08:10.180 "\001\000\000\000" # Uses: 2 00:08:10.180 "\377\377\377\377\377\377\002\325" # Uses: 0 00:08:10.180 "\213\000\000\000" # Uses: 0 00:08:10.180 ###### End of recommended dictionary. ###### 00:08:10.180 Done 58 runs in 2 second(s) 00:08:10.180 00:06:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:10.180 00:06:35 -- ../common.sh@72 -- # (( i++ )) 00:08:10.180 00:06:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.180 00:06:35 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:10.180 00:06:35 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:10.180 00:06:35 -- nvmf/run.sh@24 -- # local timen=1 00:08:10.180 00:06:35 -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.180 00:06:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:10.180 00:06:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:10.180 00:06:35 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:10.180 00:06:35 -- nvmf/run.sh@29 -- # port=4424 00:08:10.180 00:06:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:10.181 00:06:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:10.181 00:06:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.181 00:06:35 -- nvmf/run.sh@36 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:10.181 [2024-11-30 00:06:35.700988] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:10.181 [2024-11-30 00:06:35.701065] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2731985 ] 00:08:10.181 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.439 [2024-11-30 00:06:35.878334] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.439 [2024-11-30 00:06:35.946658] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.439 [2024-11-30 00:06:35.946784] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.698 [2024-11-30 00:06:36.005163] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.698 [2024-11-30 00:06:36.021557] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:10.698 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:10.698 INFO: Seed: 2607544057
00:08:10.698 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:08:10.698 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:08:10.698 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:10.698 INFO: A corpus is not provided, starting from an empty corpus
00:08:10.698 #2 INITED exec/s: 0 rss: 60Mb
00:08:10.698 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:10.698 This may also happen if the target rejected all inputs we tried so far
00:08:10.698 [2024-11-30 00:06:36.070374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.698 [2024-11-30 00:06:36.070407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:10.698 [2024-11-30 00:06:36.070475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.698 [2024-11-30 00:06:36.070498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:10.698 [2024-11-30 00:06:36.070562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.698 [2024-11-30 00:06:36.070584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:10.698 [2024-11-30 00:06:36.070654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.698 [2024-11-30 00:06:36.070674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:10.959 NEW_FUNC[1/672]: 0x466208 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685
00:08:10.959 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:10.959 #9 NEW cov: 11666 ft: 11667 corp: 2/83b lim: 100 exec/s: 0 rss: 68Mb L: 82/82 MS: 2 ChangeByte-InsertRepeatedBytes-
00:08:10.959 [2024-11-30 00:06:36.391094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.391129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.391197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.391219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.391283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.391308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.391371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.391389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:10.959 #10 NEW cov: 11783 ft: 12122 corp: 3/165b lim: 100 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 ChangeBinInt-
00:08:10.959 [2024-11-30 00:06:36.441192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.441221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.441286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.441307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.441370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.441389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.441450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18443635707093188607 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.441468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:10.959 #11 NEW cov: 11789 ft: 12327 corp: 4/255b lim: 100 exec/s: 0 rss: 68Mb L: 90/90 MS: 1 InsertRepeatedBytes-
00:08:10.959 [2024-11-30 00:06:36.481261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.481290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.481347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.481368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.481434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.481454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:10.959 [2024-11-30 00:06:36.481517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:10.959 [2024-11-30 00:06:36.481536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:10.959 #12 NEW cov: 11874 ft: 12722 corp: 5/345b lim: 100 exec/s: 0 rss: 68Mb L: 90/90 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"-
00:08:11.219 [2024-11-30 00:06:36.521398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.521428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.521484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.521510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.521576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.521594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.521666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.521686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.219 #13 NEW cov: 11874 ft: 12813 corp: 6/444b lim: 100 exec/s: 0 rss: 68Mb L: 99/99 MS: 1 InsertRepeatedBytes-
00:08:11.219 [2024-11-30 00:06:36.561520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.561549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.561613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.561635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.561693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.561715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.561780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.561799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.219 #14 NEW cov: 11874 ft: 12915 corp: 7/534b lim: 100 exec/s: 0 rss: 68Mb L: 90/99 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"-
00:08:11.219 [2024-11-30 00:06:36.601620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.601649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.601706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.601727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.601792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.601809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.601873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.601892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.219 #15 NEW cov: 11874 ft: 12961 corp: 8/624b lim: 100 exec/s: 0 rss: 68Mb L: 90/99 MS: 1 ChangeByte-
00:08:11.219 [2024-11-30 00:06:36.641572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.641604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.641684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.641707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.641773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.641795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.219 #16 NEW cov: 11874 ft: 13330 corp: 9/703b lim: 100 exec/s: 0 rss: 68Mb L: 79/99 MS: 1 EraseBytes-
00:08:11.219 [2024-11-30 00:06:36.681839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.681867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.681929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.681952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.682015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.682033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.682088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.682107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.219 #17 NEW cov: 11874 ft: 13370 corp: 10/785b lim: 100 exec/s: 0 rss: 68Mb L: 82/99 MS: 1 CopyPart-
00:08:11.219 [2024-11-30 00:06:36.721960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.721988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.722048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.722069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.722131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.722152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.219 [2024-11-30 00:06:36.722217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.219 [2024-11-30 00:06:36.722238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.219 #18 NEW cov: 11874 ft: 13434 corp: 11/876b lim: 100 exec/s: 0 rss: 68Mb L: 91/99 MS: 1 CrossOver-
00:08:11.220 [2024-11-30 00:06:36.762085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.220 [2024-11-30 00:06:36.762114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.220 [2024-11-30 00:06:36.762175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.220 [2024-11-30 00:06:36.762196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.220 [2024-11-30 00:06:36.762258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.220 [2024-11-30 00:06:36.762277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.220 [2024-11-30 00:06:36.762339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.220 [2024-11-30 00:06:36.762357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.478 #19 NEW cov: 11874 ft: 13510 corp: 12/966b lim: 100 exec/s: 0 rss: 68Mb L: 90/99 MS: 1 ChangeBit-
00:08:11.478 [2024-11-30 00:06:36.802184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.802212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.802273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.802294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.802357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.802376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.802439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.802458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.478 #20 NEW cov: 11874 ft: 13535 corp: 13/1049b lim: 100 exec/s: 0 rss: 68Mb L: 83/99 MS: 1 InsertByte-
00:08:11.478 [2024-11-30 00:06:36.842439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.842468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.842528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.842551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.842619] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.842641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.842709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.842730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.842793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073524278527 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.842810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:11.478 #21 NEW cov: 11874 ft: 13593 corp: 14/1149b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 CopyPart-
00:08:11.478 [2024-11-30 00:06:36.882428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.882457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.882521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.882543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.882610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.882629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.882693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.882711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.478 #22 NEW cov: 11874 ft: 13690 corp: 15/1232b lim: 100 exec/s: 0 rss: 68Mb L: 83/100 MS: 1 CopyPart-
00:08:11.478 [2024-11-30 00:06:36.922519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.922548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.922614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:720575940379279104 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.922637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.922716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.922739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.478 [2024-11-30 00:06:36.922805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.478 [2024-11-30 00:06:36.922825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.479 #23 NEW cov: 11874 ft: 13711 corp: 16/1314b lim: 100 exec/s: 0 rss: 69Mb L: 82/100 MS: 1 ChangeBinInt-
00:08:11.479 [2024-11-30 00:06:36.962698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.479 [2024-11-30 00:06:36.962726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.479 [2024-11-30 00:06:36.962808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10664522830986608639 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.479 [2024-11-30 00:06:36.962832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.479 [2024-11-30 00:06:36.962895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.479 [2024-11-30 00:06:36.962914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.479 [2024-11-30 00:06:36.962975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.479 [2024-11-30 00:06:36.962994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.479 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:11.479 #24 NEW cov: 11897 ft: 13747 corp: 17/1413b lim: 100 exec/s: 0 rss: 69Mb L: 99/100 MS: 1 CrossOver-
00:08:11.479 [2024-11-30 00:06:37.012504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.479 [2024-11-30 00:06:37.012532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.479 [2024-11-30 00:06:37.012602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.479 [2024-11-30 00:06:37.012624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.738 #25 NEW cov: 11897 ft: 14148 corp: 18/1467b lim: 100 exec/s: 0 rss: 69Mb L: 54/100 MS: 1 EraseBytes-
00:08:11.738 [2024-11-30 00:06:37.052955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.052984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.053045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8863084066665136127 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.053066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.053131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.053153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.053216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.053235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.738 #26 NEW cov: 11897 ft: 14233 corp: 19/1558b lim: 100 exec/s: 26 rss: 69Mb L: 91/100 MS: 1 InsertByte-
00:08:11.738 [2024-11-30 00:06:37.092946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.092975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.093040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.093065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.093128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.093149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.738 #27 NEW cov: 11897 ft: 14238 corp: 20/1628b lim: 100 exec/s: 27 rss: 69Mb L: 70/100 MS: 1 EraseBytes-
00:08:11.738 [2024-11-30 00:06:37.133216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.133245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.133308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.133330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.133394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.133413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.133476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.133495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.738 #28 NEW cov: 11897 ft: 14242 corp: 21/1726b lim: 100 exec/s: 28 rss: 69Mb L: 98/100 MS: 1 CopyPart-
00:08:11.738 [2024-11-30 00:06:37.173338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.173367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.738 [2024-11-30 00:06:37.173429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8863084066665136127 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.738 [2024-11-30 00:06:37.173452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.739 [2024-11-30 00:06:37.173518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.739 [2024-11-30 00:06:37.173543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.739 [2024-11-30 00:06:37.173611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.739 [2024-11-30 00:06:37.173630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.739 #29 NEW cov: 11897 ft: 14268 corp: 22/1818b lim: 100 exec/s: 29 rss: 69Mb L: 92/100 MS: 1 InsertByte-
00:08:11.739 [2024-11-30 00:06:37.213454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.739 [2024-11-30 00:06:37.213484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.739 [2024-11-30 00:06:37.213558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8863084066665136127 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.739 [2024-11-30 00:06:37.213581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.739 [2024-11-30 00:06:37.213650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.739 [2024-11-30 00:06:37.213673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.739 [2024-11-30 00:06:37.213748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.739 [2024-11-30 00:06:37.213767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.739 #30 NEW cov: 11897 ft: 14273 corp: 23/1911b lim: 100 exec/s: 30 rss: 69Mb L: 93/100 MS: 1 InsertByte-
00:08:11.739 [2024-11-30 00:06:37.253116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.739 [2024-11-30 00:06:37.253145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.739 #34 NEW cov: 11897 ft: 15136 corp: 24/1937b lim: 100 exec/s: 34 rss: 69Mb L: 26/100 MS: 4 CrossOver-ChangeBit-CrossOver-CopyPart-
00:08:11.998 [2024-11-30 00:06:37.303692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.303722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.303785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.303808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.303870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.303888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.303951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.303969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.998 #35 NEW cov: 11897 ft: 15189 corp: 25/2028b lim: 100 exec/s: 35 rss: 69Mb L: 91/100 MS: 1 InsertRepeatedBytes-
00:08:11.998 [2024-11-30 00:06:37.343810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.343840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.343906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.343927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.343992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.344010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.344076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.344096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.998 #36 NEW cov: 11897 ft: 15210 corp: 26/2111b lim: 100 exec/s: 36 rss: 69Mb L: 83/100 MS: 1 ChangeBit-
00:08:11.998 [2024-11-30 00:06:37.383937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.383967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.384030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.384051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.384115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.384134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.384198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.384218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:11.998 #37 NEW cov: 11897 ft: 15278 corp: 27/2194b lim: 100 exec/s: 37 rss: 69Mb L: 83/100 MS: 1 InsertByte-
00:08:11.998 [2024-11-30 00:06:37.423890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.423919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.424000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:11.998 [2024-11-30 00:06:37.424023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:11.998 [2024-11-30 00:06:37.424089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE
sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.998 [2024-11-30 00:06:37.424112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.998 #38 NEW cov: 11897 ft: 15310 corp: 28/2263b lim: 100 exec/s: 38 rss: 69Mb L: 69/100 MS: 1 EraseBytes- 00:08:11.998 [2024-11-30 00:06:37.464132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.998 [2024-11-30 00:06:37.464162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.998 [2024-11-30 00:06:37.464224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10664522830986608639 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.998 [2024-11-30 00:06:37.464245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.998 [2024-11-30 00:06:37.464309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.998 [2024-11-30 00:06:37.464336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.998 [2024-11-30 00:06:37.464399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.998 [2024-11-30 00:06:37.464416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.998 #39 NEW cov: 11897 ft: 15318 corp: 29/2362b lim: 100 exec/s: 39 rss: 70Mb L: 99/100 MS: 1 
ChangeBinInt- 00:08:11.998 [2024-11-30 00:06:37.504245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.998 [2024-11-30 00:06:37.504274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.998 [2024-11-30 00:06:37.504339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.998 [2024-11-30 00:06:37.504360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.998 [2024-11-30 00:06:37.504423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744069431361535 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.999 [2024-11-30 00:06:37.504442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.999 [2024-11-30 00:06:37.504505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.999 [2024-11-30 00:06:37.504524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.999 #40 NEW cov: 11897 ft: 15355 corp: 30/2444b lim: 100 exec/s: 40 rss: 70Mb L: 82/100 MS: 1 CMP- DE: "\006\000"- 00:08:11.999 [2024-11-30 00:06:37.544361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.999 [2024-11-30 00:06:37.544391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:11.999 [2024-11-30 00:06:37.544450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.999 [2024-11-30 00:06:37.544471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.999 [2024-11-30 00:06:37.544535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.999 [2024-11-30 00:06:37.544553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.999 [2024-11-30 00:06:37.544623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.999 [2024-11-30 00:06:37.544646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.258 #41 NEW cov: 11897 ft: 15375 corp: 31/2526b lim: 100 exec/s: 41 rss: 70Mb L: 82/100 MS: 1 EraseBytes- 00:08:12.258 [2024-11-30 00:06:37.584507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.584537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.584603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:432627039204278271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.584628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.584694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.584714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.584777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.584796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.258 #42 NEW cov: 11897 ft: 15393 corp: 32/2609b lim: 100 exec/s: 42 rss: 70Mb L: 83/100 MS: 1 PersAutoDict- DE: "\006\000"- 00:08:12.258 [2024-11-30 00:06:37.624666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.624695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.624758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:432627039204278271 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.624780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.624841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.624860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.624922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.624941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.258 #43 NEW cov: 11897 ft: 15419 corp: 33/2695b lim: 100 exec/s: 43 rss: 70Mb L: 86/100 MS: 1 CrossOver- 00:08:12.258 [2024-11-30 00:06:37.664739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.664767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.664825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65024 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.664845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.664909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.664927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.664991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.665009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 
cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.258 #44 NEW cov: 11897 ft: 15433 corp: 34/2778b lim: 100 exec/s: 44 rss: 70Mb L: 83/100 MS: 1 ChangeBit- 00:08:12.258 [2024-11-30 00:06:37.704583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.704619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.704688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446741874686296063 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.704710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.258 #50 NEW cov: 11897 ft: 15445 corp: 35/2829b lim: 100 exec/s: 50 rss: 70Mb L: 51/100 MS: 1 EraseBytes- 00:08:12.258 [2024-11-30 00:06:37.744529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.744558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.258 #51 NEW cov: 11897 ft: 15516 corp: 36/2856b lim: 100 exec/s: 51 rss: 70Mb L: 27/100 MS: 1 CopyPart- 00:08:12.258 [2024-11-30 00:06:37.784774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.784803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.258 [2024-11-30 00:06:37.784871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 
nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.258 [2024-11-30 00:06:37.784893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.258 #52 NEW cov: 11897 ft: 15521 corp: 37/2904b lim: 100 exec/s: 52 rss: 70Mb L: 48/100 MS: 1 EraseBytes- 00:08:12.519 [2024-11-30 00:06:37.825172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.825200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.519 [2024-11-30 00:06:37.825259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:216172782113783807 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.825279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.519 [2024-11-30 00:06:37.825344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.825364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.519 [2024-11-30 00:06:37.825426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.825445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.519 #53 NEW cov: 11897 ft: 15568 corp: 38/3001b lim: 100 exec/s: 53 rss: 70Mb L: 97/100 MS: 1 CopyPart- 00:08:12.519 
[2024-11-30 00:06:37.865300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.865328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.519 [2024-11-30 00:06:37.865390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.865415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.519 [2024-11-30 00:06:37.865478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.865497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.519 [2024-11-30 00:06:37.865560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551420 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.865579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.519 #54 NEW cov: 11897 ft: 15582 corp: 39/3100b lim: 100 exec/s: 54 rss: 70Mb L: 99/100 MS: 1 InsertByte- 00:08:12.519 [2024-11-30 00:06:37.905443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.905471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.519 
[2024-11-30 00:06:37.905530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8863084066665136127 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.519 [2024-11-30 00:06:37.905550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:37.905617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.905636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:37.905699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.905719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.520 #55 NEW cov: 11897 ft: 15594 corp: 40/3191b lim: 100 exec/s: 55 rss: 70Mb L: 91/100 MS: 1 ShuffleBytes- 00:08:12.520 [2024-11-30 00:06:37.945563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:8225 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.945590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:37.945649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.945670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.520 
[2024-11-30 00:06:37.945734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.945754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:37.945816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.945836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.520 #56 NEW cov: 11897 ft: 15631 corp: 41/3282b lim: 100 exec/s: 56 rss: 70Mb L: 91/100 MS: 1 InsertRepeatedBytes- 00:08:12.520 [2024-11-30 00:06:37.985672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.985703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:37.985784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10664522830986608639 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.985807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:37.985874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.985897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:12.520 [2024-11-30 00:06:37.985963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:37.985982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.520 #57 NEW cov: 11897 ft: 15644 corp: 42/3381b lim: 100 exec/s: 57 rss: 70Mb L: 99/100 MS: 1 CrossOver- 00:08:12.520 [2024-11-30 00:06:38.025955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:38.025983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:38.026040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:38.026061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:38.026126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:38.026147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.520 [2024-11-30 00:06:38.026211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:38.026229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:12.520 [2024-11-30 00:06:38.026291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073524278527 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.520 [2024-11-30 00:06:38.026309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.520 #58 NEW cov: 11897 ft: 15649 corp: 43/3481b lim: 100 exec/s: 29 rss: 70Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:12.520 #58 DONE cov: 11897 ft: 15649 corp: 43/3481b lim: 100 exec/s: 29 rss: 70Mb 00:08:12.520 ###### Recommended dictionary. ###### 00:08:12.520 "\377\377\377\377\377\377\377\377" # Uses: 1 00:08:12.520 "\006\000" # Uses: 1 00:08:12.520 ###### End of recommended dictionary. ###### 00:08:12.520 Done 58 runs in 2 second(s) 00:08:12.780 00:06:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:12.780 00:06:38 -- ../common.sh@72 -- # (( i++ )) 00:08:12.780 00:06:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.780 00:06:38 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:12.780 00:08:12.780 real 1m5.468s 00:08:12.780 user 1m40.779s 00:08:12.780 sys 0m8.264s 00:08:12.780 00:06:38 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:12.780 00:06:38 -- common/autotest_common.sh@10 -- # set +x 00:08:12.780 ************************************ 00:08:12.780 END TEST nvmf_fuzz 00:08:12.780 ************************************ 00:08:12.780 00:06:38 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:12.780 00:06:38 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:12.780 00:06:38 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:12.780 00:06:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:12.780 00:06:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:12.780 00:06:38 -- common/autotest_common.sh@10 -- # set +x 00:08:12.780 
************************************ 00:08:12.780 START TEST vfio_fuzz 00:08:12.780 ************************************ 00:08:12.780 00:06:38 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:12.780 * Looking for test storage... 00:08:12.780 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:12.780 00:06:38 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:12.780 00:06:38 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:12.780 00:06:38 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:13.040 00:06:38 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:13.040 00:06:38 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:13.040 00:06:38 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:13.040 00:06:38 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:13.040 00:06:38 -- scripts/common.sh@335 -- # IFS=.-: 00:08:13.040 00:06:38 -- scripts/common.sh@335 -- # read -ra ver1 00:08:13.040 00:06:38 -- scripts/common.sh@336 -- # IFS=.-: 00:08:13.040 00:06:38 -- scripts/common.sh@336 -- # read -ra ver2 00:08:13.040 00:06:38 -- scripts/common.sh@337 -- # local 'op=<' 00:08:13.040 00:06:38 -- scripts/common.sh@339 -- # ver1_l=2 00:08:13.040 00:06:38 -- scripts/common.sh@340 -- # ver2_l=1 00:08:13.040 00:06:38 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:13.040 00:06:38 -- scripts/common.sh@343 -- # case "$op" in 00:08:13.040 00:06:38 -- scripts/common.sh@344 -- # : 1 00:08:13.040 00:06:38 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:13.040 00:06:38 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:13.040 00:06:38 -- scripts/common.sh@364 -- # decimal 1 00:08:13.040 00:06:38 -- scripts/common.sh@352 -- # local d=1 00:08:13.040 00:06:38 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:13.040 00:06:38 -- scripts/common.sh@354 -- # echo 1 00:08:13.040 00:06:38 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:13.040 00:06:38 -- scripts/common.sh@365 -- # decimal 2 00:08:13.040 00:06:38 -- scripts/common.sh@352 -- # local d=2 00:08:13.040 00:06:38 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:13.040 00:06:38 -- scripts/common.sh@354 -- # echo 2 00:08:13.040 00:06:38 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:13.040 00:06:38 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:13.040 00:06:38 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:13.040 00:06:38 -- scripts/common.sh@367 -- # return 0 00:08:13.040 00:06:38 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:13.040 00:06:38 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:13.040 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.040 --rc genhtml_branch_coverage=1 00:08:13.040 --rc genhtml_function_coverage=1 00:08:13.040 --rc genhtml_legend=1 00:08:13.040 --rc geninfo_all_blocks=1 00:08:13.040 --rc geninfo_unexecuted_blocks=1 00:08:13.040 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:13.040 ' 00:08:13.040 00:06:38 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:13.040 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.040 --rc genhtml_branch_coverage=1 00:08:13.040 --rc genhtml_function_coverage=1 00:08:13.040 --rc genhtml_legend=1 00:08:13.040 --rc geninfo_all_blocks=1 00:08:13.040 --rc geninfo_unexecuted_blocks=1 00:08:13.040 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:13.040 ' 00:08:13.040 00:06:38 -- 
common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:13.040 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.040 --rc genhtml_branch_coverage=1 00:08:13.040 --rc genhtml_function_coverage=1 00:08:13.040 --rc genhtml_legend=1 00:08:13.040 --rc geninfo_all_blocks=1 00:08:13.040 --rc geninfo_unexecuted_blocks=1 00:08:13.040 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:13.040 ' 00:08:13.040 00:06:38 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:13.040 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.040 --rc genhtml_branch_coverage=1 00:08:13.040 --rc genhtml_function_coverage=1 00:08:13.040 --rc genhtml_legend=1 00:08:13.040 --rc geninfo_all_blocks=1 00:08:13.040 --rc geninfo_unexecuted_blocks=1 00:08:13.040 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:13.040 ' 00:08:13.040 00:06:38 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:13.040 00:06:38 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:13.040 00:06:38 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:13.040 00:06:38 -- common/autotest_common.sh@34 -- # set -e 00:08:13.040 00:06:38 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:13.040 00:06:38 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:13.040 00:06:38 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:13.040 00:06:38 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:13.040 00:06:38 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:13.040 00:06:38 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:13.040 00:06:38 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 
00:08:13.040 00:06:38 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:13.040 00:06:38 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:13.040 00:06:38 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:13.040 00:06:38 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:13.040 00:06:38 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:13.040 00:06:38 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:13.040 00:06:38 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:13.040 00:06:38 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:13.040 00:06:38 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:13.040 00:06:38 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:13.040 00:06:38 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:13.040 00:06:38 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:13.040 00:06:38 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:13.040 00:06:38 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:13.040 00:06:38 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:13.040 00:06:38 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:13.040 00:06:38 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:13.040 00:06:38 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:13.040 00:06:38 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:13.040 00:06:38 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:13.040 00:06:38 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:13.040 00:06:38 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:13.040 00:06:38 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:13.040 00:06:38 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:13.040 00:06:38 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:13.040 00:06:38 -- common/build_config.sh@29 -- 
# CONFIG_ISAL_CRYPTO=y 00:08:13.040 00:06:38 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:13.040 00:06:38 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:13.040 00:06:38 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:13.040 00:06:38 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:13.040 00:06:38 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:13.040 00:06:38 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:13.040 00:06:38 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:13.040 00:06:38 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:13.040 00:06:38 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:13.040 00:06:38 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:13.040 00:06:38 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:13.040 00:06:38 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:13.040 00:06:38 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:13.040 00:06:38 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:13.040 00:06:38 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:13.040 00:06:38 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:13.040 00:06:38 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:13.040 00:06:38 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:13.040 00:06:38 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:13.040 00:06:38 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:13.040 00:06:38 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:13.040 00:06:38 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:13.040 00:06:38 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:13.040 00:06:38 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:13.040 00:06:38 -- common/build_config.sh@54 -- 
# CONFIG_WERROR=y 00:08:13.040 00:06:38 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:13.040 00:06:38 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:13.040 00:06:38 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:13.040 00:06:38 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:13.040 00:06:38 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:13.040 00:06:38 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:13.041 00:06:38 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:08:13.041 00:06:38 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:13.041 00:06:38 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:13.041 00:06:38 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:13.041 00:06:38 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:13.041 00:06:38 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:13.041 00:06:38 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:13.041 00:06:38 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:13.041 00:06:38 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:13.041 00:06:38 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:13.041 00:06:38 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:13.041 00:06:38 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:13.041 00:06:38 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:13.041 00:06:38 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:13.041 00:06:38 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:13.041 00:06:38 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:13.041 00:06:38 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:13.041 00:06:38 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:13.041 00:06:38 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:13.041 00:06:38 -- common/autotest_common.sh@48 -- # source 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:13.041 00:06:38 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:13.041 00:06:38 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:13.041 00:06:38 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:13.041 00:06:38 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:13.041 00:06:38 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:13.041 00:06:38 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:13.041 00:06:38 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:13.041 00:06:38 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:13.041 00:06:38 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:13.041 00:06:38 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:13.041 00:06:38 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:13.041 00:06:38 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:13.041 00:06:38 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:13.041 00:06:38 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:13.041 00:06:38 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:13.041 #define SPDK_CONFIG_H 00:08:13.041 #define SPDK_CONFIG_APPS 1 00:08:13.041 #define SPDK_CONFIG_ARCH native 00:08:13.041 #undef SPDK_CONFIG_ASAN 00:08:13.041 #undef SPDK_CONFIG_AVAHI 00:08:13.041 #undef SPDK_CONFIG_CET 
00:08:13.041 #define SPDK_CONFIG_COVERAGE 1 00:08:13.041 #define SPDK_CONFIG_CROSS_PREFIX 00:08:13.041 #undef SPDK_CONFIG_CRYPTO 00:08:13.041 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:13.041 #undef SPDK_CONFIG_CUSTOMOCF 00:08:13.041 #undef SPDK_CONFIG_DAOS 00:08:13.041 #define SPDK_CONFIG_DAOS_DIR 00:08:13.041 #define SPDK_CONFIG_DEBUG 1 00:08:13.041 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:13.041 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:13.041 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:13.041 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:13.041 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:13.041 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:13.041 #define SPDK_CONFIG_EXAMPLES 1 00:08:13.041 #undef SPDK_CONFIG_FC 00:08:13.041 #define SPDK_CONFIG_FC_PATH 00:08:13.041 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:13.041 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:13.041 #undef SPDK_CONFIG_FUSE 00:08:13.041 #define SPDK_CONFIG_FUZZER 1 00:08:13.041 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:13.041 #undef SPDK_CONFIG_GOLANG 00:08:13.041 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:13.041 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:13.041 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:13.041 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:13.041 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:13.041 #define SPDK_CONFIG_IDXD 1 00:08:13.041 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:13.041 #undef SPDK_CONFIG_IPSEC_MB 00:08:13.041 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:13.041 #define SPDK_CONFIG_ISAL 1 00:08:13.041 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:13.041 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:13.041 #define SPDK_CONFIG_LIBDIR 00:08:13.041 #undef SPDK_CONFIG_LTO 00:08:13.041 #define SPDK_CONFIG_MAX_LCORES 00:08:13.041 #define SPDK_CONFIG_NVME_CUSE 1 00:08:13.041 #undef SPDK_CONFIG_OCF 00:08:13.041 #define 
SPDK_CONFIG_OCF_PATH 00:08:13.041 #define SPDK_CONFIG_OPENSSL_PATH 00:08:13.041 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:13.041 #undef SPDK_CONFIG_PGO_USE 00:08:13.041 #define SPDK_CONFIG_PREFIX /usr/local 00:08:13.041 #undef SPDK_CONFIG_RAID5F 00:08:13.041 #undef SPDK_CONFIG_RBD 00:08:13.041 #define SPDK_CONFIG_RDMA 1 00:08:13.041 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:13.041 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:13.041 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:13.041 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:13.041 #undef SPDK_CONFIG_SHARED 00:08:13.041 #undef SPDK_CONFIG_SMA 00:08:13.041 #define SPDK_CONFIG_TESTS 1 00:08:13.041 #undef SPDK_CONFIG_TSAN 00:08:13.041 #define SPDK_CONFIG_UBLK 1 00:08:13.041 #define SPDK_CONFIG_UBSAN 1 00:08:13.041 #undef SPDK_CONFIG_UNIT_TESTS 00:08:13.041 #undef SPDK_CONFIG_URING 00:08:13.041 #define SPDK_CONFIG_URING_PATH 00:08:13.041 #undef SPDK_CONFIG_URING_ZNS 00:08:13.041 #undef SPDK_CONFIG_USDT 00:08:13.041 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:13.041 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:13.041 #define SPDK_CONFIG_VFIO_USER 1 00:08:13.041 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:13.041 #define SPDK_CONFIG_VHOST 1 00:08:13.041 #define SPDK_CONFIG_VIRTIO 1 00:08:13.041 #undef SPDK_CONFIG_VTUNE 00:08:13.041 #define SPDK_CONFIG_VTUNE_DIR 00:08:13.041 #define SPDK_CONFIG_WERROR 1 00:08:13.041 #define SPDK_CONFIG_WPDK_DIR 00:08:13.041 #undef SPDK_CONFIG_XNVME 00:08:13.041 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:13.041 00:06:38 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:13.041 00:06:38 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:13.041 00:06:38 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:13.041 00:06:38 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:13.041 00:06:38 -- scripts/common.sh@442 -- # 
source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:13.041 00:06:38 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.041 00:06:38 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.041 00:06:38 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.041 00:06:38 -- paths/export.sh@5 -- # export PATH 00:08:13.041 00:06:38 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.041 00:06:38 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:13.041 00:06:38 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:13.041 00:06:38 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:13.041 00:06:38 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:13.041 00:06:38 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:13.041 00:06:38 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:13.041 00:06:38 -- pm/common@16 -- # TEST_TAG=N/A 00:08:13.041 00:06:38 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:13.041 00:06:38 -- common/autotest_common.sh@52 -- # : 1 00:08:13.041 00:06:38 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:13.041 00:06:38 -- common/autotest_common.sh@56 -- # : 0 00:08:13.041 00:06:38 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:13.041 00:06:38 -- common/autotest_common.sh@58 -- # : 0 00:08:13.041 00:06:38 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:13.041 00:06:38 -- common/autotest_common.sh@60 -- # : 1 00:08:13.041 00:06:38 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 
00:08:13.041 00:06:38 -- common/autotest_common.sh@62 -- # : 0 00:08:13.041 00:06:38 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:13.041 00:06:38 -- common/autotest_common.sh@64 -- # : 00:08:13.041 00:06:38 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:13.041 00:06:38 -- common/autotest_common.sh@66 -- # : 0 00:08:13.041 00:06:38 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:13.041 00:06:38 -- common/autotest_common.sh@68 -- # : 0 00:08:13.041 00:06:38 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:13.041 00:06:38 -- common/autotest_common.sh@70 -- # : 0 00:08:13.041 00:06:38 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:13.042 00:06:38 -- common/autotest_common.sh@72 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:13.042 00:06:38 -- common/autotest_common.sh@74 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:13.042 00:06:38 -- common/autotest_common.sh@76 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:13.042 00:06:38 -- common/autotest_common.sh@78 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:13.042 00:06:38 -- common/autotest_common.sh@80 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:13.042 00:06:38 -- common/autotest_common.sh@82 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:13.042 00:06:38 -- common/autotest_common.sh@84 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:13.042 00:06:38 -- common/autotest_common.sh@86 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:13.042 00:06:38 -- common/autotest_common.sh@88 -- # : 0 00:08:13.042 
00:06:38 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:13.042 00:06:38 -- common/autotest_common.sh@90 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:13.042 00:06:38 -- common/autotest_common.sh@92 -- # : 1 00:08:13.042 00:06:38 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:13.042 00:06:38 -- common/autotest_common.sh@94 -- # : 1 00:08:13.042 00:06:38 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:13.042 00:06:38 -- common/autotest_common.sh@96 -- # : rdma 00:08:13.042 00:06:38 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:13.042 00:06:38 -- common/autotest_common.sh@98 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:13.042 00:06:38 -- common/autotest_common.sh@100 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:13.042 00:06:38 -- common/autotest_common.sh@102 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:13.042 00:06:38 -- common/autotest_common.sh@104 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:13.042 00:06:38 -- common/autotest_common.sh@106 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:13.042 00:06:38 -- common/autotest_common.sh@108 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:13.042 00:06:38 -- common/autotest_common.sh@110 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:13.042 00:06:38 -- common/autotest_common.sh@112 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:13.042 00:06:38 -- common/autotest_common.sh@114 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@115 -- # export 
SPDK_RUN_ASAN 00:08:13.042 00:06:38 -- common/autotest_common.sh@116 -- # : 1 00:08:13.042 00:06:38 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:13.042 00:06:38 -- common/autotest_common.sh@118 -- # : 00:08:13.042 00:06:38 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:13.042 00:06:38 -- common/autotest_common.sh@120 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:13.042 00:06:38 -- common/autotest_common.sh@122 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:13.042 00:06:38 -- common/autotest_common.sh@124 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:13.042 00:06:38 -- common/autotest_common.sh@126 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:13.042 00:06:38 -- common/autotest_common.sh@128 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:13.042 00:06:38 -- common/autotest_common.sh@130 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:13.042 00:06:38 -- common/autotest_common.sh@132 -- # : 00:08:13.042 00:06:38 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:13.042 00:06:38 -- common/autotest_common.sh@134 -- # : true 00:08:13.042 00:06:38 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:13.042 00:06:38 -- common/autotest_common.sh@136 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:13.042 00:06:38 -- common/autotest_common.sh@138 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:13.042 00:06:38 -- common/autotest_common.sh@140 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:13.042 00:06:38 -- common/autotest_common.sh@142 -- # : 0 
00:08:13.042 00:06:38 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:13.042 00:06:38 -- common/autotest_common.sh@144 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:13.042 00:06:38 -- common/autotest_common.sh@146 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:13.042 00:06:38 -- common/autotest_common.sh@148 -- # : 00:08:13.042 00:06:38 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:13.042 00:06:38 -- common/autotest_common.sh@150 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:13.042 00:06:38 -- common/autotest_common.sh@152 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:13.042 00:06:38 -- common/autotest_common.sh@154 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:13.042 00:06:38 -- common/autotest_common.sh@156 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:13.042 00:06:38 -- common/autotest_common.sh@158 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:13.042 00:06:38 -- common/autotest_common.sh@160 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:13.042 00:06:38 -- common/autotest_common.sh@163 -- # : 00:08:13.042 00:06:38 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:13.042 00:06:38 -- common/autotest_common.sh@165 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:13.042 00:06:38 -- common/autotest_common.sh@167 -- # : 0 00:08:13.042 00:06:38 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:13.042 00:06:38 -- common/autotest_common.sh@171 -- # export 
SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:13.042 00:06:38 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:13.042 00:06:38 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:13.042 00:06:38 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:13.042 00:06:38 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:13.042 00:06:38 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:13.042 00:06:38 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:13.042 00:06:38 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:13.042 00:06:38 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:13.042 00:06:38 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:13.042 00:06:38 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:13.042 00:06:38 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:13.042 00:06:38 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:13.042 00:06:38 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:13.042 00:06:38 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:13.042 00:06:38 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:13.042 00:06:38 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:13.042 00:06:38 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:13.042 00:06:38 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:13.042 00:06:38 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:13.042 00:06:38 -- common/autotest_common.sh@196 -- # cat 00:08:13.042 00:06:38 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:13.042 00:06:38 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:13.042 00:06:38 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:13.042 00:06:38 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:13.043 00:06:38 -- 
common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:13.043 00:06:38 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:13.043 00:06:38 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:13.043 00:06:38 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:13.043 00:06:38 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:13.043 00:06:38 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:13.043 00:06:38 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:13.043 00:06:38 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:13.043 00:06:38 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:13.043 00:06:38 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:13.043 00:06:38 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:13.043 00:06:38 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:13.043 00:06:38 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:13.043 00:06:38 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:13.043 00:06:38 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:13.043 00:06:38 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:13.043 00:06:38 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:13.043 00:06:38 -- 
common/autotest_common.sh@249 -- # _LCOV= 00:08:13.043 00:06:38 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:13.043 00:06:38 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:13.043 00:06:38 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:13.043 00:06:38 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:13.043 00:06:38 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:13.043 00:06:38 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:13.043 00:06:38 -- common/autotest_common.sh@259 -- # valgrind= 00:08:13.043 00:06:38 -- common/autotest_common.sh@265 -- # uname -s 00:08:13.043 00:06:38 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:13.043 00:06:38 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:13.043 00:06:38 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:13.043 00:06:38 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:13.043 00:06:38 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@275 -- # MAKE=make 00:08:13.043 00:06:38 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:13.043 00:06:38 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:13.043 00:06:38 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:13.043 00:06:38 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:13.043 00:06:38 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:13.043 00:06:38 -- common/autotest_common.sh@300 -- 
# TEST_MODE= 00:08:13.043 00:06:38 -- common/autotest_common.sh@319 -- # [[ -z 2732460 ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@319 -- # kill -0 2732460 00:08:13.043 00:06:38 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:13.043 00:06:38 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:13.043 00:06:38 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:13.043 00:06:38 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:13.043 00:06:38 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:13.043 00:06:38 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:13.043 00:06:38 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:13.043 00:06:38 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.liGbar 00:08:13.043 00:06:38 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:13.043 00:06:38 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.liGbar/tests/vfio /tmp/spdk.liGbar 00:08:13.043 00:06:38 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:13.043 00:06:38 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:13.043 00:06:38 -- common/autotest_common.sh@328 -- # df -T 00:08:13.043 00:06:38 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:13.043 00:06:38 -- 
common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:13.043 00:06:38 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:13.043 00:06:38 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:13.043 00:06:38 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:08:13.043 00:06:38 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # avails["$mount"]=53324050432 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:08:13.043 00:06:38 -- common/autotest_common.sh@364 -- # uses["$mount"]=8406556672 00:08:13.043 00:06:38 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:08:13.043 00:06:38 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:08:13.043 00:06:38 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:13.043 
00:06:38 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:08:13.043 00:06:38 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:08:13.043 00:06:38 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863339520 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:08:13.043 00:06:38 -- common/autotest_common.sh@364 -- # uses["$mount"]=1966080 00:08:13.043 00:06:38 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:13.043 00:06:38 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:13.043 00:06:38 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:13.043 00:06:38 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:13.043 00:06:38 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:13.043 00:06:38 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:13.043 * Looking for test storage... 
00:08:13.043 00:06:38 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:13.043 00:06:38 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:13.043 00:06:38 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:13.043 00:06:38 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:13.043 00:06:38 -- common/autotest_common.sh@373 -- # mount=/ 00:08:13.043 00:06:38 -- common/autotest_common.sh@375 -- # target_space=53324050432 00:08:13.043 00:06:38 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:13.043 00:06:38 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:13.043 00:06:38 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:13.043 00:06:38 -- common/autotest_common.sh@382 -- # new_size=10621149184 00:08:13.043 00:06:38 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:13.043 00:06:38 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:13.043 00:06:38 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:13.043 00:06:38 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:13.043 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:13.043 00:06:38 -- common/autotest_common.sh@390 -- # return 0 00:08:13.043 00:06:38 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:13.043 00:06:38 -- common/autotest_common.sh@1678 -- # 
shopt -s extdebug 00:08:13.043 00:06:38 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:13.043 00:06:38 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:13.043 00:06:38 -- common/autotest_common.sh@1682 -- # true 00:08:13.043 00:06:38 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:13.043 00:06:38 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:13.044 00:06:38 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:13.044 00:06:38 -- common/autotest_common.sh@27 -- # exec 00:08:13.044 00:06:38 -- common/autotest_common.sh@29 -- # exec 00:08:13.044 00:06:38 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:13.044 00:06:38 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:13.044 00:06:38 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:13.044 00:06:38 -- common/autotest_common.sh@18 -- # set -x 00:08:13.044 00:06:38 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:13.044 00:06:38 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:13.044 00:06:38 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:13.044 00:06:38 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:13.044 00:06:38 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:13.044 00:06:38 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:13.044 00:06:38 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:13.044 00:06:38 -- scripts/common.sh@335 -- # IFS=.-: 00:08:13.044 00:06:38 -- scripts/common.sh@335 -- # read -ra ver1 00:08:13.044 00:06:38 -- scripts/common.sh@336 -- # IFS=.-: 00:08:13.044 00:06:38 -- scripts/common.sh@336 -- # read -ra ver2 00:08:13.044 00:06:38 -- scripts/common.sh@337 -- # local 'op=<' 00:08:13.044 00:06:38 -- scripts/common.sh@339 -- # ver1_l=2 00:08:13.044 00:06:38 -- scripts/common.sh@340 -- # ver2_l=1 00:08:13.044 00:06:38 -- scripts/common.sh@342 -- # local 
lt=0 gt=0 eq=0 v 00:08:13.044 00:06:38 -- scripts/common.sh@343 -- # case "$op" in 00:08:13.044 00:06:38 -- scripts/common.sh@344 -- # : 1 00:08:13.044 00:06:38 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:13.044 00:06:38 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:13.044 00:06:38 -- scripts/common.sh@364 -- # decimal 1 00:08:13.044 00:06:38 -- scripts/common.sh@352 -- # local d=1 00:08:13.044 00:06:38 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:13.044 00:06:38 -- scripts/common.sh@354 -- # echo 1 00:08:13.044 00:06:38 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:13.044 00:06:38 -- scripts/common.sh@365 -- # decimal 2 00:08:13.044 00:06:38 -- scripts/common.sh@352 -- # local d=2 00:08:13.044 00:06:38 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:13.044 00:06:38 -- scripts/common.sh@354 -- # echo 2 00:08:13.303 00:06:38 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:13.303 00:06:38 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:13.303 00:06:38 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:13.303 00:06:38 -- scripts/common.sh@367 -- # return 0 00:08:13.303 00:06:38 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:13.303 00:06:38 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:13.303 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.303 --rc genhtml_branch_coverage=1 00:08:13.303 --rc genhtml_function_coverage=1 00:08:13.303 --rc genhtml_legend=1 00:08:13.303 --rc geninfo_all_blocks=1 00:08:13.303 --rc geninfo_unexecuted_blocks=1 00:08:13.303 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:13.303 ' 00:08:13.303 00:06:38 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:13.303 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.303 --rc genhtml_branch_coverage=1 00:08:13.303 --rc genhtml_function_coverage=1 
00:08:13.303 --rc genhtml_legend=1 00:08:13.303 --rc geninfo_all_blocks=1 00:08:13.303 --rc geninfo_unexecuted_blocks=1 00:08:13.303 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:13.303 ' 00:08:13.303 00:06:38 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:13.303 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.303 --rc genhtml_branch_coverage=1 00:08:13.303 --rc genhtml_function_coverage=1 00:08:13.303 --rc genhtml_legend=1 00:08:13.303 --rc geninfo_all_blocks=1 00:08:13.303 --rc geninfo_unexecuted_blocks=1 00:08:13.303 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:13.303 ' 00:08:13.303 00:06:38 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:13.303 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.303 --rc genhtml_branch_coverage=1 00:08:13.303 --rc genhtml_function_coverage=1 00:08:13.303 --rc genhtml_legend=1 00:08:13.303 --rc geninfo_all_blocks=1 00:08:13.303 --rc geninfo_unexecuted_blocks=1 00:08:13.303 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:13.303 ' 00:08:13.303 00:06:38 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:13.303 00:06:38 -- ../common.sh@8 -- # pids=() 00:08:13.303 00:06:38 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:13.303 00:06:38 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:13.303 00:06:38 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:13.303 00:06:38 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:13.303 00:06:38 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:13.303 00:06:38 -- vfio/run.sh@65 -- # mem_size=0 00:08:13.303 00:06:38 
-- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:13.303 00:06:38 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:13.303 00:06:38 -- ../common.sh@69 -- # local fuzz_num=7 00:08:13.303 00:06:38 -- ../common.sh@70 -- # local time=1 00:08:13.303 00:06:38 -- ../common.sh@72 -- # (( i = 0 )) 00:08:13.303 00:06:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.303 00:06:38 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:13.303 00:06:38 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:13.303 00:06:38 -- vfio/run.sh@23 -- # local timen=1 00:08:13.303 00:06:38 -- vfio/run.sh@24 -- # local core=0x1 00:08:13.303 00:06:38 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:13.303 00:06:38 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:13.303 00:06:38 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:13.303 00:06:38 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:13.303 00:06:38 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:13.303 00:06:38 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:13.303 00:06:38 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:13.303 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:13.303 00:06:38 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r 
/tmp/vfio-user-0/spdk0.sock -Z 0 00:08:13.303 [2024-11-30 00:06:38.629841] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:13.303 [2024-11-30 00:06:38.629895] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2732538 ] 00:08:13.303 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.303 [2024-11-30 00:06:38.700500] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.303 [2024-11-30 00:06:38.774158] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:13.303 [2024-11-30 00:06:38.774294] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.562 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.562 INFO: Seed: 1237584886 00:08:13.562 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:13.562 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:13.562 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:13.562 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.562 #2 INITED exec/s: 0 rss: 62Mb 00:08:13.562 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:13.562 This may also happen if the target rejected all inputs we tried so far 00:08:14.078 NEW_FUNC[1/631]: 0x43a218 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:14.078 NEW_FUNC[2/631]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:14.078 #10 NEW cov: 10761 ft: 10569 corp: 2/7b lim: 60 exec/s: 0 rss: 67Mb L: 6/6 MS: 3 InsertByte-CopyPart-CopyPart- 00:08:14.336 #11 NEW cov: 10775 ft: 14774 corp: 3/13b lim: 60 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 ChangeByte- 00:08:14.594 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.594 #12 NEW cov: 10795 ft: 15712 corp: 4/19b lim: 60 exec/s: 0 rss: 70Mb L: 6/6 MS: 1 CopyPart- 00:08:14.594 #15 NEW cov: 10795 ft: 16071 corp: 5/54b lim: 60 exec/s: 15 rss: 70Mb L: 35/35 MS: 3 CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:14.853 #23 NEW cov: 10795 ft: 16216 corp: 6/109b lim: 60 exec/s: 23 rss: 70Mb L: 55/55 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:08:15.112 #24 NEW cov: 10795 ft: 16531 corp: 7/144b lim: 60 exec/s: 24 rss: 70Mb L: 35/55 MS: 1 CopyPart- 00:08:15.371 #28 NEW cov: 10795 ft: 16704 corp: 8/170b lim: 60 exec/s: 28 rss: 70Mb L: 26/55 MS: 4 InsertByte-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:15.630 #32 NEW cov: 10802 ft: 17058 corp: 9/207b lim: 60 exec/s: 32 rss: 70Mb L: 37/55 MS: 4 EraseBytes-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:15.630 #33 NEW cov: 10802 ft: 17298 corp: 10/233b lim: 60 exec/s: 16 rss: 70Mb L: 26/55 MS: 1 ChangeBit- 00:08:15.630 #33 DONE cov: 10802 ft: 17298 corp: 10/233b lim: 60 exec/s: 16 rss: 70Mb 00:08:15.630 Done 33 runs in 2 second(s) 00:08:15.889 00:06:41 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:15.889 00:06:41 -- ../common.sh@72 -- # (( i++ )) 00:08:15.889 00:06:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 
00:08:15.889 00:06:41 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:15.889 00:06:41 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:15.889 00:06:41 -- vfio/run.sh@23 -- # local timen=1 00:08:15.889 00:06:41 -- vfio/run.sh@24 -- # local core=0x1 00:08:15.889 00:06:41 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:15.889 00:06:41 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:15.889 00:06:41 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:15.889 00:06:41 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:15.889 00:06:41 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:15.889 00:06:41 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:15.889 00:06:41 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:15.889 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:15.889 00:06:41 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:15.889 [2024-11-30 00:06:41.434418] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:15.889 [2024-11-30 00:06:41.434491] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2733066 ] 00:08:16.148 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.148 [2024-11-30 00:06:41.502536] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.148 [2024-11-30 00:06:41.571674] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:16.148 [2024-11-30 00:06:41.571809] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.406 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.406 INFO: Seed: 4032579947 00:08:16.406 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:16.406 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:16.406 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:16.406 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.406 #2 INITED exec/s: 0 rss: 62Mb 00:08:16.406 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:16.406 This may also happen if the target rejected all inputs we tried so far 00:08:16.406 [2024-11-30 00:06:41.869632] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:16.406 [2024-11-30 00:06:41.869665] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:16.406 [2024-11-30 00:06:41.869685] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:16.924 NEW_FUNC[1/638]: 0x43a7b8 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:16.924 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:16.924 #12 NEW cov: 10782 ft: 10731 corp: 2/9b lim: 40 exec/s: 0 rss: 67Mb L: 8/8 MS: 5 ChangeByte-CopyPart-CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:08:16.924 [2024-11-30 00:06:42.332483] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:16.924 [2024-11-30 00:06:42.332517] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:16.924 [2024-11-30 00:06:42.332536] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:16.924 #13 NEW cov: 10796 ft: 13646 corp: 3/18b lim: 40 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 InsertByte- 00:08:17.183 [2024-11-30 00:06:42.518892] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:17.183 [2024-11-30 00:06:42.518916] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:17.183 [2024-11-30 00:06:42.518936] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:17.183 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:17.183 #14 NEW cov: 10813 ft: 14724 corp: 4/38b lim: 
40 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:08:17.183 [2024-11-30 00:06:42.706920] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:17.183 [2024-11-30 00:06:42.706944] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:17.183 [2024-11-30 00:06:42.706963] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:17.444 #15 NEW cov: 10813 ft: 15383 corp: 5/58b lim: 40 exec/s: 15 rss: 69Mb L: 20/20 MS: 1 ChangeByte- 00:08:17.444 [2024-11-30 00:06:42.893096] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:17.444 [2024-11-30 00:06:42.893118] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:17.444 [2024-11-30 00:06:42.893136] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:17.444 #16 NEW cov: 10813 ft: 15592 corp: 6/72b lim: 40 exec/s: 16 rss: 69Mb L: 14/20 MS: 1 EraseBytes- 00:08:17.703 [2024-11-30 00:06:43.079390] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:17.703 [2024-11-30 00:06:43.079413] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:17.703 [2024-11-30 00:06:43.079432] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:17.703 #17 NEW cov: 10813 ft: 15958 corp: 7/98b lim: 40 exec/s: 17 rss: 69Mb L: 26/26 MS: 1 CopyPart- 00:08:17.962 [2024-11-30 00:06:43.265838] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:17.962 [2024-11-30 00:06:43.265860] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:17.962 [2024-11-30 00:06:43.265879] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:17.962 #18 NEW cov: 10813 ft: 15982 corp: 8/124b lim: 40 exec/s: 18 rss: 69Mb L: 26/26 
MS: 1 ChangeByte- 00:08:17.962 [2024-11-30 00:06:43.452436] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:17.962 [2024-11-30 00:06:43.452463] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:17.962 [2024-11-30 00:06:43.452482] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:18.219 #19 NEW cov: 10813 ft: 15999 corp: 9/139b lim: 40 exec/s: 19 rss: 69Mb L: 15/26 MS: 1 InsertByte- 00:08:18.219 [2024-11-30 00:06:43.637927] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:18.219 [2024-11-30 00:06:43.637950] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:18.219 [2024-11-30 00:06:43.637968] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:18.219 #20 NEW cov: 10820 ft: 16115 corp: 10/151b lim: 40 exec/s: 20 rss: 69Mb L: 12/26 MS: 1 EraseBytes- 00:08:18.478 [2024-11-30 00:06:43.823488] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:18.478 [2024-11-30 00:06:43.823511] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:18.478 [2024-11-30 00:06:43.823529] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:18.478 #21 NEW cov: 10820 ft: 16155 corp: 11/190b lim: 40 exec/s: 10 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:08:18.478 #21 DONE cov: 10820 ft: 16155 corp: 11/190b lim: 40 exec/s: 10 rss: 69Mb 00:08:18.478 Done 21 runs in 2 second(s) 00:08:18.738 00:06:44 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:18.738 00:06:44 -- ../common.sh@72 -- # (( i++ )) 00:08:18.738 00:06:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.738 00:06:44 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:18.738 00:06:44 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:18.738 00:06:44 -- vfio/run.sh@23 -- # local 
timen=1 00:08:18.738 00:06:44 -- vfio/run.sh@24 -- # local core=0x1 00:08:18.738 00:06:44 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:18.738 00:06:44 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:18.738 00:06:44 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:18.738 00:06:44 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:18.738 00:06:44 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:18.738 00:06:44 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:18.738 00:06:44 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:18.738 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:18.738 00:06:44 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:18.738 [2024-11-30 00:06:44.237697] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:18.738 [2024-11-30 00:06:44.237800] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2733613 ] 00:08:18.738 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.011 [2024-11-30 00:06:44.310631] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.011 [2024-11-30 00:06:44.378738] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:19.011 [2024-11-30 00:06:44.378874] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.011 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.011 INFO: Seed: 2542620184 00:08:19.269 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:19.269 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:19.269 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:19.269 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.269 #2 INITED exec/s: 0 rss: 62Mb 00:08:19.269 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:19.269 This may also happen if the target rejected all inputs we tried so far 00:08:19.269 [2024-11-30 00:06:44.695320] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:19.269 [2024-11-30 00:06:44.695367] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:19.835 NEW_FUNC[1/638]: 0x43b1a8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:19.835 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:19.835 #10 NEW cov: 10774 ft: 10742 corp: 2/10b lim: 80 exec/s: 0 rss: 67Mb L: 9/9 MS: 3 ChangeByte-ChangeBit-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:19.835 [2024-11-30 00:06:45.190010] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:19.835 [2024-11-30 00:06:45.190052] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:19.835 #11 NEW cov: 10788 ft: 14575 corp: 3/19b lim: 80 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CopyPart- 00:08:19.835 [2024-11-30 00:06:45.386189] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: cmd 5 failed: Invalid argument 00:08:19.835 [2024-11-30 00:06:45.386222] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:20.093 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:20.093 #12 NEW cov: 10805 ft: 15911 corp: 4/28b lim: 80 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 ChangeByte- 00:08:20.093 [2024-11-30 00:06:45.584333] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:20.093 [2024-11-30 00:06:45.584363] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:20.352 #13 NEW cov: 10805 ft: 16180 corp: 5/72b lim: 80 exec/s: 13 rss: 70Mb L: 44/44 
MS: 1 InsertRepeatedBytes- 00:08:20.352 [2024-11-30 00:06:45.780433] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:20.352 #14 NEW cov: 10806 ft: 16636 corp: 6/88b lim: 80 exec/s: 14 rss: 70Mb L: 16/44 MS: 1 InsertRepeatedBytes- 00:08:20.610 [2024-11-30 00:06:45.981251] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:20.610 [2024-11-30 00:06:45.981281] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:20.610 #15 NEW cov: 10806 ft: 17098 corp: 7/140b lim: 80 exec/s: 15 rss: 70Mb L: 52/52 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:20.868 [2024-11-30 00:06:46.175258] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:20.868 [2024-11-30 00:06:46.175288] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:20.868 #16 NEW cov: 10806 ft: 17409 corp: 8/157b lim: 80 exec/s: 16 rss: 70Mb L: 17/52 MS: 1 CopyPart- 00:08:20.868 [2024-11-30 00:06:46.371997] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:21.127 #17 NEW cov: 10813 ft: 17637 corp: 9/173b lim: 80 exec/s: 17 rss: 70Mb L: 16/52 MS: 1 ChangeByte- 00:08:21.127 [2024-11-30 00:06:46.573746] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:21.127 [2024-11-30 00:06:46.573776] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:21.385 #18 NEW cov: 10813 ft: 17884 corp: 10/217b lim: 80 exec/s: 9 rss: 70Mb L: 44/52 MS: 1 CopyPart- 00:08:21.385 #18 DONE cov: 10813 ft: 17884 corp: 10/217b lim: 80 exec/s: 9 rss: 70Mb 00:08:21.385 ###### Recommended dictionary. ###### 00:08:21.385 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:21.385 ###### End of recommended dictionary. 
###### 00:08:21.385 Done 18 runs in 2 second(s) 00:08:21.644 00:06:46 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:21.644 00:06:46 -- ../common.sh@72 -- # (( i++ )) 00:08:21.644 00:06:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:21.644 00:06:46 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:21.644 00:06:46 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:21.644 00:06:46 -- vfio/run.sh@23 -- # local timen=1 00:08:21.644 00:06:46 -- vfio/run.sh@24 -- # local core=0x1 00:08:21.644 00:06:46 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:21.644 00:06:46 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:21.644 00:06:46 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:21.644 00:06:46 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:21.644 00:06:46 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:21.644 00:06:46 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:21.644 00:06:46 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:21.644 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:21.644 00:06:46 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:21.644 [2024-11-30 00:06:46.981457] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 
initialization... 00:08:21.644 [2024-11-30 00:06:46.981526] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2734160 ] 00:08:21.644 EAL: No free 2048 kB hugepages reported on node 1 00:08:21.644 [2024-11-30 00:06:47.052650] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.644 [2024-11-30 00:06:47.120757] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:21.644 [2024-11-30 00:06:47.120897] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.902 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.902 INFO: Seed: 989657161 00:08:21.902 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:21.902 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:21.902 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:21.902 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.902 #2 INITED exec/s: 0 rss: 61Mb 00:08:21.902 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:21.902 This may also happen if the target rejected all inputs we tried so far 00:08:22.419 NEW_FUNC[1/632]: 0x43b898 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:22.419 NEW_FUNC[2/632]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:22.419 #4 NEW cov: 10748 ft: 10712 corp: 2/34b lim: 320 exec/s: 0 rss: 68Mb L: 33/33 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:22.678 #6 NEW cov: 10762 ft: 14119 corp: 3/110b lim: 320 exec/s: 0 rss: 69Mb L: 76/76 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:22.678 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:22.678 #7 NEW cov: 10779 ft: 14988 corp: 4/187b lim: 320 exec/s: 0 rss: 70Mb L: 77/77 MS: 1 InsertByte- 00:08:22.936 #8 NEW cov: 10779 ft: 15690 corp: 5/264b lim: 320 exec/s: 8 rss: 70Mb L: 77/77 MS: 1 ChangeByte- 00:08:23.194 #9 NEW cov: 10779 ft: 16022 corp: 6/297b lim: 320 exec/s: 9 rss: 70Mb L: 33/77 MS: 1 ChangeBinInt- 00:08:23.453 #10 NEW cov: 10779 ft: 16093 corp: 7/330b lim: 320 exec/s: 10 rss: 70Mb L: 33/77 MS: 1 CopyPart- 00:08:23.453 #11 NEW cov: 10779 ft: 16331 corp: 8/363b lim: 320 exec/s: 11 rss: 70Mb L: 33/77 MS: 1 ChangeBit- 00:08:23.712 #12 NEW cov: 10786 ft: 16626 corp: 9/396b lim: 320 exec/s: 12 rss: 70Mb L: 33/77 MS: 1 ChangeBinInt- 00:08:23.970 #19 NEW cov: 10786 ft: 16655 corp: 10/428b lim: 320 exec/s: 9 rss: 70Mb L: 32/77 MS: 2 EraseBytes-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:23.970 #19 DONE cov: 10786 ft: 16655 corp: 10/428b lim: 320 exec/s: 9 rss: 70Mb 00:08:23.970 ###### Recommended dictionary. ###### 00:08:23.970 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:23.970 ###### End of recommended dictionary. 
###### 00:08:23.970 Done 19 runs in 2 second(s) 00:08:24.369 00:06:49 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:24.369 00:06:49 -- ../common.sh@72 -- # (( i++ )) 00:08:24.369 00:06:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:24.369 00:06:49 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:24.369 00:06:49 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:24.369 00:06:49 -- vfio/run.sh@23 -- # local timen=1 00:08:24.369 00:06:49 -- vfio/run.sh@24 -- # local core=0x1 00:08:24.369 00:06:49 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:24.369 00:06:49 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:24.369 00:06:49 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:24.369 00:06:49 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:24.369 00:06:49 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:24.369 00:06:49 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:24.369 00:06:49 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:24.369 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:24.369 00:06:49 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:24.369 [2024-11-30 00:06:49.671337] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 
initialization... 00:08:24.369 [2024-11-30 00:06:49.671395] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2734521 ] 00:08:24.369 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.369 [2024-11-30 00:06:49.740320] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.369 [2024-11-30 00:06:49.813099] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:24.369 [2024-11-30 00:06:49.813233] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.638 INFO: Running with entropic power schedule (0xFF, 100). 00:08:24.638 INFO: Seed: 3688662574 00:08:24.638 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:24.638 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:24.638 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:24.638 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.638 #2 INITED exec/s: 0 rss: 61Mb 00:08:24.638 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:24.638 This may also happen if the target rejected all inputs we tried so far 00:08:24.638 [2024-11-30 00:06:50.132303] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:08:24.638 [2024-11-30 00:06:50.132341] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:24.638 [2024-11-30 00:06:50.132352] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:24.638 [2024-11-30 00:06:50.132370] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:24.638 [2024-11-30 00:06:50.133327] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:24.638 [2024-11-30 00:06:50.133346] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:24.638 [2024-11-30 00:06:50.133362] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:25.155 NEW_FUNC[1/638]: 0x43c118 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:25.155 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:25.155 #8 NEW cov: 10781 ft: 10635 corp: 2/78b lim: 320 exec/s: 0 rss: 66Mb L: 77/77 MS: 1 InsertRepeatedBytes- 00:08:25.155 [2024-11-30 00:06:50.626526] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:25.155 [2024-11-30 00:06:50.626558] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 
00:08:25.155 [2024-11-30 00:06:50.626570] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:25.155 [2024-11-30 00:06:50.626586] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:25.155 [2024-11-30 00:06:50.627525] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:25.155 [2024-11-30 00:06:50.627544] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:25.155 [2024-11-30 00:06:50.627560] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:25.413 #24 NEW cov: 10798 ft: 13481 corp: 3/156b lim: 320 exec/s: 0 rss: 67Mb L: 78/78 MS: 1 InsertByte- 00:08:25.413 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:25.413 #26 NEW cov: 10819 ft: 15167 corp: 4/209b lim: 320 exec/s: 0 rss: 68Mb L: 53/78 MS: 2 InsertRepeatedBytes-CopyPart- 00:08:25.671 [2024-11-30 00:06:51.006880] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:25.671 [2024-11-30 00:06:51.006904] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:25.671 [2024-11-30 00:06:51.006915] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:25.671 [2024-11-30 00:06:51.006931] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:25.671 [2024-11-30 00:06:51.007854] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:25.671 [2024-11-30 00:06:51.007874] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: 
msg0: cmd 3 failed: No such file or directory 00:08:25.671 [2024-11-30 00:06:51.007890] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:25.671 #27 NEW cov: 10819 ft: 15768 corp: 5/286b lim: 320 exec/s: 27 rss: 69Mb L: 77/78 MS: 1 CMP- DE: "\001\020"- 00:08:25.929 #28 NEW cov: 10819 ft: 16318 corp: 6/469b lim: 320 exec/s: 28 rss: 69Mb L: 183/183 MS: 1 InsertRepeatedBytes- 00:08:25.929 [2024-11-30 00:06:51.370160] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:25.929 [2024-11-30 00:06:51.370185] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:25.929 [2024-11-30 00:06:51.370196] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:25.929 [2024-11-30 00:06:51.370213] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:25.929 [2024-11-30 00:06:51.371187] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:25.929 [2024-11-30 00:06:51.371205] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:25.929 [2024-11-30 00:06:51.371221] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:25.929 #29 NEW cov: 10819 ft: 16676 corp: 7/564b lim: 320 exec/s: 29 rss: 69Mb L: 95/183 MS: 1 InsertRepeatedBytes- 00:08:26.187 [2024-11-30 00:06:51.553845] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:26.188 [2024-11-30 00:06:51.553873] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:26.188 
[2024-11-30 00:06:51.553883] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:26.188 [2024-11-30 00:06:51.553899] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:26.188 [2024-11-30 00:06:51.554845] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:26.188 [2024-11-30 00:06:51.554864] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:26.188 [2024-11-30 00:06:51.554880] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:26.188 #30 NEW cov: 10819 ft: 17045 corp: 8/659b lim: 320 exec/s: 30 rss: 69Mb L: 95/183 MS: 1 ChangeBit- 00:08:26.188 [2024-11-30 00:06:51.737387] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:26.188 [2024-11-30 00:06:51.737410] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:26.188 [2024-11-30 00:06:51.737420] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:26.188 [2024-11-30 00:06:51.737436] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:26.188 [2024-11-30 00:06:51.738403] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:26.188 [2024-11-30 00:06:51.738423] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:26.188 [2024-11-30 00:06:51.738439] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:26.446 #31 NEW cov: 10826 ft: 17341 corp: 9/737b lim: 320 exec/s: 31 rss: 69Mb L: 78/183 MS: 1 ChangeByte- 
00:08:26.705 #32 NEW cov: 10826 ft: 17626 corp: 10/791b lim: 320 exec/s: 32 rss: 69Mb L: 54/183 MS: 1 InsertByte- 00:08:26.705 [2024-11-30 00:06:52.102237] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:26.705 [2024-11-30 00:06:52.102262] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:26.705 [2024-11-30 00:06:52.102272] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:26.705 [2024-11-30 00:06:52.102289] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:26.705 [2024-11-30 00:06:52.103262] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:26.705 [2024-11-30 00:06:52.103281] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:26.705 [2024-11-30 00:06:52.103297] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:26.705 #33 NEW cov: 10826 ft: 17814 corp: 11/965b lim: 320 exec/s: 16 rss: 69Mb L: 174/183 MS: 1 CopyPart- 00:08:26.705 #33 DONE cov: 10826 ft: 17814 corp: 11/965b lim: 320 exec/s: 16 rss: 69Mb 00:08:26.705 ###### Recommended dictionary. ###### 00:08:26.705 "\001\020" # Uses: 0 00:08:26.705 ###### End of recommended dictionary. 
###### 00:08:26.705 Done 33 runs in 2 second(s) 00:08:26.964 00:06:52 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:26.964 00:06:52 -- ../common.sh@72 -- # (( i++ )) 00:08:26.964 00:06:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.964 00:06:52 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:26.964 00:06:52 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:26.964 00:06:52 -- vfio/run.sh@23 -- # local timen=1 00:08:26.964 00:06:52 -- vfio/run.sh@24 -- # local core=0x1 00:08:26.964 00:06:52 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:26.964 00:06:52 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:26.964 00:06:52 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:26.964 00:06:52 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:26.964 00:06:52 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:26.964 00:06:52 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:26.964 00:06:52 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:26.964 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:26.964 00:06:52 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:26.964 [2024-11-30 00:06:52.508518] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 
initialization... 00:08:26.964 [2024-11-30 00:06:52.508609] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2735004 ] 00:08:27.223 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.223 [2024-11-30 00:06:52.581490] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.223 [2024-11-30 00:06:52.650034] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:27.223 [2024-11-30 00:06:52.650173] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.481 INFO: Running with entropic power schedule (0xFF, 100). 00:08:27.481 INFO: Seed: 2229687377 00:08:27.481 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:27.481 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:27.481 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:27.481 INFO: A corpus is not provided, starting from an empty corpus 00:08:27.481 #2 INITED exec/s: 0 rss: 62Mb 00:08:27.481 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:27.481 This may also happen if the target rejected all inputs we tried so far 00:08:27.481 [2024-11-30 00:06:52.959641] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:27.481 [2024-11-30 00:06:52.959687] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:27.999 NEW_FUNC[1/638]: 0x43cb18 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:27.999 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:27.999 #22 NEW cov: 10781 ft: 10756 corp: 2/101b lim: 120 exec/s: 0 rss: 67Mb L: 100/100 MS: 5 ChangeByte-ShuffleBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:27.999 [2024-11-30 00:06:53.423453] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:27.999 [2024-11-30 00:06:53.423498] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:27.999 #31 NEW cov: 10795 ft: 12842 corp: 3/115b lim: 120 exec/s: 0 rss: 68Mb L: 14/100 MS: 4 CopyPart-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:28.257 [2024-11-30 00:06:53.611549] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.257 [2024-11-30 00:06:53.611582] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.257 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:28.257 #32 NEW cov: 10812 ft: 14971 corp: 4/215b lim: 120 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeBinInt- 00:08:28.257 [2024-11-30 00:06:53.803425] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.257 [2024-11-30 00:06:53.803460] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 
00:08:28.515 #33 NEW cov: 10815 ft: 15361 corp: 5/270b lim: 120 exec/s: 33 rss: 69Mb L: 55/100 MS: 1 EraseBytes- 00:08:28.515 [2024-11-30 00:06:53.993152] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.515 [2024-11-30 00:06:53.993184] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.775 #34 NEW cov: 10815 ft: 16105 corp: 6/325b lim: 120 exec/s: 34 rss: 69Mb L: 55/100 MS: 1 ChangeBinInt- 00:08:28.775 [2024-11-30 00:06:54.181236] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.775 [2024-11-30 00:06:54.181268] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.775 #45 NEW cov: 10815 ft: 16442 corp: 7/437b lim: 120 exec/s: 45 rss: 69Mb L: 112/112 MS: 1 InsertRepeatedBytes- 00:08:29.034 [2024-11-30 00:06:54.369851] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.034 [2024-11-30 00:06:54.369883] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.034 #46 NEW cov: 10815 ft: 16542 corp: 8/512b lim: 120 exec/s: 46 rss: 69Mb L: 75/112 MS: 1 InsertRepeatedBytes- 00:08:29.034 [2024-11-30 00:06:54.561864] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.034 [2024-11-30 00:06:54.561896] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.294 #47 NEW cov: 10822 ft: 17049 corp: 9/612b lim: 120 exec/s: 47 rss: 69Mb L: 100/112 MS: 1 ChangeBinInt- 00:08:29.294 [2024-11-30 00:06:54.752161] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.294 [2024-11-30 00:06:54.752192] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.553 #48 NEW cov: 10822 ft: 17095 corp: 10/725b lim: 120 exec/s: 24 rss: 69Mb L: 113/113 MS: 1 InsertByte- 00:08:29.553 
#48 DONE cov: 10822 ft: 17095 corp: 10/725b lim: 120 exec/s: 24 rss: 69Mb 00:08:29.553 Done 48 runs in 2 second(s) 00:08:29.812 00:06:55 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:29.812 00:06:55 -- ../common.sh@72 -- # (( i++ )) 00:08:29.812 00:06:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:29.812 00:06:55 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:29.812 00:06:55 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:29.812 00:06:55 -- vfio/run.sh@23 -- # local timen=1 00:08:29.812 00:06:55 -- vfio/run.sh@24 -- # local core=0x1 00:08:29.812 00:06:55 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:29.812 00:06:55 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:29.812 00:06:55 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:29.812 00:06:55 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:29.812 00:06:55 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:29.812 00:06:55 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:29.812 00:06:55 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:29.812 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:29.812 00:06:55 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:29.812 [2024-11-30 
00:06:55.160959] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:29.812 [2024-11-30 00:06:55.161052] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2735549 ] 00:08:29.812 EAL: No free 2048 kB hugepages reported on node 1 00:08:29.812 [2024-11-30 00:06:55.232951] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.812 [2024-11-30 00:06:55.300800] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:29.812 [2024-11-30 00:06:55.300934] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.070 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.070 INFO: Seed: 578724839 00:08:30.070 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:30.070 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:30.070 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:30.070 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.070 #2 INITED exec/s: 0 rss: 62Mb 00:08:30.070 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:30.070 This may also happen if the target rejected all inputs we tried so far 00:08:30.070 [2024-11-30 00:06:55.564637] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.070 [2024-11-30 00:06:55.564678] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:30.588 NEW_FUNC[1/638]: 0x43d808 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:30.588 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:30.588 #11 NEW cov: 10776 ft: 10738 corp: 2/38b lim: 90 exec/s: 0 rss: 68Mb L: 37/37 MS: 4 CrossOver-CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:30.588 [2024-11-30 00:06:55.975537] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.588 [2024-11-30 00:06:55.975582] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:30.588 #13 NEW cov: 10790 ft: 14043 corp: 3/101b lim: 90 exec/s: 0 rss: 69Mb L: 63/63 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:30.588 [2024-11-30 00:06:56.109466] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.588 [2024-11-30 00:06:56.109503] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:30.846 #14 NEW cov: 10790 ft: 14909 corp: 4/138b lim: 90 exec/s: 0 rss: 70Mb L: 37/63 MS: 1 ChangeBinInt- 00:08:30.846 [2024-11-30 00:06:56.224541] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.846 [2024-11-30 00:06:56.224577] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:30.846 #16 NEW cov: 10790 ft: 15078 corp: 5/147b lim: 90 exec/s: 0 rss: 70Mb L: 9/63 MS: 2 ChangeByte-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:30.846 
[2024-11-30 00:06:56.340337] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.846 [2024-11-30 00:06:56.340371] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.105 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.105 #17 NEW cov: 10807 ft: 15357 corp: 6/164b lim: 90 exec/s: 0 rss: 71Mb L: 17/63 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.105 [2024-11-30 00:06:56.454297] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.105 [2024-11-30 00:06:56.454330] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.105 #18 NEW cov: 10807 ft: 15426 corp: 7/209b lim: 90 exec/s: 18 rss: 71Mb L: 45/63 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.105 [2024-11-30 00:06:56.566050] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.105 [2024-11-30 00:06:56.566085] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.105 #19 NEW cov: 10807 ft: 15520 corp: 8/292b lim: 90 exec/s: 19 rss: 71Mb L: 83/83 MS: 1 CrossOver- 00:08:31.364 [2024-11-30 00:06:56.679938] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.364 [2024-11-30 00:06:56.679973] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.364 #21 NEW cov: 10807 ft: 16260 corp: 9/305b lim: 90 exec/s: 21 rss: 71Mb L: 13/83 MS: 2 CopyPart-CrossOver- 00:08:31.364 [2024-11-30 00:06:56.791704] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.364 [2024-11-30 00:06:56.791738] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.364 #22 NEW cov: 10807 ft: 16575 corp: 10/388b lim: 90 
exec/s: 22 rss: 71Mb L: 83/83 MS: 1 ShuffleBytes- 00:08:31.364 [2024-11-30 00:06:56.904434] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.364 [2024-11-30 00:06:56.904468] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.623 #23 NEW cov: 10807 ft: 16897 corp: 11/470b lim: 90 exec/s: 23 rss: 71Mb L: 82/83 MS: 1 InsertRepeatedBytes- 00:08:31.623 [2024-11-30 00:06:57.019264] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.623 [2024-11-30 00:06:57.019298] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.623 #24 NEW cov: 10807 ft: 17326 corp: 12/483b lim: 90 exec/s: 24 rss: 71Mb L: 13/83 MS: 1 ChangeBinInt- 00:08:31.623 [2024-11-30 00:06:57.133154] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.623 [2024-11-30 00:06:57.133189] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.882 #25 NEW cov: 10807 ft: 17386 corp: 13/500b lim: 90 exec/s: 25 rss: 71Mb L: 17/83 MS: 1 ChangeBit- 00:08:31.882 [2024-11-30 00:06:57.244932] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.882 [2024-11-30 00:06:57.244966] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.882 #26 NEW cov: 10814 ft: 17437 corp: 14/514b lim: 90 exec/s: 26 rss: 71Mb L: 14/83 MS: 1 InsertByte- 00:08:31.882 [2024-11-30 00:06:57.356882] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.882 [2024-11-30 00:06:57.356917] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.882 #27 NEW cov: 10814 ft: 17734 corp: 15/524b lim: 90 exec/s: 27 rss: 71Mb L: 10/83 MS: 1 InsertByte- 00:08:32.141 [2024-11-30 00:06:57.470643] vfio_user.c:3096:vfio_user_log: *ERROR*: 
/tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.141 [2024-11-30 00:06:57.470677] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.141 #28 NEW cov: 10814 ft: 17763 corp: 16/607b lim: 90 exec/s: 14 rss: 71Mb L: 83/83 MS: 1 ShuffleBytes- 00:08:32.141 #28 DONE cov: 10814 ft: 17763 corp: 16/607b lim: 90 exec/s: 14 rss: 71Mb 00:08:32.141 ###### Recommended dictionary. ###### 00:08:32.141 "\000\000\000\000\000\000\000\000" # Uses: 2 00:08:32.141 ###### End of recommended dictionary. ###### 00:08:32.141 Done 28 runs in 2 second(s) 00:08:32.399 00:06:57 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:32.399 00:06:57 -- ../common.sh@72 -- # (( i++ )) 00:08:32.399 00:06:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.399 00:06:57 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:32.399 00:08:32.399 real 0m19.577s 00:08:32.399 user 0m27.491s 00:08:32.399 sys 0m1.826s 00:08:32.399 00:06:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:32.399 00:06:57 -- common/autotest_common.sh@10 -- # set +x 00:08:32.399 ************************************ 00:08:32.399 END TEST vfio_fuzz 00:08:32.399 ************************************ 00:08:32.399 00:08:32.399 real 1m25.311s 00:08:32.399 user 2m8.390s 00:08:32.399 sys 0m10.267s 00:08:32.399 00:06:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:32.399 00:06:57 -- common/autotest_common.sh@10 -- # set +x 00:08:32.399 ************************************ 00:08:32.399 END TEST llvm_fuzz 00:08:32.399 ************************************ 00:08:32.399 00:06:57 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:32.399 00:06:57 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:32.399 00:06:57 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:08:32.399 00:06:57 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:32.399 00:06:57 -- common/autotest_common.sh@10 -- # set +x 00:08:32.399 00:06:57 -- spdk/autotest.sh@373 -- # 
autotest_cleanup 00:08:32.399 00:06:57 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:08:32.399 00:06:57 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:08:32.399 00:06:57 -- common/autotest_common.sh@10 -- # set +x 00:08:39.006 INFO: APP EXITING 00:08:39.006 INFO: killing all VMs 00:08:39.006 INFO: killing vhost app 00:08:39.006 INFO: EXIT DONE 00:08:42.294 Waiting for block devices as requested 00:08:42.294 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:42.294 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:42.294 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:42.294 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:42.294 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:42.553 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:42.553 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:42.553 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:42.813 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:42.813 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:42.813 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:43.072 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:43.072 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:43.072 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:43.330 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:43.330 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:43.330 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:47.524 Cleaning 00:08:47.524 Removing: /dev/shm/spdk_tgt_trace.pid2697284 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2694796 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2696072 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2697284 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2698093 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2698419 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2698754 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2699101 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2699434 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2699723 00:08:47.524 
Removing: /var/run/dpdk/spdk_pid2700006 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2700329 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2701194 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2704399 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2704705 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2705021 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2705271 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2705844 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2706051 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2706433 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2706697 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2706989 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2707009 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2707301 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2707450 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2707954 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2708244 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2708526 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2708632 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2708908 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2709057 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2709238 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2709510 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2709745 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2709924 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2710130 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2710377 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2710660 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2710926 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2711217 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2711486 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2711753 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2711925 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2712123 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2712344 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2712639 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2712908 00:08:47.524 Removing: 
/var/run/dpdk/spdk_pid2713189 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2713458 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2713745 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2713913 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2714117 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2714326 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2714629 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2714897 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2715186 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2715455 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2715738 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2715935 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2716125 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2716313 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2716599 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2716877 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2717160 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2717429 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2717720 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2717902 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2718126 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2718308 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2718589 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2718857 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2719147 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2719297 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2719557 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2720311 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2720781 00:08:47.524 Removing: /var/run/dpdk/spdk_pid2721148 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2721695 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2722235 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2722563 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2723071 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2723620 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2724040 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2724455 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2724999 
00:08:47.525 Removing: /var/run/dpdk/spdk_pid2725643 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2726241 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2726929 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2727477 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2727905 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2728310 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2728847 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2729332 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2729688 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2730232 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2730648 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2731069 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2731606 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2731985 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2732538 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2733066 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2733613 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2734160 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2734521 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2735004 00:08:47.525 Removing: /var/run/dpdk/spdk_pid2735549 00:08:47.525 Clean 00:08:47.525 killing process with pid 2647425 00:08:51.731 killing process with pid 2647422 00:08:51.731 killing process with pid 2647424 00:08:51.731 killing process with pid 2647423 00:08:51.731 00:07:16 -- common/autotest_common.sh@1446 -- # return 0 00:08:51.731 00:07:16 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:08:51.731 00:07:16 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:51.731 00:07:16 -- common/autotest_common.sh@10 -- # set +x 00:08:51.731 00:07:16 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:08:51.731 00:07:16 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:51.731 00:07:16 -- common/autotest_common.sh@10 -- # set +x 00:08:51.731 00:07:16 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:51.731 00:07:16 -- spdk/autotest.sh@379 -- 
# [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:08:51.731 00:07:16 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:08:51.731 00:07:16 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:08:51.731 00:07:16 -- spdk/autotest.sh@383 -- # hostname 00:08:51.731 00:07:16 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:08:51.731 geninfo: WARNING: invalid characters removed from testname! 00:08:52.299 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:08:52.299 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:08:52.299 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:09:02.274 00:07:27 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:10.401 
00:07:34 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:13.686 00:07:39 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:18.957 00:07:43 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:23.153 00:07:48 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:28.428 00:07:53 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:32.723 00:07:57 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:32.723 00:07:57 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:32.723 00:07:57 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:32.723 00:07:57 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:32.723 00:07:58 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:32.723 00:07:58 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:32.723 00:07:58 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:32.723 00:07:58 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:32.723 00:07:58 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:32.723 00:07:58 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:32.723 00:07:58 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:32.723 00:07:58 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:32.723 00:07:58 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:32.723 00:07:58 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:32.723 00:07:58 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:32.723 00:07:58 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:32.723 00:07:58 -- scripts/common.sh@343 -- $ case "$op" in 00:09:32.723 00:07:58 -- scripts/common.sh@344 -- $ : 1 00:09:32.723 00:07:58 -- 
scripts/common.sh@363 -- $ (( v = 0 )) 00:09:32.723 00:07:58 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:32.723 00:07:58 -- scripts/common.sh@364 -- $ decimal 1 00:09:32.723 00:07:58 -- scripts/common.sh@352 -- $ local d=1 00:09:32.723 00:07:58 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:32.723 00:07:58 -- scripts/common.sh@354 -- $ echo 1 00:09:32.723 00:07:58 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:32.723 00:07:58 -- scripts/common.sh@365 -- $ decimal 2 00:09:32.723 00:07:58 -- scripts/common.sh@352 -- $ local d=2 00:09:32.723 00:07:58 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:32.723 00:07:58 -- scripts/common.sh@354 -- $ echo 2 00:09:32.723 00:07:58 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:32.723 00:07:58 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:32.723 00:07:58 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:32.723 00:07:58 -- scripts/common.sh@367 -- $ return 0 00:09:32.723 00:07:58 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:32.723 00:07:58 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:32.723 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:32.723 --rc genhtml_branch_coverage=1 00:09:32.723 --rc genhtml_function_coverage=1 00:09:32.723 --rc genhtml_legend=1 00:09:32.723 --rc geninfo_all_blocks=1 00:09:32.723 --rc geninfo_unexecuted_blocks=1 00:09:32.723 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:32.723 ' 00:09:32.723 00:07:58 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:32.723 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:32.723 --rc genhtml_branch_coverage=1 00:09:32.723 --rc genhtml_function_coverage=1 00:09:32.723 --rc genhtml_legend=1 00:09:32.723 --rc geninfo_all_blocks=1 00:09:32.723 --rc geninfo_unexecuted_blocks=1 00:09:32.723 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:32.723 ' 00:09:32.723 00:07:58 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:09:32.723 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:32.723 --rc genhtml_branch_coverage=1 00:09:32.723 --rc genhtml_function_coverage=1 00:09:32.723 --rc genhtml_legend=1 00:09:32.723 --rc geninfo_all_blocks=1 00:09:32.723 --rc geninfo_unexecuted_blocks=1 00:09:32.723 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:32.723 ' 00:09:32.723 00:07:58 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:09:32.723 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:32.723 --rc genhtml_branch_coverage=1 00:09:32.723 --rc genhtml_function_coverage=1 00:09:32.723 --rc genhtml_legend=1 00:09:32.723 --rc geninfo_all_blocks=1 00:09:32.723 --rc geninfo_unexecuted_blocks=1 00:09:32.723 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:32.723 ' 00:09:32.723 00:07:58 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:32.723 00:07:58 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:32.723 00:07:58 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:32.723 00:07:58 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:32.723 00:07:58 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:32.723 00:07:58 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:32.723 00:07:58 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:32.723 00:07:58 -- paths/export.sh@5 -- $ export PATH 00:09:32.723 00:07:58 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:32.723 00:07:58 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:32.723 00:07:58 -- common/autobuild_common.sh@440 -- $ date +%s 00:09:32.723 00:07:58 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732921678.XXXXXX 00:09:32.723 00:07:58 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732921678.bcIude 00:09:32.723 00:07:58 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:09:32.723 00:07:58 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']' 00:09:32.723 00:07:58 -- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 
00:09:32.723 00:07:58 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:32.723 00:07:58 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:32.723 00:07:58 -- common/autobuild_common.sh@456 -- $ get_config_params 00:09:32.723 00:07:58 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:09:32.723 00:07:58 -- common/autotest_common.sh@10 -- $ set +x 00:09:32.723 00:07:58 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:09:32.723 00:07:58 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:32.723 00:07:58 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:32.723 00:07:58 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:32.723 00:07:58 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:32.723 00:07:58 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:32.723 00:07:58 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:32.723 00:07:58 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:32.723 00:07:58 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:32.723 00:07:58 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:32.723 00:07:58 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:32.723 + [[ -n 2603456 ]] 00:09:32.723 + sudo kill 2603456 
00:09:32.733 [Pipeline] } 00:09:32.749 [Pipeline] // stage 00:09:32.754 [Pipeline] } 00:09:32.768 [Pipeline] // timeout 00:09:32.773 [Pipeline] } 00:09:32.788 [Pipeline] // catchError 00:09:32.793 [Pipeline] } 00:09:32.808 [Pipeline] // wrap 00:09:32.814 [Pipeline] } 00:09:32.827 [Pipeline] // catchError 00:09:32.836 [Pipeline] stage 00:09:32.838 [Pipeline] { (Epilogue) 00:09:32.851 [Pipeline] catchError 00:09:32.853 [Pipeline] { 00:09:32.865 [Pipeline] echo 00:09:32.867 Cleanup processes 00:09:32.873 [Pipeline] sh 00:09:33.158 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:33.158 2745061 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:33.172 [Pipeline] sh 00:09:33.455 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:33.455 ++ grep -v 'sudo pgrep' 00:09:33.455 ++ awk '{print $1}' 00:09:33.455 + sudo kill -9 00:09:33.455 + true 00:09:33.465 [Pipeline] sh 00:09:33.744 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:33.744 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:33.744 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:35.121 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:45.103 [Pipeline] sh 00:09:45.386 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:09:45.386 Artifacts sizes are good 00:09:45.400 [Pipeline] archiveArtifacts 00:09:45.407 Archiving artifacts 00:09:45.548 [Pipeline] sh 00:09:45.834 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:09:45.849 [Pipeline] cleanWs 00:09:45.859 [WS-CLEANUP] Deleting project workspace... 00:09:45.859 [WS-CLEANUP] Deferred wipeout is used... 
00:09:45.866 [WS-CLEANUP] done 00:09:45.867 [Pipeline] } 00:09:45.885 [Pipeline] // catchError 00:09:45.898 [Pipeline] sh 00:09:46.185 + logger -p user.info -t JENKINS-CI 00:09:46.195 [Pipeline] } 00:09:46.211 [Pipeline] // stage 00:09:46.217 [Pipeline] } 00:09:46.234 [Pipeline] // node 00:09:46.241 [Pipeline] End of Pipeline 00:09:46.277 Finished: SUCCESS